Projects


Finetuning Transformers with JAX + Haiku
A walkthrough of porting the pre-trained RoBERTa model to JAX + Haiku, then fine-tuning it on a downstream task.
jax haiku roberta transformers
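A minimal sketch of the Haiku workflow the walkthrough builds on: define a forward function, transform it into a pure init/apply pair, and take JAX gradient steps on the parameters. The one-layer head and dummy batch below are placeholders, not the post's actual RoBERTa port.

```python
# Minimal sketch of the hk.transform pattern used when fine-tuning in JAX + Haiku.
# The one-layer "head" below is a placeholder, not the RoBERTa port from the post.
import jax
import jax.numpy as jnp
import haiku as hk


def forward(features):
    # In the real walkthrough this would be the ported RoBERTa encoder
    # followed by a task-specific classification head.
    return hk.Linear(2)(features)


model = hk.transform(forward)

rng = jax.random.PRNGKey(42)
dummy_batch = jnp.ones((8, 768))        # stand-in for pooled encoder outputs
params = model.init(rng, dummy_batch)   # initialize (or load pre-trained) params

LEARNING_RATE = 1e-4


def loss_fn(params, batch, labels):
    logits = model.apply(params, None, batch)
    one_hot = jax.nn.one_hot(labels, 2)
    return -jnp.mean(jnp.sum(one_hot * jax.nn.log_softmax(logits), axis=-1))


@jax.jit
def update(params, batch, labels):
    grads = jax.grad(loss_fn)(params, batch, labels)
    return jax.tree_util.tree_map(lambda p, g: p - LEARNING_RATE * g, params, grads)


labels = jnp.zeros((8,), dtype=jnp.int32)
params = update(params, dummy_batch, labels)
```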
Finetune: Scikit-learn Style Model Finetuning for NLP
Finetune is a library that allows users to leverage state-of-the-art pretrained NLP models for a wide variety of downstream tasks.
natural-language-processing finetuning pretraining transformers
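In practice the scikit-learn style means the familiar fit/predict interface. A usage sketch along the lines of the project's README (exact class names and signatures may differ by version):

```python
# Sketch of Finetune's scikit-learn-style workflow (names per the project README;
# the exact API may vary across versions).
from finetune import Classifier

train_texts = ["great movie", "terrible plot"]   # toy data for illustration
train_labels = ["positive", "negative"]

model = Classifier()                  # wraps a pre-trained transformer
model.fit(train_texts, train_labels)  # fine-tune on the downstream task
predictions = model.predict(["what a film!"])
model.save("finetuned_model")         # serialize for later reuse
```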
Retrieve: Automate wget-ting pre-trained models
No-frills library to download pre-trained models, cache them locally, and return the local path.
pre-trained-models library python deep-learning
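The description is the whole pattern: download once, cache, hand back a path. A hypothetical stdlib-only sketch of that pattern, not Retrieve's actual API:

```python
# Hypothetical sketch of the download-cache-return-path pattern described above;
# this is NOT Retrieve's actual API, just the idea in stdlib Python.
import hashlib
import urllib.request
from pathlib import Path

CACHE_DIR = Path.home() / ".cache" / "pretrained_models"


def fetch(url: str) -> Path:
    """Download url once, cache it locally, and return the local path."""
    CACHE_DIR.mkdir(parents=True, exist_ok=True)
    # Hash the URL so distinct models get distinct cache entries.
    name = hashlib.sha256(url.encode()).hexdigest()[:16] + "_" + Path(url).name
    target = CACHE_DIR / name
    if not target.exists():
        urllib.request.urlretrieve(url, target)
    return target
```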
MT-Clinical BERT
Scaling Clinical Information Extraction with Multitask Learning
health multi-task-learning information-extraction clinical-information-extraction
BiT: Exploring Large-Scale Pre-training for Computer Vision
We are excited to share the best BiT models pre-trained on public datasets, along with code in TF2, JAX, and PyTorch.
object-detection computer-vision pretraining models
All The Ways You Can Compress BERT
In this post I’ll list and briefly taxonomize all the papers I’ve seen compressing BERT.
bert pruning weight-factorization compression
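As a concrete illustration of one family in that taxonomy (my example, not the post's): magnitude pruning simply zeroes the smallest-magnitude weights in a matrix.

```python
# Illustration of magnitude pruning, one of the compression families the post
# taxonomizes: zero out the smallest-magnitude weights in a matrix.
import numpy as np


def magnitude_prune(weights: np.ndarray, sparsity: float) -> np.ndarray:
    """Return a copy of `weights` with the smallest `sparsity` fraction zeroed."""
    threshold = np.quantile(np.abs(weights), sparsity)
    mask = np.abs(weights) >= threshold
    return weights * mask


rng = np.random.default_rng(0)
w = rng.normal(size=(768, 768))      # e.g. one BERT attention projection
w_pruned = magnitude_prune(w, 0.6)   # keep the largest 40% of weights
print(f"sparsity: {np.mean(w_pruned == 0):.2f}")
```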
The State of Transfer Learning in NLP
This post expands on the NAACL 2019 tutorial on Transfer Learning in NLP. It highlights key insights and takeaways and provides updates based on recent work.
transfer-learning natural-language-processing pretraining tutorial
Natural Language Processing: Pretraining - d2l
An interactive deep learning book with code, math, and discussions, based on the NumPy interface.
pretraining natural-language-processing bert mxnet
Data Augmentation | How to use Deep Learning With Limited Data
This article is a comprehensive review of data augmentation techniques for deep learning, focused specifically on images.
data-augmentation pretraining tutorial
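For a flavor of the transformations such a review covers (crops, flips, rotations, color jitter), here is a minimal pipeline using torchvision, one of many libraries that implement them; the specific values are arbitrary.

```python
# A minimal image-augmentation pipeline covering standard techniques the article
# reviews (crop, flip, rotation, color jitter); torchvision is one of many
# libraries that implement them.
from torchvision import transforms
from PIL import Image

augment = transforms.Compose([
    transforms.RandomResizedCrop(224),            # random crop + rescale
    transforms.RandomHorizontalFlip(p=0.5),       # mirror half the time
    transforms.RandomRotation(degrees=15),        # small random rotations
    transforms.ColorJitter(brightness=0.2, contrast=0.2),
])

image = Image.open("example.jpg")                 # path is a placeholder
augmented = augment(image)                        # new random variant each call
```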