Projects

VirTex: Learning Visual Representations from Textual Annotations
We train a CNN+Transformer model from scratch on COCO, transfer the CNN to 6 downstream vision tasks, and exceed ImageNet features despite using 10x fewer ...
convolutional-neural-networks transformers coco visual-representations
Finetuning Transformers with JAX + Haiku
Walking through a port of the RoBERTa pre-trained model to JAX + Haiku, then fine-tuning the model to solve a downstream task.
jax haiku roberta transformers
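The walkthrough rests on Haiku's transform pattern: model code is written as a pure function, hk.transform turns it into explicit init/apply calls, and an optimizer updates the resulting parameter tree. Below is a minimal sketch of that pattern only, with a toy pooling-plus-linear head standing in for the ported RoBERTa encoder and optax assumed for the optimizer; it is not the article's actual model or training loop.

    import jax
    import jax.numpy as jnp
    import haiku as hk
    import optax

    # Toy stand-in for the ported encoder: pool token embeddings, then classify.
    def classifier_fn(token_embeddings):
        pooled = jnp.mean(token_embeddings, axis=1)          # (batch, hidden)
        return hk.Linear(2, name="classifier_head")(pooled)  # 2-class logits

    model = hk.without_apply_rng(hk.transform(classifier_fn))

    rng = jax.random.PRNGKey(0)
    dummy_batch = jnp.zeros((8, 128, 768))   # (batch, seq_len, hidden)
    params = model.init(rng, dummy_batch)    # explicit parameter tree

    optimizer = optax.adam(1e-5)
    opt_state = optimizer.init(params)

    def loss_fn(params, batch, labels):
        logits = model.apply(params, batch)
        return optax.softmax_cross_entropy(logits, jax.nn.one_hot(labels, 2)).mean()

    @jax.jit
    def update(params, opt_state, batch, labels):
        grads = jax.grad(loss_fn)(params, batch, labels)
        updates, opt_state = optimizer.update(grads, opt_state)
        return optax.apply_updates(params, updates), opt_state

    labels = jnp.zeros((8,), dtype=jnp.int32)
    params, opt_state = update(params, opt_state, dummy_batch, labels)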
Finetune: Scikit-learn Style Model Finetuning for NLP
Finetune is a library that allows users to leverage state-of-the-art pretrained NLP models for a wide variety of downstream tasks.
natural-language-processing finetuning pretraining transformers
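For reference, the scikit-learn-style interface the description refers to looks roughly like the sketch below; the Classifier entry point and fit/predict calls follow the project's README, but class names and arguments may differ between versions, and the example texts are made up.

    # Hypothetical two-class sentiment example with the finetune library.
    from finetune import Classifier

    train_texts = ["great service", "terrible support"]
    train_labels = ["positive", "negative"]

    model = Classifier()                      # wraps a pretrained transformer
    model.fit(train_texts, train_labels)      # fine-tune on the labeled texts
    predictions = model.predict(["prompt and friendly staff"])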
PyTorch Hub
Discover and publish models to a pre-trained model repository designed for research exploration.
pretraining fine-tuning models pytorch
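For context, consuming a published model is a couple of calls on torch.hub: list a repository's entrypoints (declared in its hubconf.py) and load one with pretrained weights. The pytorch/vision repo and resnet18 entrypoint below are the standard documented example, not tied to any project on this page.

    import torch

    # Entrypoints a repository exposes through its hubconf.py.
    print(torch.hub.list("pytorch/vision"))

    # Load a pretrained model straight from GitHub and put it in eval mode.
    # Newer torchvision releases prefer weights=... over pretrained=True.
    model = torch.hub.load("pytorch/vision", "resnet18", pretrained=True)
    model.eval()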
MT-Clinical BERT
Scaling Clinical Information Extraction with Multitask Learning
health multi-task-learning information-extraction clinical-information-extraction
All Models and checkpoints - Hugging Face
Massive (and growing) collection of NLP models for nearly any NLP task, especially those involving transformers.
pretraining fine-tuning natural-language-processing huggingface
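As a rough illustration of how those checkpoints are used, the transformers Auto* classes resolve a model id from the hub and attach a task head; bert-base-uncased and the two-label head here are illustrative choices, not a recommendation from the listing.

    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    model_id = "bert-base-uncased"   # any hub model id works the same way
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

    inputs = tokenizer("Transformers make transfer learning easy.", return_tensors="pt")
    outputs = model(**inputs)
    print(outputs.logits.shape)      # torch.Size([1, 2])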