Summary of 🤗 Transformers Models
A high-level summary of the differences between the models in HuggingFace's Transformers library.
Tags: transformers, huggingface, summary, natural-language-processing, tutorial, article

This is a summary of the models available in the transformers library. It assumes you're familiar with the original transformer model; for a gentle introduction, check The Annotated Transformer. Here we focus on the high-level differences between the models. You can explore each of them in more detail in its respective documentation. Also check out the pretrained model page to see the checkpoints available for each type of model.


Author: a Research Engineer at HuggingFace ("Solving NLP, one commit at a time!").
Similar projects
Divide Hugging Face Transformers Training Time By 2
Reducing training time lets you iterate more within a fixed time budget and thus achieve better results.
Hugging Captions
Generate realistic, Instagram-worthy captions using transformers, given a hashtag and a small text snippet.
RoBERTa meets TPUs
Understanding and applying the RoBERTa model to the current challenge.
FastHugs: Sequence Classification with Transformers and Fastai
Fine-tune a text classification model with HuggingFace 🤗 Transformers and fastai v2.