FastHugs: Sequence Classification with Transformers and Fastai
Fine-tune a text classification model with HuggingFace 🤗 transformers and fastai-v2.
Tags: transformers, fastai, huggingface, sequence-classification, fasthugs, natural-language-processing, article, code, notebook, tutorial, library
Details

  • FastHugsTokenizer: A tokenizer wrapper that can be used with fastai-v2's tokenizer.
  • FastHugsModel: A model wrapper over the HF models, more or less the same as the wrappers from the HF fastai-v1 articles mentioned below.
  • Padding: Settings for the padding token index and for whether the transformer prefers left or right padding.
  • Model Splitters: Functions to split the classification head from the model backbone, in line with fastai-v2's new definition of Learner (in splitters.py); see the sketch below.
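
To make these pieces concrete, here is a minimal sketch of the model-wrapper and splitter pattern, assuming `bert-base-uncased`; `FastHugsModel` and `fasthugs_splitter` are illustrative re-creations of the components listed above, not the repo's exact code, and `.classifier` is the head attribute for BERT-style models (other architectures name it differently):

```python
from torch import nn
from transformers import AutoModelForSequenceClassification, AutoTokenizer

class FastHugsModel(nn.Module):
    "Wrap an HF sequence-classification model so fastai's Learner receives raw logits."
    def __init__(self, model_name="bert-base-uncased", num_labels=2):
        super().__init__()
        self.transformer = AutoModelForSequenceClassification.from_pretrained(
            model_name, num_labels=num_labels)

    def forward(self, input_ids, attention_mask=None):
        # HF models return an output object/tuple; fastai loss functions want a plain tensor
        return self.transformer(input_ids, attention_mask=attention_mask)[0]

def fasthugs_splitter(model):
    # Two parameter groups, backbone vs. classification head, so fastai's Learner
    # can apply discriminative learning rates and gradual unfreezing
    return [list(model.transformer.base_model.parameters()),
            list(model.transformer.classifier.parameters())]

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
# The padding settings mentioned above: BERT pads on the right with this index;
# some models (e.g. XLNet) prefer left padding instead
print(tokenizer.pad_token_id, tokenizer.padding_side)  # 0 right

model = FastHugsModel()
batch = tokenizer(["great movie!", "terrible plot"], padding=True, return_tensors="pt")
logits = model(batch["input_ids"], attention_mask=batch["attention_mask"])
print(logits.shape)  # torch.Size([2, 2])
```

Splitting the parameters this way lets you freeze the backbone and train only the head first, then unfreeze with lower learning rates on the pretrained layers, following the usual fastai fine-tuning recipe.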

Similar projects
  • Summary of 🤗 Transformers Models: A high-level summary of the differences between each model in HuggingFace's Transformers library.
  • Illustrated Guide to Transformers: A component-by-component breakdown of the architecture.
  • Generate Boolean (Yes/No) Questions From Any Content: A question generation algorithm trained on the BoolQ dataset using the T5 text-to-text transformer model.
  • Tips for Successfully Training Transformers on Small Datasets: It turns out that you can easily train transformers on small datasets when you use tricks (and have the patience to train for a very long time).