TTT: Fine-tuning Transformers with TPU or GPU acceleration
TTT is a package for fine-tuning 🤗 Transformers with TPUs, written in TensorFlow 2.0+.
- Switch between TPUs and GPUs easily.
- Stable training on TPUs.
- Customize datasets or load them from the nlp library.
- Use pretrained TensorFlow weights from the open-source 🤗 Transformers library.
- Fine-tune BERT-like transformers (DistilBERT, ALBERT, ELECTRA, RoBERTa) using the Keras high-level API.
- Fine-tune T5-like transformers using a custom training loop written in TensorFlow.
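A minimal sketch of the workflow the list above describes: pick a TPU or GPU distribution strategy, then fine-tune with the Keras high-level API. A toy Keras model stands in for a pretrained BERT-like transformer so the sketch stays self-contained; the model shape, data, and hyperparameters are illustrative assumptions, not TTT's actual API.

```python
import numpy as np
import tensorflow as tf

# Pick an accelerator strategy: a TPU if one is attached, otherwise
# MirroredStrategy covers any available GPUs (and falls back to CPU).
try:
    resolver = tf.distribute.cluster_resolver.TPUClusterResolver()
    tf.config.experimental_connect_to_cluster(resolver)
    tf.tpu.experimental.initialize_tpu_system(resolver)
    strategy = tf.distribute.TPUStrategy(resolver)
except ValueError:
    strategy = tf.distribute.MirroredStrategy()

with strategy.scope():
    # Toy classifier standing in for a pretrained BERT-like model.
    model = tf.keras.Sequential([
        tf.keras.layers.Dense(16, activation="relu", input_shape=(8,)),
        tf.keras.layers.Dense(2),
    ])
    model.compile(
        optimizer=tf.keras.optimizers.Adam(3e-5),  # typical fine-tuning LR
        loss=tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True),
        metrics=["accuracy"],
    )

# Dummy single-sequence classification data (features, integer labels).
x = np.random.rand(32, 8).astype("float32")
y = np.random.randint(0, 2, size=(32,))
history = model.fit(x, y, epochs=1, batch_size=8, verbose=0)
```

Because the model is built and compiled inside `strategy.scope()`, the same `fit()` call runs on a TPU or on GPUs without further changes, which is what makes switching accelerators easy.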
So far, this package mainly supports single-sequence classification tasks. However, it can be easily extended to support other language tasks.
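The custom training loop used for T5-like models can be sketched with `tf.GradientTape`. Again a toy dense model stands in for the transformer to keep the example self-contained and runnable; this is the general TensorFlow pattern, not TTT's exact loop.

```python
import numpy as np
import tensorflow as tf

# Toy model standing in for a T5-like transformer.
model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
optimizer = tf.keras.optimizers.Adam(1e-3)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Dummy data wrapped in a tf.data pipeline, as a real loop would use.
x = np.random.rand(16, 4).astype("float32")
y = np.random.randint(0, 2, size=(16,))
dataset = tf.data.Dataset.from_tensor_slices((x, y)).batch(8)

for batch_x, batch_y in dataset:
    with tf.GradientTape() as tape:
        logits = model(batch_x, training=True)
        loss = loss_fn(batch_y, logits)
    # Compute and apply gradients explicitly -- the part that
    # Keras' fit() would otherwise handle for you.
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))

final_loss = float(loss)
```

Writing the loop by hand trades the convenience of `fit()` for full control over each step, which is useful when a model's training procedure (e.g., T5's sequence-to-sequence objective) does not fit the standard Keras workflow.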