TTT: Fine-tuning Transformers with TPU or GPU acceleration
TTT is a package for fine-tuning 🤗 Transformers with TPUs, written in TensorFlow 2.0+.


  • Switch between TPUs and GPUs easily.
  • Stable training on TPUs.
  • Customize datasets or load them from the nlp library.
  • Use pretrained TensorFlow weights from the open-source 🤗 Transformers library.
  • Fine-tune BERT-like transformers (DistilBERT, ALBERT, ELECTRA, RoBERTa) using the Keras high-level API.
  • Fine-tune T5-like transformers using a custom training loop written in TensorFlow.
  • So far, this package mainly supports single-sequence classification tasks, but it can easily be extended to other language tasks.
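The custom training loop mentioned above can be sketched as follows. This is a minimal illustration of a TensorFlow 2.x training loop built on `tf.GradientTape`, with a toy dense model standing in for a T5-style transformer; it is not the ttt package's actual API.

```python
import tensorflow as tf

# Toy stand-in for a transformer: a single dense classification head.
model = tf.keras.Sequential([tf.keras.layers.Dense(2)])
optimizer = tf.keras.optimizers.Adam(learning_rate=1e-3)
loss_fn = tf.keras.losses.SparseCategoricalCrossentropy(from_logits=True)

# Dummy batch: 8 examples with 4 features, binary labels.
x = tf.random.normal((8, 4))
y = tf.constant([0, 1, 0, 1, 0, 1, 0, 1])

@tf.function  # compile the step into a graph (runs on GPU/TPU as well)
def train_step(features, labels):
    with tf.GradientTape() as tape:
        logits = model(features, training=True)
        loss = loss_fn(labels, logits)
    # Compute gradients w.r.t. trainable weights and apply an update.
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss

for step in range(5):
    loss = train_step(x, y)
```

In the real package the same pattern would wrap a Hugging Face TensorFlow model, with the forward pass and loss replaced by the transformer's sequence-to-sequence objective.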


Author's original post
Ph.D student@UCD, Crisis on Social Media, NLP, Machine Learning, IR
Similar projects
The Transformer … “Explained”?
An intuitive explanation of the Transformer by motivating it through the lens of CNNs, RNNs, etc.
Transformers - Hugging Face
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
NLP Model Selection
NLP model selection guide to make it easier to select models. This is prescriptive in nature and has to be used with caution.
Illustrated Guide to Transformers
A component-by-component breakdown analysis.