T5 fine-tuning
A Colab notebook showcasing how to fine-tune the T5 model on various NLP tasks, especially casting non text-to-text tasks into the text-to-text format.
natural-language-processing transformers text-2-text t5 code notebook paper arxiv:1910.10683 tutorial research

  • Demonstrate how to fine-tune the T5 model.
  • Explore the text-to-text framework proposed in the T5 paper and see how it performs on non text-to-text tasks by casting them into a text-to-text setting.
  • Write a generic trainer that can be used for any problem that can be formulated as text-to-text. No need to change the model or hyperparameters, or to add a task-specific head. Just change the dataset and that's it!
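The casting the bullets above describe can be sketched as follows. This is a minimal illustration, not the notebook's actual code: the `sentiment:` task prefix and the label words are illustrative choices, not anything prescribed by T5 itself.

```python
# Sketch: cast a non text-to-text task (binary sentiment classification)
# into the text-to-text format from the T5 paper. Every task becomes
# "input string -> target string", so one model and one trainer
# handle all tasks without a task-specific head.

def to_text2text(review: str, label: int) -> dict:
    """Turn a (review, label) pair into T5-style source/target strings.
    The prefix and label words here are illustrative assumptions."""
    source = f"sentiment: {review}"      # task prefix identifies the task
    target = "positive" if label == 1 else "negative"
    return {"source_text": source, "target_text": target}

examples = [
    ("a gripping, beautifully shot film", 1),
    ("two hours of my life I want back", 0),
]
pairs = [to_text2text(text, label) for text, label in examples]
```

Because both the input and the target are plain strings, the same seq2seq loss and generation loop cover classification, QA, summarization, and any other task you can phrase this way.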

Don't forget to tag @patil-suraj in your comment, otherwise they may not be notified.
