WT5?! Training Text-to-Text Models to Explain their Predictions
We leverage the text-to-text framework proposed by Raffel et al. (2019) to train language models to output a natural text explanation alongside their ...
Tags: t5, transformers, interpretability, natural-language-processing, text-to-text-transfer-transformer, code, paper, arxiv:2004.14546, language-modeling, tutorial, research
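
To make the setup concrete, below is a minimal sketch of the WT5-style text-to-text format using the public Hugging Face t5-small checkpoint. The "explain sentiment:" prefix and the example review are illustrative assumptions, and the stock checkpoint is not fine-tuned for explanations, so only a model trained as described in the paper would produce the prediction-plus-explanation output shown in the comments.

# A minimal sketch of the WT5-style text-to-text format, assuming the public
# Hugging Face "t5-small" checkpoint. The "explain sentiment:" prefix and the
# example review are illustrative; only a model fine-tuned as in the WT5 paper
# would actually emit a label followed by an "explanation: ..." clause.
from transformers import T5ForConditionalGeneration, T5Tokenizer

tokenizer = T5Tokenizer.from_pretrained("t5-small")
model = T5ForConditionalGeneration.from_pretrained("t5-small")

# Prepending "explain" to the task prefix asks the model to generate its
# prediction and a free-text rationale in a single output string, e.g.
# "positive explanation: the reviewer calls the movie fantastic".
input_text = "explain sentiment: This movie was fantastic from start to finish."
input_ids = tokenizer(input_text, return_tensors="pt").input_ids

output_ids = model.generate(input_ids, max_length=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))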

Authors: @craffel, @google-research, @sharannarang, @katelee168, @nfiedel, @adarob

Similar projects
Transfer Learning with T5: the Text-To-Text Transfer Transformer
In the paper, we demonstrate how to achieve state-of-the-art results on multiple NLP tasks using a text-to-text transformer pre-trained on a large text ...
Simple Transformers
Transformers for Classification, NER, QA, Language Modeling, Language Generation, T5, Multi-Modal, and Conversational AI.
Questgen: An NLP library for state-of-the-art Question Generation
Questgen AI is an open-source, easy-to-use NLP library for question generation. It can generate MCQs, Boolean (Yes/No) questions, FAQs, and also paraphrase any ...
Paraphrase Any Question with T5 (Text-To-Text Transformer)
Given a question, generate paraphrased versions of it with a T5 transformer. A pretrained model and training script are provided; a rough usage sketch follows below.
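
As a rough illustration of that workflow, the sketch below samples paraphrases from a T5 model via Hugging Face transformers. The checkpoint name is a placeholder and the "paraphrase:" prefix is an assumed input format, so substitute the project's released fine-tuned model and prompt.

# A rough sketch of generating question paraphrases with a T5 model through
# Hugging Face transformers. The checkpoint name below is a placeholder and
# the "paraphrase:" prefix is an assumed input format; use the project's
# released fine-tuned model and prompt instead.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-base"  # placeholder for a paraphrase-fine-tuned checkpoint
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

question = "What is the best way to learn machine learning?"
input_ids = tokenizer("paraphrase: " + question, return_tensors="pt").input_ids

# Sample a few diverse candidates rather than taking the single greedy output.
outputs = model.generate(
    input_ids,
    do_sample=True,
    top_k=50,
    top_p=0.95,
    num_return_sequences=3,
    max_length=64,
)
for out in outputs:
    print(tokenizer.decode(out, skip_special_tokens=True))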