
T5 fine-tuning
A Colab notebook showing how to fine-tune the T5 model on various NLP tasks (especially non-text-to-text tasks, handled with a text-to-text approach).
natural-language-processing transformers text-2-text t5
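As context for this entry: the core idea behind using T5 for non-text-to-text tasks is to recast every task as string-to-string. A minimal sketch (not taken from the listed notebook; the prefix and labels here are illustrative) of framing a classification example this way:

```python
# Sketch: casting a classification task as text-to-text, as T5 does.
# Both the input and the label are represented as plain strings, so the
# same sequence-to-sequence model can handle any task.

def to_text2text(task_prefix, text, label):
    """Format one classification example as a (source, target) string pair."""
    source = f"{task_prefix}: {text}"  # task prefix tells the model which task to perform
    target = label                     # the label itself is generated as text
    return source, target

src, tgt = to_text2text("sst2 sentence", "a gripping, well-acted film", "positive")
print(src)  # sst2 sentence: a gripping, well-acted film
print(tgt)  # positive
```

The (source, target) pairs can then be tokenized and fed to any encoder-decoder model for fine-tuning.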
Summarization, translation, Q&A, text generation and more at blazing speed using a T5 version implemented in ONNX.
onnx pytorch transformers t5
Transfer Learning with T5: the Text-To-Text Transfer Transformer
In the paper, we demonstrate how to achieve state-of-the-art results on multiple NLP tasks using a text-to-text transformer pre-trained on a large text ...
transformers t5 question-answering reading-comprehension
A Text to Text Approach for COVID-19 Event Extraction
Helps answer COVID-related questions that people are likely to post on social media by fine-tuning sequence-to-sequence transformers.
text-classification-as-question-answering covid-on-social-media t5 arxiv:2009.10047
WT5?! Training Text-to-Text Models to Explain their Predictions
We leverage the text-to-text framework proposed by Raffel et al. (2019) to train language models to output a natural text explanation alongside their ...
t5 transformers interpretability natural-language-processing