Simple Transformers: Transformers Made Easy
Simple Transformers removes complexity and lets you get down to what matters – model training and experimenting with the Transformer model architectures.
transformers huggingface simple-transformers text-classification natural-language-processing article wandb notebook code tutorial

It helps you bypass complicated setup, boilerplate code, and all the other general unpleasantness by:

  • initializing a model in one line,
  • training it in the next,
  • and evaluating it in the third.
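The three steps above can be sketched as follows. This is a minimal illustration, assuming `pip install simpletransformers` and a tiny labeled DataFrame; the model name, `use_cuda` flag, and the `"text"`/`"labels"` column layout follow the library's documented conventions, but treat the details as a sketch rather than a definitive recipe.

```python
import pandas as pd

# Simple Transformers expects a DataFrame with "text" and "labels" columns.
train_df = pd.DataFrame(
    {"text": ["great movie", "terrible plot"], "labels": [1, 0]}
)

def run():
    # Imported lazily so the sketch can be read/run without the library installed.
    from simpletransformers.classification import ClassificationModel

    # 1. Initialize a model in one line (downloads pretrained weights).
    model = ClassificationModel("roberta", "roberta-base", use_cuda=False)
    # 2. Train in the next.
    model.train_model(train_df)
    # 3. Evaluate in the third.
    result, model_outputs, wrong_predictions = model.eval_model(train_df)
    return result
```

In practice you would pass separate train and eval DataFrames; the same one is reused here only to keep the sketch short.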


