Simple Transformers
Transformers for Classification, NER, QA, Language Modeling, Language Generation, T5, Multi-Modal, and Conversational AI.

This library is based on the Transformers library by Hugging Face. Simple Transformers lets you quickly train and evaluate Transformer models. Only 3 lines of code are needed to initialize a model, train it, and evaluate it.

Supports

  • Sequence Classification
  • Token Classification (NER)
  • Question Answering
  • Language Model Fine-Tuning
  • Language Model Training
  • Language Generation
  • T5 Model
  • Seq2Seq Tasks
  • Multi-Modal Classification
  • Conversational AI
  • Text Representation Generation

