NER model for 40 languages trained with the new TFTrainer
This model is XLM-RoBERTa-base fine-tuned for named-entity recognition on the 40 languages proposed in the XTREME benchmark, using the WikiAnn dataset.
Tags: named-entity-recognition, huggingface, transformers, tftrainer, multilingual, natural-language-processing, code, tensorflow, bert, attention, language-modeling, tutorial, library
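The model can be tried directly with the transformers NER pipeline. The snippet below is a minimal usage sketch, not part of the original post: the Hub identifier jplu/tf-xlm-r-ner-40-lang and the example sentence are assumptions added for illustration.

    from transformers import pipeline

    # Load the fine-tuned checkpoint with the TensorFlow backend.
    # The Hub identifier "jplu/tf-xlm-r-ner-40-lang" is assumed here.
    nlp_ner = pipeline(
        "ner",
        model="jplu/tf-xlm-r-ner-40-lang",
        tokenizer="jplu/tf-xlm-r-ner-40-lang",
        framework="tf",
    )

    # WikiAnn annotates persons (PER), organizations (ORG) and locations (LOC),
    # so the returned entities use that tag set.
    print(nlp_ner("Barack Obama est né à Hawaï."))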

Reproducing the results: • Download and prepare the dataset by following the instructions at https://github.com/google-research/xtreme#download-the-data.
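Once the data from the step above is prepared, fine-tuning with the new TFTrainer roughly follows the pattern below. This is a minimal sketch rather than the author's exact training script: the base checkpoint, hyperparameters, output path, and the tiny placeholder dataset are assumptions, and TFTrainer has since been deprecated in recent transformers releases in favour of native Keras training, so the sketch targets the release era the post refers to.

    import tensorflow as tf
    from transformers import (
        TFAutoModelForTokenClassification,
        TFTrainer,
        TFTrainingArguments,
    )

    # WikiAnn uses the standard PER / ORG / LOC tag set in IOB2 format.
    labels = ["O", "B-PER", "I-PER", "B-ORG", "I-ORG", "B-LOC", "I-LOC"]

    training_args = TFTrainingArguments(
        output_dir="./tf-xlm-r-ner-40-lang",  # assumed output path
        num_train_epochs=3,
        per_device_train_batch_size=32,
    )

    # Create the model inside the distribution strategy scope, as the
    # official TF examples do. TF weights for the base checkpoint are
    # assumed to be available on the Hub.
    with training_args.strategy.scope():
        model = TFAutoModelForTokenClassification.from_pretrained(
            "xlm-roberta-base",
            num_labels=len(labels),
        )

    # Tiny placeholder standing in for the prepared WikiAnn data: real
    # training would stream tokenized sentences and their label ids.
    features = {
        "input_ids": tf.constant([[0, 58, 1674, 2]]),
        "attention_mask": tf.constant([[1, 1, 1, 1]]),
    }
    label_ids = tf.constant([[0, 1, 0, 0]])
    train_dataset = tf.data.Dataset.from_tensor_slices((features, label_ids))

    # TFTrainer shuffles and batches the dataset itself.
    trainer = TFTrainer(model=model, args=training_args, train_dataset=train_dataset)
    trainer.train()
    trainer.save_model("./tf-xlm-r-ner-40-lang")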

Don't forget to tag @jplu in your comment, otherwise they may not be notified.

Author
PhD in NLP and Deep Learning, now an applied scientist.
Similar projects
Transformers - Hugging Face
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
Insight
Project Insight is designed to provide NLP as a service, with a code base for both the front-end GUI (Streamlit) and the backend server (FastAPI), and the usage of ...
PyTorch Transformers Tutorials
A set of annotated Jupyter notebooks that give users a template for fine-tuning transformer models on downstream NLP tasks such as classification, NER, etc.
How to Train Your Neural Net
Deep learning for various tasks in the domains of Computer Vision, Natural Language Processing, and Time Series Forecasting using PyTorch 1.0+.