How Hugging Face achieved a 2x performance boost for QA
Question Answering with DistilBERT in Node.js
question-answering distilbert huggingface node-js natural-language-processing tutorial article

We’re going to showcase one of the paths we believe can help make NLP more widely accessible: the use of “small” yet performant models (such as DistilBERT), together with frameworks targeting ecosystems beyond Python, such as Node.js via TensorFlow.js.

