Top-Down Introduction to BERT with HuggingFace and PyTorch
I will provide some intuition into how BERT works with a top-down approach (from applications to the algorithm).
bert top-down huggingface pytorch attention transformers natural-language-processing tutorial article

If you're just getting started with BERT, this article is for you. I will explain the most popular use cases, the inputs and outputs of the model, and how it was trained. I will also provide some intuition into how it works, and will refer you to several excellent guides if you'd like to dig deeper.
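To make this concrete, here is a minimal sketch of loading a pretrained BERT and inspecting its outputs with HuggingFace's `transformers` library and PyTorch. The `bert-base-uncased` checkpoint and the example sentence are my own choices for illustration, not taken from the article.

```python
import torch
from transformers import BertModel, BertTokenizer

# Load a pretrained tokenizer and model; bert-base-uncased is an assumption,
# chosen as the most common starting checkpoint.
tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
model = BertModel.from_pretrained("bert-base-uncased")

# Tokenize a sentence; the tokenizer adds the special [CLS] and [SEP] tokens
# and returns PyTorch tensors (input_ids, attention_mask, token_type_ids).
inputs = tokenizer("BERT is a transformer encoder.", return_tensors="pt")

# Run a forward pass without tracking gradients.
with torch.no_grad():
    outputs = model(**inputs)

# One contextual embedding per input token:
# shape is (batch_size, sequence_length, 768) for bert-base.
print(outputs.last_hidden_state.shape)
```

The `last_hidden_state` tensor is the usual starting point for downstream use cases: the per-token vectors feed token-level tasks, while a pooled or [CLS] representation feeds sentence-level classification.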

