Top Down Introduction to BERT with HuggingFace and PyTorch
An intuition for how BERT works, built top down: from applications to the algorithm.

If you're just getting started with BERT, this article is for you. I will explain the most popular use cases, the inputs and outputs of the model, and how it was trained. I will also provide some intuition into how it works, and will refer you to several excellent guides if you'd like to go deeper.

