Top-Down Introduction to BERT with HuggingFace and PyTorch
An intuition-building introduction to how BERT works, taking a top-down approach (from applications down to the algorithm).
Tags: bert, top-down, huggingface, pytorch, attention, transformers, natural-language-processing, tutorial
Objectives & Highlights

If you're just getting started with BERT, this article is for you. I will explain the most popular use cases, the inputs and outputs of the model, and how it was trained. I will also provide some intuition into how it works, and will refer you to several excellent guides if you'd like to go deeper.
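
As a quick taste of the model's inputs and outputs, here is a minimal sketch using the HuggingFace transformers library. The bert-base-uncased checkpoint and the example sentence are illustrative choices, not prescribed by the article:

    import torch
    from transformers import BertTokenizer, BertModel

    # Load a pretrained tokenizer and model (downloads weights on first use)
    tokenizer = BertTokenizer.from_pretrained("bert-base-uncased")
    model = BertModel.from_pretrained("bert-base-uncased")
    model.eval()

    # The tokenizer turns raw text into input_ids, token_type_ids,
    # and attention_mask tensors that the model expects
    inputs = tokenizer("BERT reads text in both directions at once.",
                       return_tensors="pt")

    with torch.no_grad():
        outputs = model(**inputs)

    # last_hidden_state: one contextual vector per token, shape (1, seq_len, 768)
    print(outputs.last_hidden_state.shape)
    # pooler_output: a single vector summarizing the sequence, shape (1, 768)
    print(outputs.pooler_output.shape)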

Similar projects
Generalized Language Models
An overview of the trend toward large, unsupervised, pre-trained language models, which have achieved SOTA results on a variety of language tasks.
Visual Paper Summary: ALBERT (A Lite BERT)
An illustrated summary of the ALBERT paper and how it improves on BERT to make it more resource-efficient.
Cycle Text-To-Image GAN with BERT
Generating images from text captions, building on state-of-the-art GAN architectures.
Compressing BERT for Faster Prediction
In this blog post, we discuss ways to make huge models like BERT smaller and faster.