Generalized Language Models
A look at the trend of large unsupervised pre-trained language models, which have achieved state-of-the-art (SOTA) results on a variety of language tasks.
Tags: transformers, attention, BERT, ELMo, ULMFiT, ALBERT, GPT-2, perplexity, language modeling, natural language processing, conversational AI, tutorial, article

As a follow-up to the word embedding post, we discuss models that learn contextualized word vectors, as well as the recent trend of large unsupervised pre-trained language models, which have achieved impressive SOTA results on a variety of language tasks.
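One of the metrics tagged above, perplexity, is the standard intrinsic measure for language models: the exponential of the average negative log-likelihood the model assigns to each token. As a minimal sketch (the probability values below are made up for illustration):

```python
import math

def perplexity(token_probs):
    """Perplexity = exp of the average negative log-likelihood
    a language model assigns to each token in a sequence."""
    n = len(token_probs)
    avg_nll = -sum(math.log(p) for p in token_probs) / n
    return math.exp(avg_nll)

# A model that assigns probability 0.25 to every token of a
# 4-token sequence behaves like a uniform choice among 4 options
# at each step, so its perplexity is 4.
print(perplexity([0.25, 0.25, 0.25, 0.25]))
```

Lower perplexity means the model is less "surprised" by the text; a model with perplexity k is, on average, as uncertain as if it were choosing uniformly among k tokens at every step.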

