Custom Classifier on Top of Bert-like Language Model
Take a pre-trained language model and build a custom classifier on top of it.
bert language-modeling pytorch pytorch-lightning sentiment-analysis transformers natural-language-processing polberta attention tutorial article code

  • Taking an existing pre-trained language model and understanding its output; here I use PolBERTa, trained for Polish (see the loading sketch after this list).
  • Building a custom classification head on top of the LM (head sketch below).
  • Using fast tokenizers to efficiently tokenize and pad input text, as well as prepare attention masks (tokenization sketch below).
  • Preparing reproducible training code with PyTorch Lightning (LightningModule sketch below).
  • Finding a good starting learning rate for the model (LR-finder sketch below).
  • Validating the trained model on the PolEmo 2.0 dataset, a 4-class benchmark for Polish sentiment analysis (evaluation sketch below).
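
A minimal sketch of the first step: loading the pre-trained model with Hugging Face Transformers and inspecting what it returns. The hub id below is a placeholder, not the real PolBERTa checkpoint name, and the attribute-style output access assumes a recent transformers release.

```python
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "path/to/polberta"  # placeholder; substitute the actual PolBERTa checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
model = AutoModel.from_pretrained(MODEL_NAME)

inputs = tokenizer("To jest świetny produkt!", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden_size):
# one contextual embedding per input token
print(outputs.last_hidden_state.shape)
```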
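
One common way to build the classification head (the tutorial's exact architecture may differ): take the hidden state of the first token as a sequence summary and feed it through a small feed-forward stack.

```python
import torch.nn as nn
from transformers import AutoModel

class PolBertaClassifier(nn.Module):
    """Pre-trained encoder with a small classification head on top (a sketch)."""

    def __init__(self, model_name: str, num_classes: int = 4, dropout: float = 0.3):
        super().__init__()
        self.encoder = AutoModel.from_pretrained(model_name)
        hidden = self.encoder.config.hidden_size
        self.head = nn.Sequential(
            nn.Dropout(dropout),
            nn.Linear(hidden, hidden),
            nn.Tanh(),
            nn.Dropout(dropout),
            nn.Linear(hidden, num_classes),
        )

    def forward(self, input_ids, attention_mask):
        out = self.encoder(input_ids=input_ids, attention_mask=attention_mask)
        # Hidden state of the first (<s>/[CLS]) token summarizes the sequence
        cls_vector = out.last_hidden_state[:, 0, :]
        return self.head(cls_vector)
```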
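
Fast tokenizers handle batching, padding, truncation, and attention masks in a single call; a sketch with made-up Polish example sentences:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME, use_fast=True)

texts = ["Obsługa była fantastyczna.", "Nie polecam tego hotelu."]
batch = tokenizer(
    texts,
    padding=True,       # pad to the longest sequence in the batch
    truncation=True,
    max_length=128,
    return_tensors="pt",
)
# attention_mask is 1 for real tokens and 0 for padding,
# so the model ignores padded positions
print(batch["input_ids"].shape, batch["attention_mask"].shape)
```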
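
For reproducible training, a hedged sketch of wrapping the classifier in a LightningModule and fixing the random seeds. The batch keys input_ids / attention_mask / label are an assumed dataset format, and PolBertaClassifier is the head sketched above.

```python
import pytorch_lightning as pl
import torch
import torch.nn.functional as F

class SentimentClassifier(pl.LightningModule):
    def __init__(self, model_name: str, num_classes: int = 4, lr: float = 2e-5):
        super().__init__()
        self.save_hyperparameters()
        self.model = PolBertaClassifier(model_name, num_classes)
        self.lr = lr

    def forward(self, input_ids, attention_mask):
        return self.model(input_ids, attention_mask)

    def training_step(self, batch, batch_idx):
        logits = self(batch["input_ids"], batch["attention_mask"])
        loss = F.cross_entropy(logits, batch["label"])
        self.log("train_loss", loss)
        return loss

    def configure_optimizers(self):
        return torch.optim.AdamW(self.parameters(), lr=self.lr)

pl.seed_everything(42)                                   # fix all RNG seeds
model = SentimentClassifier(MODEL_NAME)
trainer = pl.Trainer(max_epochs=3, deterministic=True)   # force deterministic ops
```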
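
Lightning ships a learning-rate finder that sweeps the LR over a short run and suggests a starting value. The API has moved between releases; this sketch follows the recent Tuner interface and assumes model and trainer from the previous sketch, plus a train_loader DataLoader.

```python
from pytorch_lightning.tuner import Tuner

tuner = Tuner(trainer)
lr_finder = tuner.lr_find(model, train_dataloaders=train_loader)

print(lr_finder.suggestion())      # suggested starting learning rate
model.lr = lr_finder.suggestion()  # adopt it before the real training run
```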
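
Finally, an evaluation sketch over the PolEmo 2.0 validation split; val_loader is an assumed DataLoader with the same batch format as above.

```python
import torch
from sklearn.metrics import classification_report

model.eval()
preds, labels = [], []
with torch.no_grad():
    for batch in val_loader:  # assumed DataLoader over PolEmo 2.0 validation data
        logits = model(batch["input_ids"], batch["attention_mask"])
        preds.extend(logits.argmax(dim=-1).tolist())
        labels.extend(batch["label"].tolist())

# Per-class precision/recall/F1 across the four PolEmo 2.0 sentiment classes
print(classification_report(labels, preds))
```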

Authors
@marrrcin, Data Engineer / Machine Learning Engineer @ Egnyte
Similar projects
Transformers - Hugging Face
🤗 Transformers: State-of-the-art Natural Language Processing for TensorFlow 2.0 and PyTorch.
ELECTRA
Explaining ELECTRA, a new self-supervised pre-training task for language representation learning based on "replaced token detection".
Albert-base for Sanskrit
Trained Albert-base from scratch on the Sanskrit corpus of Wikipedia. I have also added a link showing how to train your own language model from scratch.
The Illustrated BERT, ELMo, and co.
How NLP cracked transfer learning.