The Future of (Transfer Learning in) Natural Language Processing
Transfer Learning in Natural Language Processing (NLP): Open questions, current trends, limits, and future directions.
natural-language-processing transfer-learning tutorial video

A walk through interesting papers and research directions in late 2019/early 2020 on:
- model size and computational efficiency,
- out-of-domain generalization and model evaluation,
- fine-tuning and sample efficiency,
- common sense and inductive biases.


Science Lead @ Huggingface Inc.
Similar projects
VirTex: Learning Visual Representations from Textual Annotations
We train a CNN+Transformer from scratch on COCO, transfer the CNN to 6 downstream vision tasks, and exceed ImageNet features despite using 10x fewer ...
Transfer Learning In NLP
A brief history of transfer learning in NLP.
🦄 How to build a SOTA Conversational AI with Transfer Learning
Train a dialogue agent leveraging transfer learning from the OpenAI GPT and GPT-2 Transformer language models.
DialoGPT: Toward Human-Quality Conversational Response Generation
Large-scale pre-training for dialogue.