Fine-tune a non-English GPT-2 Model with Huggingface
In this tutorial, we are going to use the transformers library by Huggingface. We will use the new Trainer class and fine-tune our GPT-2 model.
Tags: transformers, fine-tuning, huggingface, gpt, gpt-2, natural-language-processing, tutorial, code, notebook, article

In this tutorial, we are going to use the transformers library by Huggingface in its newest version (3.1.0). We will use the new Trainer class and fine-tune our GPT-2 model with German recipes from chefkoch.de.

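As a quick preview of what the tutorial covers, the sketch below shows one way a causal language modeling fine-tuning run with the Trainer class can look. The checkpoint name, the recipe text file, and the hyperparameters are illustrative assumptions, not the exact code from the tutorial.

```python
# Minimal sketch: fine-tuning a German GPT-2 checkpoint with the Trainer API.
# Model name and data file are placeholders; adjust to your own setup.
from transformers import (
    AutoTokenizer,
    AutoModelForCausalLM,
    TextDataset,
    DataCollatorForLanguageModeling,
    Trainer,
    TrainingArguments,
)

model_name = "anonymous-german-nlp/german-gpt2"  # assumed German GPT-2 checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Plain-text file with the German recipes, one training document per line.
train_dataset = TextDataset(
    tokenizer=tokenizer,
    file_path="train_dataset.txt",  # placeholder path
    block_size=128,
)

# mlm=False -> standard causal language modeling objective for GPT-2.
data_collator = DataCollatorForLanguageModeling(tokenizer=tokenizer, mlm=False)

training_args = TrainingArguments(
    output_dir="./gpt2-german-recipes",
    num_train_epochs=3,
    per_device_train_batch_size=4,
    save_steps=500,
)

trainer = Trainer(
    model=model,
    args=training_args,
    data_collator=data_collator,
    train_dataset=train_dataset,
)

trainer.train()
trainer.save_model()
```

The tutorial itself walks through the same idea in more detail, including preparing the recipe dataset before passing it to the Trainer.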

Similar projects
All Models and checkpoints - Hugging Face
A massive (and growing) collection of NLP models for nearly any NLP task, especially those involving the use of transformers.
Fine-tuning with custom datasets
This tutorial will take you through several examples of using 🤗 Transformers models with your own datasets.
Movement Pruning: Adaptive Sparsity by Fine-Tuning
We propose the use of movement pruning, a simple, deterministic first-order weight pruning method that is more adaptive to pretrained model fine-tuning.
BERT NLP — How To Build a Question Answering Bot
Understanding the intuition with hands-on PyTorch code for BERT fine-tuned on SQuAD.