Hyperparameter Optimization


Hyperparameter optimization, or tuning, is the problem of choosing a set of optimal hyperparameters for a learning algorithm. A hyperparameter is a parameter whose value is used to control the learning process; by contrast, the values of other parameters (typically node weights) are learned from the data.
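As a concrete illustration of that distinction, here is a minimal sketch assuming scikit-learn (the SVC model, the C/kernel grid, and the iris dataset are illustrative choices, not taken from any entry below): the grid holds hyperparameters chosen by the search, while the model's coefficients are the parameters learned during fitting.

```python
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.svm import SVC

X, y = load_iris(return_X_y=True)

# C and kernel are hyperparameters: set before training and chosen by the search.
param_grid = {"C": [0.1, 1, 10], "kernel": ["linear", "rbf"]}

# The support-vector coefficients are parameters: learned during fit.
search = GridSearchCV(SVC(), param_grid, cv=5)
search.fit(X, y)

print(search.best_params_, search.best_score_)
```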

Overview

Tutorials

Automated Machine Learning Hyperparameter Tuning in Python
A complete walkthrough of using Bayesian optimization for automated hyperparameter tuning in Python.
hyperparameter-optimization bayesian-optimization tutorial article
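As a rough sketch of the kind of Bayesian-optimization workflow the walkthrough above covers, here is an illustrative example using Hyperopt's TPE sampler (one commonly used library; the library choice, toy objective, and search space are assumptions, not taken from the tutorial):

```python
from hyperopt import Trials, fmin, hp, tpe

# Illustrative search space: learning rate on a log scale, discrete choice of tree count.
space = {
    "learning_rate": hp.loguniform("learning_rate", -5, 0),   # samples between exp(-5) and exp(0)
    "n_estimators": hp.choice("n_estimators", [100, 200, 500]),
}

def objective(params):
    # Toy loss standing in for cross-validated model error.
    return (params["learning_rate"] - 0.1) ** 2 + params["n_estimators"] * 1e-5

trials = Trials()
best = fmin(
    fn=objective,
    space=space,
    algo=tpe.suggest,   # Tree-structured Parzen Estimator, a form of Bayesian optimization
    max_evals=50,
    trials=trials,
)
print(best)
```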

Libraries

General
Ray
Ray is a fast and simple framework for building and running distributed applications.
hyperparameter-optimization reinforcement-learning scalable-reinforcement-learning hyperparameter-tuning
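A minimal, illustrative sketch of hyperparameter search with Ray Tune, Ray's tuning library (shown with the classic tune.run function API; newer Ray releases favor tune.Tuner, and the toy objective below is made up):

```python
from ray import tune

def objective(config):
    # Toy objective with its best value at x = 3; a real trainable would train a model here.
    return {"score": (config["x"] - 3) ** 2}

analysis = tune.run(
    objective,
    config={"x": tune.uniform(-10, 10)},  # search space
    num_samples=20,                       # number of trials to run
    metric="score",
    mode="min",
)
print(analysis.best_config)
```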
Hyperparameter Optimization for AllenNLP Using Optuna
🚀 A demonstration of hyperparameter optimization using Optuna for models implemented with AllenNLP.
hyperparameter-optimization optuna allennlp allenai
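A rough sketch of the core Optuna loop that such a demo builds on (plain Optuna, assuming a recent release; the toy objective stands in for training and evaluating an AllenNLP model, and the parameter names are illustrative):

```python
import optuna

def objective(trial):
    # Sample hyperparameters; in the AllenNLP demo these would feed a model/trainer config.
    lr = trial.suggest_float("lr", 1e-5, 1e-1, log=True)
    dropout = trial.suggest_float("dropout", 0.0, 0.5)
    # Toy score standing in for validation loss after training.
    return (lr - 0.01) ** 2 + dropout * 0.1

study = optuna.create_study(direction="minimize")
study.optimize(objective, n_trials=50)
print(study.best_params)
```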
30x Faster Hyperparameter Search with Ray Tune and RAPIDS
Shows how to increase the accuracy of a random forest classifier by 5% while reducing tuning time by a factor of 30 using Ray Tune and RAPIDS.
hyperparameter-optimization ray rapidsai tutorial
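A hedged sketch of the kind of setup the tutorial above describes, tuning a random forest with Ray Tune (scikit-learn's CPU estimator is used here as a stand-in; swapping in cuML's GPU RandomForestClassifier and reserving a GPU per trial is the RAPIDS part, and the data and search space are illustrative):

```python
from ray import tune
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier  # stand-in; RAPIDS users would import cuml.ensemble.RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=5000, n_features=20, random_state=0)
X_train, X_valid, y_train, y_valid = train_test_split(X, y, random_state=0)

def train_rf(config):
    clf = RandomForestClassifier(
        n_estimators=config["n_estimators"],
        max_depth=config["max_depth"],
    )
    clf.fit(X_train, y_train)
    return {"accuracy": clf.score(X_valid, y_valid)}

analysis = tune.run(
    train_rf,
    config={
        "n_estimators": tune.randint(50, 500),
        "max_depth": tune.randint(2, 20),
    },
    num_samples=20,
    metric="accuracy",
    mode="max",
    # resources_per_trial={"cpu": 2, "gpu": 1},  # uncomment to reserve a GPU per trial (e.g. for cuML)
)
print(analysis.best_config)
```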