Wrap up

Hyperparameter Tuning in Python

Alex Scriven

Data Scientist

Hyperparameters vs Parameters

   

Hyperparameters vs Parameters:

Hyperparameters are components of the model that you set; they are not learned during the modeling process.

Parameters are not set by you; the algorithm learns these from the data during fitting.
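The distinction can be made concrete with a small sketch (a synthetic dataset and a logistic regression, chosen here for illustration):

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, random_state=1)

# Hyperparameters: chosen by you BEFORE fitting
model = LogisticRegression(C=0.5, max_iter=1000)
model.fit(X, y)

# Parameters: discovered by the algorithm DURING fitting
print(model.coef_)       # learned coefficients
print(model.intercept_)  # learned intercept
```

Note that `C` and `max_iter` never change during `fit()`, while `coef_` and `intercept_` only exist after it.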


Which hyperparameters & values?

 

You learned:

  • Some hyperparameters are better to start with than others
  • Some hyperparameter values are nonsensical and should be avoided
  • You need to beware of conflicting hyperparameters
  • Best practice is specific to algorithms and their hyperparameters

Remembering Grid Search

 

We introduced grid search:

  • Construct a matrix (or 'grid') of hyperparameter combinations and values
  • Build models for all the different hyperparameter combinations
  • Then pick the winner

Grid search is computationally expensive, but it is guaranteed to find the best combination within your grid. (Remember the importance of setting a good grid!)
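As a minimal sketch of this process (using scikit-learn's `GridSearchCV` on a synthetic dataset; the estimator and grid values are illustrative):

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

X, y = make_classification(n_samples=200, random_state=1)

# The 'grid': every combination will be tried (2 x 3 = 6 models),
# each evaluated with 3-fold cross-validation
param_grid = {
    'n_estimators': [50, 100],
    'max_depth': [2, 4, 8],
}

grid = GridSearchCV(RandomForestClassifier(random_state=1),
                    param_grid, cv=3)
grid.fit(X, y)

# Pick the winner
print(grid.best_params_)
```

The cost is the grid's size times the number of folds: 6 combinations × 3 folds = 18 model fits here, which grows quickly as you add hyperparameters.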


Remembering Random Search

 

Random Search:

  • Very similar to grid search
  • The main difference is that it selects n random combinations rather than trying every one

This method is faster at finding a reasonable model, but it is not guaranteed to find the best combination in your grid.
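The same task as grid search can be sketched with scikit-learn's `RandomizedSearchCV` (again on a synthetic dataset with illustrative values); note how `n_iter` caps the number of combinations tried, regardless of how large the search space is:

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=1)

# A much larger search space than a grid could exhaust cheaply
param_distributions = {
    'n_estimators': list(range(10, 200)),
    'max_depth': [2, 4, 8, 16, None],
}

# Only n_iter=10 random combinations are sampled and evaluated
search = RandomizedSearchCV(RandomForestClassifier(random_state=1),
                            param_distributions, n_iter=10,
                            cv=3, random_state=1)
search.fit(X, y)
print(search.best_params_)
```

Here 950 possible combinations are reduced to 10 sampled ones, trading the guarantee of finding the grid's best for a large saving in compute.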


From uninformed to informed search

Looking at informed search:

In informed search, each iteration learns from the last, whereas in grid and random search all the modeling is done at once and the best model is then picked.

Informed methods explored were:

  • 'Coarse to Fine' (iterative random search, then grid search)
  • Bayesian hyperparameter tuning, updating beliefs using evidence on model performance
  • Genetic algorithms, evolving your models over generations
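The 'Coarse to Fine' idea can be sketched in plain scikit-learn, where the fine stage's grid is built from the coarse stage's winner (the estimator, ranges, and window width below are illustrative assumptions):

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import GridSearchCV, RandomizedSearchCV

X, y = make_classification(n_samples=200, random_state=1)

# Coarse stage: random search over a wide learning-rate range
coarse = RandomizedSearchCV(
    GradientBoostingClassifier(random_state=1),
    {'learning_rate': list(np.linspace(0.01, 1.0, 100))},
    n_iter=5, cv=3, random_state=1)
coarse.fit(X, y)
best_lr = coarse.best_params_['learning_rate']

# Fine stage: grid search in a narrow band around the coarse winner,
# so this iteration 'learns from' the previous one
fine_grid = {'learning_rate': list(np.linspace(max(best_lr - 0.05, 0.001),
                                               best_lr + 0.05, 5))}
fine = GridSearchCV(GradientBoostingClassifier(random_state=1),
                    fine_grid, cv=3)
fine.fit(X, y)
print(fine.best_params_)
```

Bayesian and genetic approaches automate this feedback loop more fully, using libraries such as Hyperopt and TPOT covered in the course.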

Thank you!
