Grid and random search with H2O

Hyperparameter Tuning in R

Dr. Shirin Elsinghorst

Senior Data Scientist

Hyperparameters in H2O models

  • Hyperparameters for Gradient Boosting:
?h2o.gbm
  • ntrees: Number of trees. Defaults to 50.

  • max_depth: Maximum tree depth. Defaults to 5.

  • min_rows: Fewest allowed (weighted) observations in a leaf. Defaults to 10.

  • learn_rate: Learning rate (from 0.0 to 1.0). Defaults to 0.1.

  • learn_rate_annealing: Scale the learning rate by this factor after each tree (e.g., 0.99 or 0.999). Defaults to 1.
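These arguments can also be set directly when training a single model. A minimal sketch (not from the slides; it assumes the train, x and y objects prepared on the next slide):

gbm_model <- h2o.gbm(x = x,
                     y = y,
                     training_frame = train,
                     ntrees = 100,                 # default: 50
                     max_depth = 5,                # default: 5
                     min_rows = 10,                # default: 10
                     learn_rate = 0.1,             # default: 0.1
                     learn_rate_annealing = 0.99,  # default: 1
                     seed = 42)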

Preparing our data for modeling with H2O

  • Convert to H2O frame
seeds_data_hf <- as.h2o(seeds_data)
  • Identify features and target
y <- "seed_type"
x <- setdiff(colnames(seeds_data_hf), y)
  • Split data into training, validation & test sets
sframe <- h2o.splitFrame(data = seeds_data_hf, ratios = c(0.7, 0.15), seed = 42)
train <- sframe[[1]]
valid <- sframe[[2]]
test <- sframe[[3]]
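A quick sanity check on the split sizes (a hypothetical snippet, not from the slides):

# Roughly 70 / 15 / 15 percent of the rows
h2o.nrow(train)
h2o.nrow(valid)
h2o.nrow(test)
# For classification, the target should be categorical; if it is not,
# convert it before splitting:
# seeds_data_hf[, y] <- as.factor(seeds_data_hf[, y])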

Defining a hyperparameter grid

  • GBM hyperparameters
gbm_params <- list(ntrees = c(100, 150, 200),
                   max_depth = c(3, 5, 7),
                   learn_rate = c(0.001, 0.01, 0.1))
  • Train the grid with the h2o.grid function:
gbm_grid <- h2o.grid("gbm", 
                     grid_id = "gbm_grid",
                     x = x, 
                     y = y,
                     training_frame = train,
                     validation_frame = valid,
                     seed = 42,
                     hyper_params = gbm_params)
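With three candidate values per hyperparameter, the default (Cartesian) strategy trains every combination; a quick check of the grid size:

# 3 x 3 x 3 = 27 models in a full Cartesian grid
prod(lengths(gbm_params))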
  • Examine results with h2o.getGrid

Examining a grid object

  • Examine results for our model gbm_grid with the h2o.getGrid function.

  • Get the grid results sorted by validation accuracy

gbm_gridperf <- h2o.getGrid(grid_id = "gbm_grid",
                            sort_by = "accuracy",
                            decreasing = TRUE)
Grid ID: gbm_grid 
Used hyper parameters: 
  -  learn_rate 
  -  max_depth 
  -  ntrees 
Number of models: 27 
Number of failed models: 0 

Hyper-Parameter Search Summary: ordered by decreasing accuracy
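The sorted summary table and the ordered model IDs live in slots of the returned H2OGrid object; a minimal sketch:

# Hyperparameter combinations with their validation accuracy, best first
gbm_gridperf@summary_table
# Model IDs in the same sorted order
gbm_gridperf@model_ids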

Extracting the best model from a grid

  • The top GBM model, as ranked by validation accuracy, sits at position 1 of the sorted model IDs:
best_gbm <- h2o.getModel(gbm_gridperf@model_ids[[1]])
  • These are the hyperparameters for the best model:
print(best_gbm@model[["model_summary"]])
Model Summary: 
 number_of_trees number_of_internal_trees model_size_in_bytes min_depth
             200                      600              100961         2 
 max_depth mean_depth min_leaves max_leaves mean_leaves
         7    5.22667          3         10     8.38833
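The hyperparameter values that the grid actually set can also be read from the model object; a minimal sketch using the parameters slot:

# Non-default parameters of the winning model
best_gbm@parameters[c("ntrees", "max_depth", "learn_rate")]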

  • best_gbm is a regular H2O model object and can be treated as such!
h2o.performance(best_gbm, test)
MSE: (Extract with `h2o.mse`) 0.04761904
RMSE: (Extract with `h2o.rmse`) 0.2182179
Logloss: (Extract with `h2o.logloss`) ...
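Each metric can be extracted on its own, as the output above suggests; a minimal sketch:

perf <- h2o.performance(best_gbm, test)
h2o.mse(perf)
h2o.rmse(perf)
h2o.logloss(perf)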

Random search with H2O

  • In addition to the hyperparameter grid, define search criteria:
gbm_params <- list(ntrees = c(100, 150, 200),
                   max_depth = c(3, 5, 7),
                   learn_rate = c(0.001, 0.01, 0.1))

search_criteria <- list(strategy = "RandomDiscrete",
                        max_runtime_secs = 60,
                        seed = 42)
gbm_grid <- h2o.grid("gbm",
                     grid_id = "gbm_grid",
                     x = x,
                     y = y,
                     training_frame = train,
                     validation_frame = valid,
                     seed = 42,
                     hyper_params = gbm_params,
                     search_criteria = search_criteria)
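The random grid is inspected exactly like the Cartesian one; a minimal sketch (variable names here are mine):

gbm_random_perf <- h2o.getGrid(grid_id = "gbm_grid",
                               sort_by = "accuracy",
                               decreasing = TRUE)
best_random_gbm <- h2o.getModel(gbm_random_perf@model_ids[[1]])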

  • Alternatively, stop the search early when models no longer improve:
search_criteria <- list(strategy = "RandomDiscrete", 
                        stopping_metric = "mean_per_class_error", 
                        stopping_tolerance = 0.0001, 
                        stopping_rounds = 6)

gbm_grid <- h2o.grid("gbm",
                     x = x,
                     y = y,
                     training_frame = train,
                     validation_frame = valid,
                     seed = 42,
                     hyper_params = gbm_params,
                     search_criteria = search_criteria)
H2O Grid Details
================
Grid ID: gbm_grid 
Used hyper parameters: 
  -  learn_rate 
  -  max_depth 
  -  ntrees 
Number of models: 30 
Number of failed models: 0
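Runtime limits and metric-based stopping are not mutually exclusive; a sketch of one combined search_criteria list (my own combination, not from the slides):

search_criteria <- list(strategy = "RandomDiscrete",
                        max_runtime_secs = 60,
                        stopping_metric = "mean_per_class_error",
                        stopping_tolerance = 0.0001,
                        stopping_rounds = 6,
                        seed = 42)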

Time to practice!
