Machine learning with H2O

Hyperparameter Tuning in R

Dr. Shirin Elsinghorst

Senior Data Scientist

What is H2O?

H2O is an open-source, distributed machine learning platform that runs on the Java Virtual Machine; the h2o R package starts (or connects to) an H2O cluster and lets us train and tune models from R:

library(h2o)
h2o.init()
H2O is not running yet, starting it now...
java version "1.8.0_351"
Java(TM) SE Runtime Environment (build 1.8.0_351-b10)
Java HotSpot(TM) 64-Bit Server VM (build 25.351-b10, mixed mode)
Starting H2O JVM and connecting: ... Connection successful!
R is connected to the H2O cluster: 
    H2O cluster uptime:         1 seconds 620 milliseconds 
    H2O cluster timezone:       UTC 
    H2O data parsing timezone:  UTC 
    H2O cluster version:        3.38.0.1 
    H2O cluster version age:    2 months and 25 days  
    H2O cluster name:           H2O_started_from_R_repl_chk886 
    H2O cluster total nodes:    1 
    H2O cluster total memory:   0.98 GB 
    H2O cluster total cores:    2 
    H2O cluster allowed cores:  2 
    H2O cluster healthy:        TRUE 
    H2O Connection ip:          localhost 
    H2O Connection port:        54321 
    H2O Connection proxy:       NA 
    H2O Internal Security:      FALSE 
    R Version:                  R version 4.2.1 (2022-06-23)
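h2o.init() also accepts arguments to size the local cluster. A minimal sketch (the values below are illustrative, not from the original slide):

# Size the local cluster explicitly; values here are illustrative
h2o.init(nthreads = -1,        # use all available cores
         max_mem_size = "2G")  # cap the JVM heap

# Shut the cluster down when finished
# h2o.shutdown(prompt = FALSE)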

New dataset: seeds data

The seeds data contains geometric measurements of wheat kernels from three different varieties; the variety (seed_type) is the classification target.

glimpse(seeds_data)
Observations: 150
Variables: 8
$ area          <dbl> 15.26, 14.88, 14.29, 13.84 ...
$ perimeter     <dbl> 14.84, 14.57, 14.09, 13.94 ...
$ compactness   <dbl> 0.8710, 0.8811, 0.9050 ...
$ kernel_length <dbl> 5.763, 5.554, 5.291, 5.324 ...
$ kernel_width  <dbl> 3.312, 3.333, 3.337, 3.379 ...
$ asymmetry     <dbl> 2.2210, 1.0180, 2.6990 ...
$ kernel_groove <dbl> 5.220, 4.956, 4.825, 4.805 ...
$ seed_type     <int> 1, 1, 1, 1, 1, 1, 1, 1, 1, ...
seeds_data %>%
  count(seed_type)
# A tibble: 3 x 2
  seed_type     n
      <int> <int>
1         1    50
2         2    50
3         3    50

Preparing the data for modeling with H2O

  • Data as H2O frame

    seeds_data_hf <- as.h2o(seeds_data)
    
  • Define features and target variable

    y <- "seed_type"
    x <- setdiff(colnames(seeds_data_hf), y)
    
  • For classification, the target must be a factor (quick check below)

    seeds_data_hf[, y] <- as.factor(seeds_data_hf[, y])
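
As a quick sanity check (not on the original slide), we can confirm the factor conversion and its three levels:

    h2o.levels(seeds_data_hf[, y])  # expected levels: "1" "2" "3"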
    

Training, validation and test sets

sframe <- h2o.splitFrame(data = seeds_data_hf, 
                         ratios = c(0.7, 0.15),
                         seed = 42)
train <- sframe[[1]]
valid <- sframe[[2]]
test <- sframe[[3]]
summary(train$seed_type, exact_quantiles = TRUE)
seed_type
 1:36     
 2:36     
 3:35
summary(test$seed_type, exact_quantiles = TRUE)
 seed_type
 1:8      
 2:8      
 3:5
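Because h2o.splitFrame() splits approximately (not exactly) at the given ratios, it is worth checking the resulting sizes; nrow() works directly on H2O frames (a quick check, not on the original slide):

nrow(train)
nrow(valid)
nrow(test)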

Model training with H2O

  • Gradient boosted models with h2o.gbm() & h2o.xgboost()
  • Generalized linear models with h2o.glm()
  • Random forest models with h2o.randomForest() (see the sketch below)
  • Neural networks with h2o.deeplearning()
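All of these share the same basic interface. For example, a random forest could be trained like this (a sketch with mostly default settings, not from the original slides):

rf_model <- h2o.randomForest(x = x, y = y,
                             training_frame = train,
                             validation_frame = valid,
                             seed = 42)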

Model training with H2O

gbm_model <- h2o.gbm(x = x, y = y, 
                     training_frame = train, 
                     validation_frame = valid)
Model Details:
==============

H2OMultinomialModel: gbm
Model ID:  GBM_model_R_1540736041817_1 
Model Summary: 
number_of_trees                 50
number_of_internal_trees       150
model_size_in_bytes          24877
min_depth                        2
max_depth                        5
mean_depth                 4.72000
min_leaves                       3
max_leaves                      10
mean_leaves                8.26667
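Beyond the summary, a fitted model can be inspected further, for instance via variable importance (not shown on the original slide):

h2o.varimp(gbm_model)  # feature importance table for the GBM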
  • Model performance
perf <- h2o.performance(gbm_model, test)

h2o.confusionMatrix(perf)
Confusion Matrix: Row labels: Actual class; Column labels: Predicted class
       1 2 3  Error     Rate
1      7 0 1 0.1250 =  1 / 8
2      0 8 0 0.0000 =  0 / 8
3      0 0 5 0.0000 =  0 / 5
Totals 7 8 6 0.0476 = 1 / 21
h2o.logloss(perf)
0.2351779
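Other metrics can be read from the same performance object, for example the mean per-class error (a sketch, not on the original slide):

h2o.mean_per_class_error(perf)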
  • Predict new data
h2o.predict(gbm_model, test)
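h2o.predict() returns an H2O frame with the predicted class and one probability column per class; it can be pulled back into R with as.data.frame() (a sketch, not on the original slide):

pred <- h2o.predict(gbm_model, test)
as.data.frame(pred)  # predicted class plus per-class probabilities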

Let's practice!
