Advanced tuning with mlr

Hyperparameter Tuning in R

Dr. Shirin Elsinghorst

Data Scientist

Advanced tuning controls

  • makeTuneControlCMAES: CMA-ES (covariance matrix adaptation evolution strategy)
  • makeTuneControlDesign: Predefined data frame of hyperparameters
  • makeTuneControlGenSA: Generalized simulated annealing
  • makeTuneControlIrace: Tuning with iterated F-Racing
  • makeTuneControlMBO: Model-based / Bayesian optimization
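Each of these constructors returns a TuneControl object that plugs into tuneParams(). As a minimal sketch (assuming the mlr package is loaded, and using eta/max_depth as the hyperparameter names from the tuning runs below):

```r
library(mlr)

# Predefined design: evaluate exactly these hyperparameter combinations
# (the data frame columns must match the names in your parameter set)
design <- expand.grid(eta = c(0.1, 0.3), max_depth = c(4, 8))
ctrl_design <- makeTuneControlDesign(design = design)

# Model-based / Bayesian optimization (requires the mlrMBO package)
ctrl_mbo <- makeTuneControlMBO()
```

Any of these control objects can then be passed as the control argument of tuneParams(), exactly as with ctrl_gensa below.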
# Generalized simulated annealing
ctrl_gensa <- makeTuneControlGenSA()
# Create bootstrap resampling
bootstrap <- makeResampleDesc("Bootstrap", predict = "both")

# Perform tuning
lrn_tune <- tuneParams(learner = lrn, 
                       task = task, 
                       resampling = bootstrap, 
                       control = ctrl_gensa, 
                       par.set = param_set, 
                       measures = list(acc, mmce))
[Tune-x] 2170: eta=0.0771; max_depth=4
[Tune-y] 2170: acc.test.mean=0.9317275,mmce.test.mean=0.0682725; time: 0.0 m
[Tune-x] 2171: eta=0.822; max_depth=8
[Tune-y] 2171: acc.test.mean=0.9276912,mmce.test.mean=0.0723088; time: 0.0 m
[Tune-x] 2172: eta=0.498; max_depth=4
[Tune-y] 2172: acc.test.mean=0.9311626,mmce.test.mean=0.0688374; time: 0.0 m
[Tune-x] 2173: eta=0.365; max_depth=4
[Tune-y] 2173: acc.test.mean=0.9288406,mmce.test.mean=0.0711594; time: 0.0 m
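Once tuning finishes, the returned TuneResult holds the winning hyperparameter setting and its performance. A short sketch, assuming lrn_tune is the result of the tuneParams() call above:

```r
# Best hyperparameter combination found during tuning
lrn_tune$x   # e.g. a list with eta and max_depth

# Performance of that combination under the chosen measures
lrn_tune$y   # e.g. acc.test.mean and mmce.test.mean

# Apply the winning setting to the learner for final training
lrn_tuned <- setHyperPars(lrn, par.vals = lrn_tune$x)
```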
# Create bootstrap resampling
bootstrap <- makeResampleDesc("Bootstrap", predict = "both")
# Perform tuning
lrn_tune <- tuneParams(learner = lrn, 
                       task = task, 
                       resampling = bootstrap, 
                       control = ctrl_gensa, 
                       par.set = param_set,
                       measures = list(acc, setAggregation(acc, train.mean), 
                                       mmce, setAggregation(mmce, train.mean)))
[Tune-x] 3920: eta=0.294; max_depth=8
[Tune-y] 3920: acc.test.mean=0.9250118,
               acc.train.mean=0.9740000,
               mmce.test.mean=0.0749882,
               mmce.train.mean=0.0260000; 
               time: 0.0 min

Nested cross-validation & nested resampling

lrn_wrapper <- makeTuneWrapper(learner = lrn, 
                               resampling = bootstrap, 
                               control = ctrl_gensa, 
                               par.set = param_set,
                               measures = list(acc, mmce))
  • Either train directly
model_nested <- train(lrn_wrapper, task)
getTuneResult(model_nested)
  • Or wrap it in 2-fold nested cross-validation
cv2 <- makeResampleDesc("CV", iters = 2)
res <- resample(lrn_wrapper, task, 
                resampling = cv2, 
                extract = getTuneResult)
generateHyperParsEffectData(res)
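The hyperparameter effect data from the nested resampling result can also be plotted. A sketch, assuming res comes from the resample() call above and eta is one of the tuned hyperparameters:

```r
# Collect hyperparameter settings and performance across tuning iterations
hpe <- generateHyperParsEffectData(res)

# Visualize how eta relates to test accuracy over the tuning run
plotHyperParsEffect(hpe, x = "eta", y = "acc.test.mean",
                    plot.type = "line")
```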

Choose hyperparameters from a tuning set

lrn_best <- setHyperPars(lrn, par.vals = list(minsplit = 4, 
                                              minbucket = 3, 
                                              maxdepth = 6))

model_best <- train(lrn_best, task)
predict(model_best, newdata = knowledge_test_data)
Prediction: 30 observations
predict.type: response
threshold: 
time: 0.00
  truth response
1  High     High
2  High     High
3  High     High
4  High     High
5  High     High
6  High     High
... (#rows: 30, #cols: 2)
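The prediction object can be scored against the true labels shown in the truth column. A sketch, assuming knowledge_test_data contains the target variable:

```r
# Evaluate predictions on the held-out test set
pred <- predict(model_best, newdata = knowledge_test_data)

# Accuracy and mean misclassification error of the final model
performance(pred, measures = list(acc, mmce))

# Per-class breakdown of correct and incorrect predictions
calculateConfusionMatrix(pred)
```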

It's your turn!

