Advanced Tuning with mlr

Hyperparameter Tuning in R

Dr. Shirin Elsinghorst

Data Scientist

Advanced Tuning Controls

  • makeTuneControlCMAES: CMA evolution strategy
  • makeTuneControlDesign: predefined data frame of hyperparameters
  • makeTuneControlGenSA: generalized simulated annealing
  • makeTuneControlIrace: tuning with iterated F-racing
  • makeTuneControlMBO: model-based / Bayesian optimization
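Each of these controls plugs into the same `tuneParams()` interface. A minimal sketch of constructing a few of them; the learner and the `eta`/`max_depth` search space are assumptions chosen to match the GenSA example later in this lesson:

```r
library(mlr)

# Assumed setup: an xgboost classifier with the eta/max_depth
# search space used in the later slides
lrn <- makeLearner("classif.xgboost")
param_set <- makeParamSet(
  makeNumericParam("eta", lower = 0.001, upper = 1),
  makeIntegerParam("max_depth", lower = 1, upper = 10)
)

# Each constructor configures a different search strategy
ctrl_cmaes  <- makeTuneControlCMAES()   # CMA evolution strategy
ctrl_gensa  <- makeTuneControlGenSA()   # generalized simulated annealing
ctrl_design <- makeTuneControlDesign(   # predefined hyperparameter grid
  design = data.frame(eta = c(0.1, 0.3), max_depth = c(4L, 8L)))
ctrl_mbo    <- makeTuneControlMBO()     # model-based / Bayesian optimization
```

Whichever control you build, it is passed to `tuneParams()` via the `control` argument, as shown below.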
# Generalized simulated annealing
ctrl_gensa <- makeTuneControlGenSA()
# Create bootstrap resampling
bootstrap <- makeResampleDesc("Bootstrap", predict = "both")

# Run tuning
lrn_tune <- tuneParams(learner = lrn, 
                       task = task, 
                       resampling = bootstrap, 
                       control = ctrl_gensa, 
                       par.set = param_set, 
                       measures = list(acc, mmce))
[Tune-x] 2170: eta=0.0771; max_depth=4
[Tune-y] 2170: acc.test.mean=0.9317275,mmce.test.mean=0.0682725; time: 0.0 m
[Tune-x] 2171: eta=0.822; max_depth=8
[Tune-y] 2171: acc.test.mean=0.9276912,mmce.test.mean=0.0723088; time: 0.0 m
[Tune-x] 2172: eta=0.498; max_depth=4
[Tune-y] 2172: acc.test.mean=0.9311626,mmce.test.mean=0.0688374; time: 0.0 m
[Tune-x] 2173: eta=0.365; max_depth=4
[Tune-y] 2173: acc.test.mean=0.9288406,mmce.test.mean=0.0711594; time: 0.0 m
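The returned `TuneResult` object stores the best configuration found during the search. A quick sketch of inspecting it, assuming the `lrn_tune` object from the call above:

```r
# Best hyperparameter combination found during tuning
lrn_tune$x

# Performance of that combination (here: acc.test.mean, mmce.test.mean)
lrn_tune$y
```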
# Create bootstrap resampling
bootstrap <- makeResampleDesc("Bootstrap", predict = "both")
# Run tuning
lrn_tune <- tuneParams(learner = lrn, 
                       task = task, 
                       resampling = bootstrap, 
                       control = ctrl_gensa, 
                       par.set = param_set,
                       measures = list(acc, setAggregation(acc, train.mean), 
                                       mmce, setAggregation(mmce, train.mean)))
[Tune-x] 3920: eta=0.294; max_depth=8
[Tune-y] 3920: acc.test.mean=0.9250118,
               acc.train.mean=0.9740000,
               mmce.test.mean=0.0749882,
               mmce.train.mean=0.0260000; 
               time: 0.0 min
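Comparing the `train.mean` and `test.mean` aggregations is what reveals overfitting: in the output above, train accuracy (0.974) clearly exceeds test accuracy (0.925). A minimal check on the result object, assuming `lrn_tune` from the call above:

```r
# Named performance vector of the best configuration
perf <- lrn_tune$y

# A large train/test gap indicates overfitting
perf[["acc.train.mean"]] - perf[["acc.test.mean"]]
```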

Nested Cross-Validation & Resampling

lrn_wrapper <- makeTuneWrapper(learner = lrn, 
                               resampling = bootstrap, 
                               control = ctrl_gensa, 
                               par.set = param_set,
                               measures = list(acc, mmce))
  • Either train directly
model_nested <-  train(lrn_wrapper, task)
getTuneResult(model_nested)
  • Or 2-fold nested cross-validation
cv2 <- makeResampleDesc("CV", iters = 2)
res <- resample(lrn_wrapper, task, 
                resampling = cv2, 
                extract = getTuneResult)
generateHyperParsEffectData(res)
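To compare the winning hyperparameters across the outer folds, mlr also offers dedicated helpers on the resample result. A sketch, assuming the `res` object from above:

```r
# Best hyperparameters chosen in each outer CV fold
getNestedTuneResultsX(res)

# Full optimization path of every inner tuning run
getNestedTuneResultsOptPathDf(res)
```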

Choose hyperparameters from a tuning set

lrn_best <- setHyperPars(lrn, par.vals = list(minsplit = 4, 
                                              minbucket = 3, 
                                              maxdepth = 6))

model_best <- train(lrn_best, task)
predict(model_best, newdata = knowledge_test_data)
Prediction: 30 observations
predict.type: response
threshold: 
time: 0.00
  truth response
1  High     High
2  High     High
3  High     High
4  High     High
5  High     High
6  High     High
... (#rows: 30, #cols: 2)
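To quantify how well the tuned model generalizes, the prediction object can be scored directly. A sketch, assuming `model_best` and `knowledge_test_data` from above:

```r
pred <- predict(model_best, newdata = knowledge_test_data)

# Accuracy and mean misclassification error on the test set
performance(pred, measures = list(acc, mmce))

# Per-class breakdown of correct and incorrect predictions
calculateConfusionMatrix(pred)
```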

Your turn!
