Hyperparameter Tuning in R
Dr. Shirin Elsinghorst
Data Scientist
mlr is another framework for machine learning in R. Model training follows three steps: define the task, define the learner, and fit the model.
library(tidyverse)
glimpse(knowledge_data)
Observations: 150
Variables: 6
$ STG <dbl> 0.080, 0.000, 0.180, 0.100, 0.120, 0.090, 0.080, 0.150, ...
$ SCG <dbl> 0.080, 0.000, 0.180, 0.100, 0.120, 0.300, 0.325, 0.275, ...
$ STR <dbl> 0.100, 0.500, 0.550, 0.700, 0.750, 0.680, 0.620, 0.800, ...
$ LPR <dbl> 0.24, 0.20, 0.30, 0.15, 0.35, 0.18, 0.94, 0.21, 0.19, ...
$ PEG <dbl> 0.90, 0.85, 0.81, 0.90, 0.80, 0.85, 0.56, 0.81, 0.82, ...
$ UNS <chr> "High", "High", "High", "High", "High", "High", ...
knowledge_data %>%
count(UNS)
# A tibble: 3 x 2
UNS n
<chr> <int>
1 High 50
2 Low 50
3 Middle 50
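The model-building code below uses knowledge_train_data; here is a minimal sketch of how such a split could be created from knowledge_data (the 80/20 proportion and the seed are assumptions, not part of the original material):
set.seed(42)
# Randomly select 80% of the rows for training, keep the rest for testing
train_idx <- sample(nrow(knowledge_data), size = floor(0.8 * nrow(knowledge_data)))
knowledge_train_data <- knowledge_data[train_idx, ]
knowledge_test_data  <- knowledge_data[-train_idx, ]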
RegrTask() for regression
ClassifTask() for binary and multi-class classification
MultilabelTask() for multi-label classification problems
CostSensTask() for general cost-sensitive classification
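The other constructors follow the same pattern as the classification task shown below; as a sketch, a regression task on a hypothetical data frame with a numeric target could be created like this (regression_data and "price" are placeholders):
# Hypothetical regression task; data frame and target column are placeholders
regr_task <- makeRegrTask(data = regression_data,
                          target = "price")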
With our student knowledge dataset we can build a classifier:
library(mlr)

task <- makeClassifTask(data = knowledge_train_data,
                        target = "UNS")
listLearners()
class package
1 classif.ada ada,rpart
2 classif.adaboostm1 RWeka
3 classif.bartMachine bartMachine
4 classif.binomial stats
5 classif.boosting adabag,rpart
6 classif.bst bst,rpart
7 classif.C50 C50
8 classif.cforest party
9 classif.clusterSVM SwarmSVM,LiblineaR
10 classif.ctree party
...
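Rather than scanning the full list, listLearners() can also be restricted to learners that are applicable to a given task or that support certain properties; a minimal sketch:
# Learners that can handle our classification task and predict probabilities
listLearners(task, properties = "prob")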
# Define learner
lrn <- makeLearner("classif.h2o.deeplearning",
fix.factors.prediction = TRUE,
predict.type = "prob")
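Since the goal is hyperparameter tuning, it helps to see which hyperparameters a learner exposes; mlr's getParamSet() prints them. A minimal sketch:
# List the tunable hyperparameters of the deep learning learner
getParamSet("classif.h2o.deeplearning")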
library(tictoc)

tic()
# Define task
task <- makeClassifTask(data = knowledge_train_data,
target = "UNS")
# Define learner
lrn <- makeLearner("classif.h2o.deeplearning",
fix.factors.prediction = TRUE)
# Fit model
model <- train(lrn,
task)
toc()
3.782 sec elapsed
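With the model trained, predictions on new data follow the same mlr interface; a minimal sketch, assuming the held-out knowledge_test_data set from the split above:
# Predict on held-out data and summarise performance
pred <- predict(model, newdata = knowledge_test_data)
calculateConfusionMatrix(pred)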