mlr_tuning_spaces_default {mlr3tuningspaces}    R Documentation
Default Tuning Spaces
Description
Tuning spaces from the Bischl et al. (2021) article. Default tuning spaces are provided for the glmnet, kknn, ranger, rpart, svm and xgboost learners.
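Each of the spaces below is registered in the mlr_tuning_spaces dictionary and can be retrieved with lts(). A minimal sketch; the key "classif.rpart.default" is an assumed example of the <task type>.<learner>.default naming used by the package:

library(mlr3tuningspaces)

# list all registered tuning spaces
as.data.table(mlr_tuning_spaces)

# retrieve a single default space by key (assumed key: "classif.rpart.default")
tuning_space = lts("classif.rpart.default")
tuning_space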
glmnet tuning space
s        [1e-04, 10000]   Log scale
alpha    [0, 1]
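The intervals above map one-to-one onto a paradox search space; "Log scale" means the parameter is tuned on a logarithmic scale. A sketch of a hand-written equivalent, assuming paradox's logscale argument:

library(paradox)

# hand-written equivalent of the glmnet space:
# logscale = TRUE tunes log(s) uniformly and back-transforms with exp()
glmnet_space = ps(
  s     = p_dbl(1e-04, 1e+04, logscale = TRUE),
  alpha = p_dbl(0, 1)
)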
kknn tuning space
k          [1, 50]   Log scale
distance   [1, 5]
kernel     ["rectangular", "optimal", "epanechnikov", "biweight", "triweight", "cos", "inv", "gaussian", "rank"]
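Categorical parameters such as kernel become factor parameters, and the integer k is again tuned on a log scale. A hand-written sketch of an equivalent definition (illustrative, not taken from the package):

library(paradox)

# hand-written equivalent of the kknn space
kknn_space = ps(
  k        = p_int(1, 50, logscale = TRUE),
  distance = p_dbl(1, 5),
  kernel   = p_fct(c("rectangular", "optimal", "epanechnikov", "biweight",
                     "triweight", "cos", "inv", "gaussian", "rank"))
)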
ranger tuning space
mtry.ratio       [0, 1]
replace          [TRUE, FALSE]
sample.fraction  [0.1, 1]
num.trees        [1, 2000]
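A default space can also be attached directly to a learner, which marks the listed parameters for tuning. A minimal sketch, assuming mlr3learners and the ranger package are installed:

library(mlr3learners)
library(mlr3tuningspaces)

# passing a learner to lts() applies its default tuning space
learner = lts(lrn("classif.ranger"))

# the listed parameters are now set to tune tokens
learner$param_set$values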
rpart tuning space
minsplit   [2, 128]       Log scale
minbucket  [1, 64]        Log scale
cp         [1e-04, 0.1]   Log scale
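In practice such a space is used through mlr3tuning. A minimal end-to-end sketch; the argument names follow current mlr3tuning versions and may differ slightly in older releases, and the task and evaluation budget are arbitrary choices for illustration:

library(mlr3tuning)
library(mlr3tuningspaces)

# learner with the default rpart space attached
learner = lts(lrn("classif.rpart"))

# random search over the default space on an example task
instance = tune(
  tuner      = tnr("random_search"),
  task       = tsk("sonar"),
  learner    = learner,
  resampling = rsmp("cv", folds = 3),
  measures   = msr("classif.ce"),
  term_evals = 20
)

instance$result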
svm tuning space
cost     [1e-04, 10000]   Log scale
kernel   ["polynomial", "radial", "sigmoid", "linear"]
degree   [2, 5]
gamma    [1e-04, 10000]   Log scale
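Note that degree is only meaningful for the polynomial kernel and gamma only for kernels that use it, so a hand-written version of this space would typically add dependencies; whether the shipped default space encodes them is not shown in the table above. A sketch using paradox's depends argument:

library(paradox)

# hand-written svm space with explicit dependencies (illustrative, not
# necessarily identical to the shipped default space)
svm_space = ps(
  cost   = p_dbl(1e-04, 1e+04, logscale = TRUE),
  kernel = p_fct(c("polynomial", "radial", "sigmoid", "linear")),
  degree = p_int(2, 5, depends = kernel == "polynomial"),
  gamma  = p_dbl(1e-04, 1e+04, logscale = TRUE,
                 depends = kernel %in% c("polynomial", "radial", "sigmoid"))
)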
xgboost tuning space
eta                [1e-04, 1]      Log scale
nrounds            [1, 5000]
max_depth          [1, 20]
colsample_bytree   [0.1, 1]
colsample_bylevel  [0.1, 1]
lambda             [0.001, 1000]   Log scale
alpha              [0.001, 1000]   Log scale
subsample          [0.1, 1]
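Individual entries of a default space can be overridden when it is retrieved, for example to shrink the nrounds budget. A sketch, assuming lts() accepts named overrides as in recent mlr3tuningspaces versions; the narrower range is an arbitrary illustration:

library(paradox)
library(mlr3tuningspaces)

# retrieve the default xgboost space but override one parameter
tuning_space = lts("classif.xgboost.default", nrounds = to_tune(1, 500))

# values holds one tune token per parameter in the space
tuning_space$values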
Source
Bischl B, Binder M, Lang M, Pielok T, Richter J, Coors S, Thomas J, Ullmann T, Becker M, Boulesteix A, Deng D, Lindauer M (2021). “Hyperparameter Optimization: Foundations, Algorithms, Best Practices and Open Challenges.” arXiv:2107.05847, https://arxiv.org/abs/2107.05847.