mlr_loop_functions_ego {mlr3mbo}    R Documentation

Sequential Single-Objective Bayesian Optimization

Description

Loop function for sequential single-objective Bayesian Optimization. Normally used inside an OptimizerMbo.

In each iteration after the initial design, the surrogate and acquisition function are updated and the next candidate is chosen based on optimizing the acquisition function.
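As an illustration only (this is not how bayesopt_ego is implemented), the overall loop structure can be sketched in plain base R, using a quadratic lm() fit in place of the surrogate and a grid search with a lower-confidence-bound rule in place of the acquisition function and its optimizer:

set.seed(1)
fun = function(x) (x - 2)^2                    # objective to minimize
xs = runif(4, min = -10, max = 10)             # initial design (4 * d points, d = 1)
ys = fun(xs)
for (iter in 1:10) {
  # update the "surrogate" on all evaluations made so far
  fit = lm(y ~ poly(x, 2), data = data.frame(x = xs, y = ys))
  # "optimize the acquisition function": pick the most promising grid point
  grid = seq(-10, 10, length.out = 1001)
  pred = predict(fit, newdata = data.frame(x = grid), se.fit = TRUE)
  x_next = grid[which.min(pred$fit - 2 * pred$se.fit)]
  # evaluate the chosen candidate and append it to the archive
  xs = c(xs, x_next)
  ys = c(ys, fun(x_next))
}
xs[which.min(ys)]                              # incumbent after the loop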

Usage

bayesopt_ego(
  instance,
  surrogate,
  acq_function,
  acq_optimizer,
  init_design_size = NULL,
  random_interleave_iter = 0L
)

Arguments

instance

(bbotk::OptimInstanceSingleCrit)
The bbotk::OptimInstanceSingleCrit to be optimized.

surrogate

(Surrogate)
The Surrogate to be used. Typically a SurrogateLearner.

acq_function

(AcqFunction)
AcqFunction to be used as acquisition function.

acq_optimizer

(AcqOptimizer)
AcqOptimizer to be used as acquisition function optimizer.

init_design_size

(NULL | integer(1))
Size of the initial design. If NULL and the bbotk::Archive contains no evaluations, 4 * d is used, with d being the dimensionality of the search space. Points are generated via a Sobol sequence (see the sketch after this argument list).

random_interleave_iter

(integer(1))
Every random_interleave_iter iteration (starting after the initial design), a point is sampled uniformly at random and evaluated (instead of a model-based proposal). For example, if random_interleave_iter = 2, random interleaving is performed in the second, fourth, sixth, ... iteration. Default is 0, i.e., no random interleaving is performed at all.
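Relating to init_design_size above: with the one-dimensional search space used in the examples below, the default initial design would hold 4 * 1 = 4 points. A minimal sketch of how such a design could be generated manually, assuming paradox's generate_design_sobol() is available (illustrative only, not part of bayesopt_ego's interface):

library(paradox)
search_space = ps(x = p_dbl(lower = -10, upper = 10))
d = search_space$length                        # dimensionality of the search space
design = generate_design_sobol(search_space, n = 4 * d)
design$data                                    # data.table with the 4 Sobol points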

Value

invisible(instance)
The original instance is modified in-place and returned invisibly.

Note

References

See Also

Other Loop Function: loop_function, mlr_loop_functions, mlr_loop_functions_emo, mlr_loop_functions_mpcl, mlr_loop_functions_parego, mlr_loop_functions_smsego

Examples


if (requireNamespace("mlr3learners") &&
    requireNamespace("DiceKriging") &&
    requireNamespace("rgenoud")) {

  library(bbotk)
  library(paradox)
  library(mlr3learners)

  fun = function(xs) {
    list(y = xs$x ^ 2)
  }
  domain = ps(x = p_dbl(lower = -10, upper = 10))
  codomain = ps(y = p_dbl(tags = "minimize"))
  objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

  instance = OptimInstanceSingleCrit$new(
    objective = objective,
    terminator = trm("evals", n_evals = 5))

  surrogate = default_surrogate(instance)

  acq_function = acqf("ei")

  acq_optimizer = acqo(
    optimizer = opt("random_search", batch_size = 100),
    terminator = trm("evals", n_evals = 100))

  optimizer = opt("mbo",
    loop_function = bayesopt_ego,
    surrogate = surrogate,
    acq_function = acq_function,
    acq_optimizer = acq_optimizer)

  optimizer$optimize(instance)
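
  # (illustrative addition, not part of the original example)
  # the instance is modified in-place; the best point found so far can be
  # inspected via its result field:
  instance$result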

  # expected improvement per second example
  fun = function(xs) {
    list(y = xs$x ^ 2, time = abs(xs$x))
  }
  domain = ps(x = p_dbl(lower = -10, upper = 10))
  codomain = ps(y = p_dbl(tags = "minimize"), time = p_dbl(tags = "time"))
  objective = ObjectiveRFun$new(fun = fun, domain = domain, codomain = codomain)

  instance = OptimInstanceSingleCrit$new(
    objective = objective,
    terminator = trm("evals", n_evals = 5))

  surrogate = default_surrogate(instance, n_learner = 2)
  surrogate$cols_y = c("y", "time")

  optimizer = opt("mbo",
    loop_function = bayesopt_ego,
    surrogate = surrogate,
    acq_function = acqf("eips"),
    acq_optimizer = acq_optimizer)

  optimizer$optimize(instance)
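
  # (illustrative addition, not part of the original example)
  # the archive holds all evaluations, including the logged runtime column
  # "time" that acqf("eips") uses:
  instance$archive$data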
}


[Package mlr3mbo version 0.2.2]