EGO.cst {DiceOptim}  R Documentation 
Sequential constrained Expected Improvement maximization and model re-estimation, with a number of iterations fixed in advance by the user
Description
Executes nsteps iterations of EGO methods integrating constraints, based on objects of class km.
At each step, the kriging models are re-estimated (including re-estimation of the covariance parameters)
based on the initial design points plus the points visited during all previous iterations;
then a new point is obtained by maximizing one of the available constrained Expected Improvement criteria.
Usage
EGO.cst(
model.fun = NULL,
fun,
cheapfun = NULL,
model.constraint,
constraint,
equality = FALSE,
crit = "EFI",
nsteps,
lower,
upper,
type = "UK",
cov.reestim = TRUE,
critcontrol = NULL,
optimcontrol = list(method = "genoud", threshold = 1e-05, distance = "euclidean",
notrace = FALSE),
...
)
Arguments
model.fun
object of class km corresponding to the objective function,
fun
scalar function to be minimized, corresponding to model.fun,
cheapfun
optional scalar function to use if the objective is a fast-to-evaluate function (handled next with class fastfun), in which case model.fun and fun should be set to NULL,
model.constraint
either one or a list of models of class km, one per constraint,
constraint
vectorial function corresponding to the constraints, see details below,
equality
either FALSE if all constraints are inequalities, or a boolean vector indicating which constraints are equalities,
crit
choice of constrained improvement function: "EFI", "AL" or "SUR" (see details below),
nsteps
an integer representing the desired number of iterations,
lower
vector of lower bounds for the variables to be optimized over,
upper
vector of upper bounds for the variables to be optimized over,
type
"SK" or "UK" (by default), depending whether uncertainty related to trend estimation has to be taken into account,
cov.reestim
optional boolean specifying if the kriging hyperparameters should be re-estimated at each iteration,
critcontrol
optional list of parameters for criterion crit, see details below,
optimcontrol
an optional list of control parameters for optimization of the selected infill criterion, see critcst_optimizer,
...
additional parameters to be given to the objective fun and to the constraints constraint.
Details
Extension of the function EGO.nsteps to constrained optimization.
The problem considered is of the form: min f(x) s.t. g(x) <= 0, g having a vectorial output.
By default all its components are supposed to be inequalities, but a boolean vector in equality can be used
to specify which are equality constraints.
In this case, the tolerance on the constraints can be modified through the tolConstraints component of critcontrol:
an optional vector giving a tolerance for each of the constraints (equality or inequality).
It is highly recommended to use it when there are equality constraints, since the default tolerance of 0.05
may not be suited to such cases.
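For instance, the tolerance vector can be passed as follows (a minimal sketch, reusing the models, bounds and the cstfun constraint function built in the Examples section below; the tolerance values 0.01 and 0.05 are illustrative):

```r
# Sketch: first constraint handled as an equality with tolerance 0.01,
# second as an inequality with tolerance 0.05 (objects from the Examples)
res <- EGO.cst(model.fun = model.fun, fun = fun,
               model.constraint = model.constraint, constraint = cstfun,
               equality = c(TRUE, FALSE), crit = "EFI",
               nsteps = 2, lower = lower, upper = upper,
               critcontrol = list(tolConstraints = c(0.01, 0.05)))
```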
Available infill criteria with crit are:
Expected Feasible Improvement ("EFI"): crit_EFI,
Augmented Lagrangian ("AL"): crit_AL,
Stepwise Uncertainty Reduction of the excursion volume ("SUR"): crit_SUR_cst.
Depending on the selected criterion, various parameters are available;
more details are given in the corresponding help pages.
It is possible to consider a cheap-to-evaluate objective function subject to expensive constraints.
In this case, provide only a function in cheapfun, with both model.fun and fun set to NULL; see the examples below.
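A minimal sketch of such a call (reusing objects from the Examples section below; the objective fun is treated as cheap, so no kriging model is built for it):

```r
# Sketch: cheap objective passed through 'cheapfun'; model.fun and fun are NULL
res <- EGO.cst(model.fun = NULL, fun = NULL, cheapfun = fun,
               model.constraint = model.constraint, constraint = cstfun,
               crit = "EFI", nsteps = 2, lower = lower, upper = upper)
```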
Value
A list with components:
par: a matrix representing the additional points visited during the algorithm,
values: a vector representing the response (objective) values at the points given in par,
constraint: a matrix representing the constraint values at the points given in par,
feasibility: a boolean vector indicating whether the points given in par respect the constraints,
nsteps: an integer representing the desired number of iterations (given in argument),
lastmodel.fun: an object of class km corresponding to the objective function,
lastmodel.constraint: one or a list of objects of class km corresponding to the last kriging models fitted to the constraints.
If a problem occurs during either model updates or criterion maximization, the last working model and corresponding values are returned.
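The best feasible point found can then be recovered from these components, for example (a minimal sketch, assuming res holds the list returned by EGO.cst and that at least one added point is feasible):

```r
# Sketch: best feasible added point from the EGO.cst output 'res'
feasible <- res$feasibility
if (any(feasible)) {
  best   <- which.min(res$values[feasible])
  x.best <- res$par[feasible, , drop = FALSE][best, ]
  f.best <- res$values[feasible][best]
}
```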
Author(s)
Victor Picheny
Mickael Binois
References
D.R. Jones, M. Schonlau, and W.J. Welch (1998), Efficient global optimization of expensive black-box functions, Journal of Global Optimization, 13, 455-492.
M. Schonlau, W.J. Welch, and D.R. Jones (1998), Global versus local search in constrained optimization of computer models, Lecture Notes-Monograph Series, 11-25.
M.J. Sasena, P. Papalambros, and P. Goovaerts (2002), Exploration of metamodeling sampling criteria for constrained global optimization, Engineering Optimization, 34, 263-278.
R.B. Gramacy, G.A. Gray, S. Le Digabel, H.K.H. Lee, P. Ranjan, G. Wells, and S.M. Wild (2014+), Modeling an augmented Lagrangian for improved black-box constrained optimization, arXiv preprint arXiv:1403.4890.
J.M. Parr (2012), Improvement criteria for constraint handling and multi-objective optimization, University of Southampton.
V. Picheny (2014), A stepwise uncertainty reduction approach to constrained global optimization, Proceedings of the 17th International Conference on Artificial Intelligence and Statistics, JMLR W&CP 33, 787-795.
See Also
critcst_optimizer, crit_EFI, crit_AL, crit_SUR_cst, easyEGO.cst
Examples
#
# 2D objective function, 3 cases
#
set.seed(25468)
library(DiceDesign)
n_var <- 2
fun <- goldsteinprice
fun1.cst <- function(x){return(branin(x) + 25)}
fun2.cst <- function(x){return(3/2 - x[1] - 2*x[2] - .5*sin(2*pi*(x[1]^2 - 2*x[2])))}
# Constraint function with vectorial output
cstfun <- function(x){
return(c(fun1.cst(x), fun2.cst(x)))
}
# For illustration purposes
n.grid <- 31
test.grid < expand.grid(X1 = seq(0, 1, length.out = n.grid), X2 = seq(0, 1, length.out = n.grid))
obj.grid < apply(test.grid, 1, fun)
cst1.grid < apply(test.grid, 1, fun1.cst)
cst2.grid < apply(test.grid, 1, fun2.cst)
# Initial set of observations and models
n.init <- 12
design.grid <- round(maximinESE_LHS(lhsDesign(n.init, n_var, seed = 42)$design)$design, 1)
obj.init <- apply(design.grid, 1, fun)
cst1.init <- apply(design.grid, 1, fun1.cst)
cst2.init <- apply(design.grid, 1, fun2.cst)
model.fun <- km(~., design = design.grid, response = obj.init)
model.constraint1 <- km(~., design = design.grid, response = cst1.init, lower = c(.2, .2))
model.constraint2 <- km(~., design = design.grid, response = cst2.init, lower = c(.2, .2))
model.constraint <- list(model.constraint1, model.constraint2)
lower <- rep(0, n_var)
upper <- rep(1, n_var)
#
# 1 Expected Feasible Improvement criterion, expensive objective function,
# two inequality constraints, 5 iterations, using genoud
#
cstEGO <- EGO.cst(model.fun = model.fun, fun = fun, model.constraint = model.constraint,
crit = "EFI", constraint = cstfun, equality = FALSE, lower = lower,
upper = upper, nsteps = 5, optimcontrol = list(method = "genoud", maxit = 20))
# Plots: objective function in colour, constraint boundaries in red
# Initial DoE: white circles, added points: blue crosses, best solution: red cross
filled.contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid), nlevels = 50,
matrix(obj.grid, n.grid), main = "Two inequality constraints",
xlab = expression(x[1]), ylab = expression(x[2]), color = terrain.colors,
plot.axes = {axis(1); axis(2);
points(design.grid[,1], design.grid[,2], pch = 21, bg = "white")
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(cst1.grid, n.grid), level = 0, add=TRUE,drawlabels=FALSE,
lwd=1.5, col = "red")
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(cst2.grid, n.grid), level = 0, add=TRUE,drawlabels=FALSE,
lwd=1.5, col = "red")
points(cstEGO$par, col = "blue", pch = 4, lwd = 2)
}
)
#
# 2 Augmented Lagrangian Improvement criterion, expensive objective function,
# one inequality and one equality constraint, using a discrete set of candidates (grid)
#
cstEGO2 <- EGO.cst(model.fun = model.fun, fun = fun, model.constraint = model.constraint,
crit = "AL", constraint = cstfun, equality = c(TRUE, FALSE), lower = lower,
upper = upper, nsteps = 10,
critcontrol = list(tolConstraints = c(2, 0), always.update=TRUE),
optimcontrol=list(method="discrete", candidate.points=as.matrix(test.grid)))
# Plots: objective function in colour, inequality constraint boundary in red,
# equality constraint in orange
# Initial DoE: white circles, added points: blue crosses, best solution: red cross
filled.contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid), nlevels = 50,
matrix(obj.grid, n.grid),
main = "Inequality (red) and equality (orange) constraints",
xlab = expression(x[1]), ylab = expression(x[2]), color = terrain.colors,
plot.axes = {axis(1); axis(2);
points(design.grid[,1], design.grid[,2], pch = 21, bg = "white")
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(cst1.grid, n.grid), level = 0, add=TRUE,
drawlabels=FALSE,lwd=1.5, col = "orange")
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(cst2.grid, n.grid), level = 0, add=TRUE,
drawlabels=FALSE,lwd=1.5, col = "red")
points(cstEGO2$par, col = "blue", pch = 4, lwd = 2)
}
)
#
# 3 Stepwise Uncertainty Reduction criterion, fast objective function,
# single inequality constraint, 5 steps, importance sampling scheme
#
cstEGO3 <- EGO.cst(model.fun = NULL, fun = NULL, cheapfun = fun,
model.constraint = model.constraint2, constraint = fun2.cst,
crit = "SUR", lower = lower, upper = upper,
nsteps = 5, critcontrol = list(distrib = "SUR"))
# Plots: objective function in colour, inequality constraint boundary in red,
# Initial DoE: white circles, added points: blue crosses, best solution: red cross
filled.contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid), nlevels = 50,
matrix(obj.grid, n.grid), main = "Single constraint, fast objective",
xlab = expression(x[1]), ylab = expression(x[2]), color = terrain.colors,
plot.axes = {axis(1); axis(2);
points(design.grid[,1], design.grid[,2], pch = 21, bg = "white")
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(obj.grid, n.grid), nlevels = 10, add = TRUE,
drawlabels = TRUE)
contour(seq(0, 1, length.out = n.grid), seq(0, 1, length.out = n.grid),
matrix(cst2.grid, n.grid), level = 0, add=TRUE,
drawlabels=FALSE,lwd=1.5, col = "black")
points(cstEGO3$par, col = "blue", pch = 4, lwd = 2)
}
)