GSS {skedastic}		R Documentation
Golden Section Search for Minimising a Univariate Function over a Closed Interval
Description
Golden Section Search (GSS) is a useful algorithm for minimising a continuous univariate function f(x) over an interval [a, b] in instances where the first derivative f'(x) is not easily obtainable, such as with loss functions that need to be minimised under cross-validation to tune a hyperparameter in a machine learning model. The method is described by Fox (2021).
Usage
GSS(f, a, b, tol = 1e-08, maxitgss = 100L, ...)
Arguments
f: A function of one variable that returns a numeric value.

a: A numeric of length 1 representing the lower endpoint of the search interval.

b: A numeric of length 1 representing the upper endpoint of the search interval; must be greater than a.

tol: A numeric of length 1 representing the tolerance used to determine when the search interval has been narrowed sufficiently for convergence.

maxitgss: An integer of length 1 representing the maximum number of iterations to use in the search.

...: Additional arguments to pass to f.
Details
This function is modelled after a MATLAB function written by Zarnowiec (2022). The method assumes that there is one local minimum in this interval. The solution produced is also an interval, the width of which can be made arbitrarily small by setting a tolerance value. Since the desired solution is a single point, the midpoint of the final interval can be taken as the best approximation to the solution.
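For instance, a usage sketch with the quadratic from the Examples section (the bound below assumes the search converged and that argmin is reported as the midpoint of the final interval, as described above):

f <- function(x) (x - 1) ^ 2
res <- GSS(f, a = 0, b = 2, tol = 1e-4)
## the midpoint of an interval of width less than tol lies within tol/2
## of every point of that interval, including the true minimiser at 1
abs(res$argmin - 1) < 1e-4 / 2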
The premise of the method is to shorten the interval [a, b] by comparing the values of the function at two test points, x_1 < x_2, where x_1 = b - r(b - a) and x_2 = a + r(b - a), and r = (sqrt(5) - 1)/2 ≈ 0.618 is the reciprocal of the golden ratio. One compares f(x_1) to f(x_2) and updates the search interval [a, b] as follows:
- If f(x_1) < f(x_2), the solution cannot lie in [x_2, b]; thus, update the search interval to [a_new, b_new] = [a, x_2].

- If f(x_1) > f(x_2), the solution cannot lie in [a, x_1]; thus, update the search interval to [a_new, b_new] = [x_1, b].
One then chooses two new test points by replacing a and b with a_new and b_new and recalculating x_1 and x_2 based on these new endpoints. One continues iterating in this fashion until b - a < tau, where tau is the desired tolerance.
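The iteration just described can be written as a short standalone loop. The following is a minimal sketch, not the package's implementation; gss_sketch is a hypothetical name, and it recomputes both test points each pass rather than reusing one function evaluation as an efficient implementation would.

gss_sketch <- function(f, a, b, tol = 1e-8, maxit = 100L) {
  r <- (sqrt(5) - 1) / 2             # reciprocal of the golden ratio
  for (i in seq_len(maxit)) {
    if (b - a < tol) break           # converged: interval narrow enough
    x1 <- b - r * (b - a)            # lower test point
    x2 <- a + r * (b - a)            # upper test point
    if (f(x1) < f(x2)) b <- x2 else a <- x1
  }
  (a + b) / 2                        # midpoint of the final interval
}
gss_sketch(function(x) (x - 1) ^ 2, 0, 2)  # approximately 1

Note that each iteration multiplies the interval width by r, since both candidate intervals [a, x_2] and [x_1, b] have width r(b - a); the width after n iterations is therefore r^n (b - a).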
Value
A list object containing the following:

- argmin, the argument of f that minimises f

- funmin, the minimum value of f achieved at argmin

- converged, a logical indicating whether the convergence tolerance was satisfied

- iterations, an integer indicating the number of search iterations used
References
Zarnowiec K (2022). Golden Section Method Algorithm. MATLAB Central File Exchange, https://www.mathworks.com/matlabcentral/fileexchange/25919-golden-section-method-algorithm. Accessed 2022-10-19.
Fox WP (2021).
Nonlinear Optimization: Models and Applications, 1st edition.
Chapman and Hall/CRC, Boca Raton, FL.
See Also
goldsect is similar to this function, but does not allow the user to pass additional arguments to f.
Examples
f <- function(x) (x - 1) ^ 2
GSS(f, a = 0, b = 2)
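The following further examples are sketches under the documented interface; g and h are hypothetical functions introduced for illustration.

## Extra arguments are forwarded to f via ...
g <- function(x, p) abs(x - 1) ^ p
GSS(g, a = 0, b = 2, p = 3)

## No derivative is required; h is non-differentiable at its minimum
h <- function(x) abs(x - 0.5)
GSS(h, a = 0, b = 1)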