harris {evclust}    R Documentation

Harris gradient-based optimization algorithm

Description

The optimization algorithm implemented in harris is described in Silva and Almeida (1990) and summarized in Denoeux and Masson (2004). The four components of the options vector are:

options[1]

Display parameter: 1 (default) displays some results.

options[2]

Maximum number of iterations (default: 100).

options[3]

Relative error for stopping criterion (default: 1e-4).

options[4]

Number of iterations between two displays (default: 10).
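
For instance, the display can be turned off and the iteration budget increased by passing a custom options vector. The following is a minimal sketch; it assumes, beyond what is stated above, that a first component of 0 suppresses the display:

## quadratic objective returning its value and its gradient
f <- function(x) list(fun = sum(x^2), grad = 2 * x)
## no display, at most 1000 iterations, relative error 1e-6,
## display interval of 50 (irrelevant when the display is off)
opt <- harris(f, rnorm(5), options = c(0, 1000, 1e-6, 50))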

Usage

harris(fun, x, options = c(1, 100, 1e-04, 10), tr = FALSE, ...)

Arguments

fun

Function to be optimized. It should return a list with two components: fun, the scalar value of the objective function at x, and grad, a vector containing the partial derivatives of the objective function at x (a sketch of such a function is given after this argument list).

x

Initial value (a vector).

options

Vector of four parameters (see the Description section).

tr

If TRUE, a trace of the objective function values versus CPU time is returned.

...

Additional arguments passed to fun (see the last example in the Examples section).
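
As a sketch of the interface expected for fun, the two-dimensional Rosenbrock function could be written as follows (it is used here purely as an illustration and is not part of the package):

## Rosenbrock function and its gradient, returned as a list
## with components 'fun' and 'grad'
rosen <- function(x) {
  f <- 100 * (x[2] - x[1]^2)^2 + (1 - x[1])^2
  g <- c(-400 * x[1] * (x[2] - x[1]^2) - 2 * (1 - x[1]),
         200 * (x[2] - x[1]^2))
  list(fun = f, grad = g)
}
opt <- harris(rosen, c(-1.2, 1))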

Value

A list with three components:

par

The minimizer of fun found.

value

The value of fun at par.

trace

The trace (returned only if tr==TRUE), a list with two components: 'time' and 'fct'.
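
When tr==TRUE, the recorded trace can be used, for example, to plot the objective function against CPU time. The following sketch relies only on the 'time' and 'fct' components described above:

opt <- harris(function(x) list(fun = sum(x^2), grad = 2 * x), rnorm(2), tr = TRUE)
plot(opt$trace$time, opt$trace$fct, type = "l",
     xlab = "CPU time", ylab = "objective function")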

Author(s)

Thierry Denoeux.

References

F. M. Silva and L. B. Almeida. Speeding up backpropagation. In Advanced Neural Computers, R. Eckmiller, ed., Elsevier/North-Holland, New York, 151-158, 1990.

T. Denoeux and M.-H. Masson. EVCLUS: Evidential Clustering of Proximity Data. IEEE Transactions on Systems, Man and Cybernetics B, Vol. 34, Issue 1, 95–109, 2004.

See Also

pcca

Examples

## minimize the squared Euclidean norm, starting from a random point in R^2
opt <- harris(function(x) list(fun = sum(x^2), grad = 2 * x), rnorm(2), tr = TRUE)
print(c(opt$par, opt$value))
## evolution of the objective function over the iterations
plot(opt$trace$fct, type = "l")
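
## A further sketch (not from the original documentation) showing how
## additional arguments are forwarded to fun through '...':
## weighted quadratic; the weight vector 'a' is supplied via '...'
fw <- function(x, a) list(fun = sum(a * x^2), grad = 2 * a * x)
opt2 <- harris(fw, rnorm(3), a = c(1, 10, 100))
print(opt2$par)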

