ENNreg_cv {evreg} | R Documentation
Hyperparameter tuning for the ENNreg model using cross-validation
Description
ENNreg_cv tunes parameters xi and rho of the ENNreg model using cross-validation.
Usage
ENNreg_cv(
  X,
  y,
  K,
  batch = TRUE,
  folds = NULL,
  Kfold = 5,
  XI,
  RHO,
  nstart = 100,
  c = 1,
  lambda = 0.9,
  eps = NULL,
  nu = 1e-16,
  optimProto = TRUE,
  verbose = TRUE,
  options = list(maxiter = 1000, rel.error = 1e-04, print = 10),
  opt.rmsprop = list(batch_size = 100, epsi = 0.001, rho = 0.9, delta = 1e-08, Dtmax = 100)
)
Arguments
X: Input matrix of size n x p, where n is the number of objects and p the number of attributes.
y: Vector of length n containing observations of the response variable.
K: Number of prototypes.
batch: If TRUE (default), batch learning is used; otherwise, online learning is used.
folds: Vector of length n containing the folds (integers between 1 and Kfold).
Kfold: Number of folds (default = 5, used only if folds is not provided).
XI: Vector of candidate values for hyperparameter xi.
RHO: Vector of candidate values for hyperparameter rho.
nstart: Number of random starts of the k-means algorithm (default: 100).
c: Multiplicative coefficient applied to scale parameter gamma (default: 1).
lambda: Parameter of the loss function (default = 0.9).
eps: Parameter of the loss function (if NULL, fixed to 0.01 times the standard deviation of y).
nu: Parameter of the loss function to avoid a division by zero (default = 1e-16).
optimProto: If TRUE (default), the initial prototypes are optimized.
verbose: If TRUE (default), intermediate results are displayed.
options: Parameters of the optimization algorithm (see ENNreg).
opt.rmsprop: Parameters of the RMSprop optimization algorithm (see ENNreg); an illustrative call is sketched below.
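As a minimal sketch of overriding the optimizer-related arguments (illustrative settings only, not recommendations; X and y are assumed to be an already-defined predictor matrix and response vector, with evreg attached):

# Illustrative only: online learning with custom optimizer settings
cv_online <- ENNreg_cv(X, y, K = 30, batch = FALSE,
                       XI = c(0.1, 1, 10), RHO = c(0.1, 1, 10),
                       options = list(maxiter = 500, rel.error = 1e-04, print = 0),
                       opt.rmsprop = list(batch_size = 50, epsi = 0.001, rho = 0.9,
                                          delta = 1e-08, Dtmax = 100))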
Details
Either the folds (a vector of the same length as y, such that folds[i] equals the fold, between 1 and Kfold, containing observation i) or the number of folds must be provided.
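A short sketch of the two ways to specify the folds (X and y assumed already defined, evreg attached):

# Option 1: supply an explicit fold assignment, with folds[i] in 1..Kfold
set.seed(1)
my_folds <- sample(rep(1:5, length.out = length(y)))
cv1 <- ENNreg_cv(X, y, K = 30, folds = my_folds, XI = c(0.1, 1, 10), RHO = c(0.1, 1, 10))
# Option 2: supply only the number of folds and let ENNreg_cv assign them
cv2 <- ENNreg_cv(X, y, K = 30, Kfold = 5, XI = c(0.1, 1, 10), RHO = c(0.1, 1, 10))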
Arguments options and opt.rmsprop are passed to function ENNreg.
Value
A list with three components:
- xi: Optimal value of xi.
- rho: Optimal value of rho.
- RMS: Matrix of root mean squared error values.
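A brief sketch of inspecting the returned object; the layout assumed in the comment (rows of RMS indexed by the XI candidates, columns by the RHO candidates) is an assumption, not stated above:

cv <- ENNreg_cv(X, y, K = 30, XI = c(0.1, 1, 10), RHO = c(0.1, 1, 10))
print(cv$RMS)                                         # cross-validated RMSE for each candidate pair
best <- which(cv$RMS == min(cv$RMS), arr.ind = TRUE)  # assumed layout: rows = XI, cols = RHO
c(xi = cv$xi, rho = cv$rho)                           # selected values, as returned by the function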
References
Thierry Denoeux. An evidential neural network model for regression based on random fuzzy numbers. In "Belief functions: Theory and applications (proc. of BELIEF 2022)", pages 57-66, Springer, 2022.
Thierry Denoeux. Quantifying prediction uncertainty in regression using random fuzzy sets: the ENNreg model. IEEE Transactions on Fuzzy Systems, Vol. 31, Issue 10, pages 3690-3699, 2023.
See Also
ENNreg, predict.ENNreg
Examples
# Boston dataset
library(MASS)
X <- as.matrix(scale(Boston[, 1:13]))  # standardized predictors
y <- Boston[, 14]                      # response: median home value (medv)
set.seed(220322)
n <- nrow(Boston)
ntrain <- round(0.7 * n)
train <- sample(n, ntrain)
# Cross-validated grid search over xi and rho
cv <- ENNreg_cv(X = X[train, ], y = y[train], K = 30, XI = c(0.1, 1, 10), RHO = c(0.1, 1, 10))
cv$RMS
# Refit with the selected hyperparameters and evaluate on the held-out test set
fit <- ENNreg(X[train, ], y[train], K = 30, xi = cv$xi, rho = cv$rho)
pred <- predict(fit, newdata = X[-train, ], yt = y[-train])
print(pred$RMS)