predict.IASSMR {fsemipar}		R Documentation
Prediction for MFPLSIM
Description
predict method for the multi-functional partial linear single-index model (MFPLSIM) fitted using IASSMR.kernel.fit or IASSMR.kNN.fit.
Usage
## S3 method for class 'IASSMR.kernel'
predict(object, newdata.x = NULL, newdata.z = NULL,
y.test = NULL, option = NULL, ...)
## S3 method for class 'IASSMR.kNN'
predict(object, newdata.x = NULL, newdata.z = NULL,
y.test = NULL, option = NULL, knearest.n = object$knearest,
min.knn.n = object$min.knn, max.knn.n = object$max.knn,
step.n = object$step, ...)
Arguments
object
Output of one of the functions mentioned in the Description (an object of class IASSMR.kernel or IASSMR.kNN).
newdata.x
A matrix containing new observations of the functional covariate in the functional single-index component, collected by row.
newdata.z
A matrix containing new observations of the scalar covariates derived from the discretisation of a curve, collected by row.
y.test
(optional) A vector containing the new observations of the response.
option
Allows the choice between 1, 2 and 3. The default is 1. See the section Details.
...
Further arguments.
knearest.n
Only used for objects of class IASSMR.kNN when option=2: sequence of eligible values for the number of nearest neighbours used to predict the functional single-index component (see Details). The default is object$knearest.
min.knn.n
Only used for objects of class IASSMR.kNN when option=2: minimum value of the sequence of eligible numbers of nearest neighbours. The default is object$min.knn.
max.knn.n
Only used for objects of class IASSMR.kNN when option=2: maximum value of the sequence of eligible numbers of nearest neighbours. The default is object$max.knn.
step.n
Only used for objects of class IASSMR.kNN when option=2: step used to build the sequence of eligible numbers of nearest neighbours. The default is object$step.
Details
Three options are provided to obtain the predictions of the response for newdata.x and newdata.z:

If option=1, we maintain all the estimates (k.opt or h.opt, theta.est and beta.est) to predict the functional single-index component of the model. Since the estimates from the second step of the algorithm are used, only train.2 acts as training sample for prediction. It should therefore be noted that k.opt or h.opt may not be suitable to predict the functional single-index component of the model.

If option=2, we maintain theta.est and beta.est, while the tuning parameter (h or k) is selected again to predict the functional single-index component of the model. This selection is performed using the leave-one-out cross-validation criterion in the associated functional single-index model and the complete training sample (i.e. train=c(train.1,train.2)). As the entire training sample is used (not just a subsample of it), the sample size changes and, as a consequence, the parameters knearest, min.knn, max.knn and step given to the function IASSMR.kNN.fit may need to be provided again to compute predictions. For that purpose, the arguments knearest.n, min.knn.n, max.knn.n and step.n are added.

If option=3, we maintain only the indexes of the relevant variables selected by the IASSMR. The linear coefficients and the functional index are estimated again by means of sfplsim.kernel.fit or sfplsim.kNN.fit, respectively, without penalisation (setting lambda.seq=0) and using the whole training sample (train=c(train.1,train.2)). The method provides two predictions (and two MSEPs), as sketched below:

a) The prediction associated with option=1 for the class sfplsim.kernel or sfplsim.kNN.

b) The prediction associated with option=2 for the class sfplsim.kernel or sfplsim.kNN.

(See the documentation of the functions predict.sfplsim.kernel and predict.sfplsim.kNN.)
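For instance (an informal sketch; fit.kernel and the data objects are placeholders matching the Examples section below), a call with option=3 returns both predictions described in items a) and b):

pred3 <- predict(fit.kernel, newdata.x = x.sug[test, ],
  newdata.z = z.sug[test, ], y.test = y.sug[test], option = 3)
# pred3 contains two sets of predictions (and two MSEPs): one obtained as in
# option=1 and one obtained as in option=2 of predict.sfplsim.kernel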
Value
The function returns the predicted values of the response (y) for newdata.x and newdata.z. If !is.null(y.test), it also provides the mean squared error of prediction (MSEP) computed as mean((y-y.test)^2).
If option=3, two sets of predictions (and two MSEPs) are provided, corresponding to the items a) and b) mentioned in the section Details.
If is.null(newdata.x) or is.null(newdata.z), the function returns the fitted values.
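As a minimal sketch (assuming the predicted values for the test sample have been extracted into a numeric vector pred), the reported MSEP corresponds to:

# MSEP computed by hand, matching mean((y - y.test)^2) above
mean((pred - y.sug[test])^2)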
Author(s)
German Aneiros Perez german.aneiros@udc.es
Silvia Novo Diaz snovo@est-econ.uc3m.es
See Also
sfplsim.kernel.fit, sfplsim.kNN.fit, IASSMR.kernel.fit, IASSMR.kNN.fit.
Examples
data(Sugar)
y <- Sugar$ash
x <- Sugar$wave.290
z <- Sugar$wave.240
#Outliers
index.y.25 <- y > 25
index.atip <- index.y.25
(1:268)[index.atip]
#Dataset to model
x.sug <- x[!index.atip,]
z.sug <- z[!index.atip,]
y.sug <- y[!index.atip]
train <- 1:216
test <- 217:266
#Fit
fit.kernel<-IASSMR.kernel.fit(x=x.sug[train,],z=z.sug[train,], y=y.sug[train],
train.1=1:108,train.2=109:216,nknot.theta=2,lambda.min.h=0.03,
lambda.min.l=0.03, max.q.h=0.35, nknot=20,criterion="BIC",
max.iter=5000)
fit.kNN<- IASSMR.kNN.fit(x=x.sug[train,],z=z.sug[train,], y=y.sug[train],
train.1=1:108,train.2=109:216,nknot.theta=2,lambda.min.h=0.07,
lambda.min.l=0.07, max.knn=20, nknot=20,criterion="BIC",
max.iter=5000)
#Predictions
predict(fit.kernel,newdata.x=x.sug[test,],newdata.z=z.sug[test,],y.test=y.sug[test],option=2)
predict(fit.kNN,newdata.x=x.sug[test,],newdata.z=z.sug[test,],y.test=y.sug[test],option=2)
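#Further illustrative calls (not part of the original package examples):
#option=1 reuses the second-step estimates; option=3 refits without
#penalisation and returns two sets of predictions; for the kNN fit, the
#sequence of eligible neighbours can also be supplied explicitly via
#knearest.n (the sequence shown is only an example).
predict(fit.kernel,newdata.x=x.sug[test,],newdata.z=z.sug[test,],y.test=y.sug[test],option=1)
predict(fit.kernel,newdata.x=x.sug[test,],newdata.z=z.sug[test,],y.test=y.sug[test],option=3)
predict(fit.kNN,newdata.x=x.sug[test,],newdata.z=z.sug[test,],y.test=y.sug[test],option=2,
  knearest.n=seq(2,20,by=2))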