mrpls {plsgenomics}    R Documentation
Ridge Partial Least Square for categorical data
Description
The function mrpls performs prediction using the MRPLS algorithm of Fort et al. (2005).
Usage
mrpls(Ytrain,Xtrain,Lambda,ncomp,Xtest=NULL,NbIterMax=50)
Arguments
Xtrain
    a (ntrain x p) data matrix of predictors.
Ytrain
    a ntrain vector of responses, with values in {0,...,c}.
Xtest
    a (ntest x p) matrix containing the predictors for the test data set.
Lambda
    a positive real value: the ridge regularization parameter.
ncomp
    a positive integer: the number of PLS components.
NbIterMax
    a positive integer: the maximal number of iterations allowed in the iterative parts of the algorithm.
Details
The columns of the data matrices Xtrain and Xtest need not be standardized, since standardization is performed by the function mrpls as a preliminary step before the algorithm is run.
The procedure described in Fort et al. (2005) is used to determine the latent components used for classification. When Xtest is not NULL, the procedure also predicts the labels for these new predictor variables.
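The preliminary standardization step can be illustrated with a small sketch. This is hypothetical code, not the package's internal implementation: it centers and scales each column by training-set statistics and drops zero-variance columns, which is the kind of bookkeeping reflected in the DeletedCol component of the result.

```r
# Hypothetical sketch of the preliminary standardization performed by mrpls
# (illustration only; names and details are assumptions, not package code).
standardize_train_test <- function(Xtrain, Xtest = NULL) {
  mu <- colMeans(Xtrain)
  sigma <- apply(Xtrain, 2, sd)
  keep <- sigma > 0                         # zero-variance predictors are dropped
  Xtrain_s <- scale(Xtrain[, keep, drop = FALSE],
                    center = mu[keep], scale = sigma[keep])
  Xtest_s <- if (!is.null(Xtest)) {
    # test data are scaled with the TRAINING means and standard deviations
    scale(Xtest[, keep, drop = FALSE], center = mu[keep], scale = sigma[keep])
  } else NULL
  list(Xtrain = Xtrain_s, Xtest = Xtest_s, DeletedCol = which(!keep))
}

set.seed(1)
X <- cbind(matrix(rnorm(20), 5, 4), 0)      # last column has zero variance
out <- standardize_train_test(X)
out$DeletedCol                              # column 5 was removed
```

Note that the test matrix is standardized with the training statistics, so calling mrpls with pre-standardized data is unnecessary but harmless.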
Value
A list with the following components:
Coefficients
    the (p+1) x c matrix containing the coefficients weighting the block design matrix.
hatY
    the ntrain vector containing the estimated {0,...,c}-valued labels for the observations from Xtrain.
hatYtest
    the ntest vector containing the predicted {0,...,c}-valued labels for the observations from Xtest.
proba
    the ntrain vector containing the estimated probabilities for the observations from Xtrain.
proba.test
    the ntest vector containing the predicted probabilities for the observations from Xtest.
DeletedCol
    the vector containing the column numbers of Xtrain whose predictor variables have zero variance and were therefore removed; NULL if no column was removed.
hatYtest_k
    if ncomp is greater than 1, the (ntest x ncomp) matrix whose kth column contains the labels predicted using k components.
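The relationship between a (p+1) x c coefficient matrix like the returned Coefficients and the predicted labels and probabilities can be sketched in base R. This is an illustration of the multinomial-logit form with class 0 as the reference class, using a made-up matrix B, not the package's own prediction code:

```r
# Hedged sketch: turning a (p+1) x c coefficient matrix into class
# probabilities for {0,...,c}-valued labels (class 0 is the reference).
# B and X below are made up for illustration.
predict_labels <- function(B, X) {
  eta <- cbind(1, X) %*% B                   # ntest x c linear predictors
  expeta <- exp(cbind(0, eta))               # prepend eta = 0 for class 0
  proba <- expeta / rowSums(expeta)          # softmax over classes 0,...,c
  list(proba = proba, hatY = max.col(proba) - 1)  # back to 0-based labels
}

set.seed(2)
p <- 3                                       # number of predictors
nclass <- 2                                  # c, the non-reference classes
B <- matrix(rnorm((p + 1) * nclass), p + 1, nclass)  # stand-in coefficients
X <- matrix(rnorm(4 * p), 4, p)              # four new observations
out <- predict_labels(B, X)
rowSums(out$proba)                           # each row sums to 1
```

The predicted label for a row is simply the class with the largest probability, which is what the Examples section computes by hand for a single sample.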
Author(s)
Sophie Lambert-Lacroix (http://membres-timc.imag.fr/Sophie.Lambert/).
References
G. Fort, S. Lambert-Lacroix and J. Peyre (2005). Réduction de dimension dans les modèles linéaires généralisés : application à la classification supervisée de données issues des biopuces [Dimension reduction in generalized linear models: application to the supervised classification of microarray data]. Journal de la SFDS, tome 146, no. 1-2, 117-152.
See Also
Examples
# load plsgenomics library
library(plsgenomics)
# load SRBCT data
data(SRBCT)
IndexLearn <- c(sample(which(SRBCT$Y==1),10),sample(which(SRBCT$Y==2),4),
sample(which(SRBCT$Y==3),7),sample(which(SRBCT$Y==4),9))
# perform prediction by MRPLS
res <- mrpls(Ytrain=SRBCT$Y[IndexLearn]-1,Xtrain=SRBCT$X[IndexLearn,],Lambda=0.001,ncomp=2,
Xtest=SRBCT$X[-IndexLearn,])
# number of misclassified test observations
sum(res$hatYtest!=SRBCT$Y[-IndexLearn]-1)
# prediction for another sample
Xnew <- SRBCT$X[83,]
# Compute the linear predictor for each class except class 1
eta <- diag(t(cbind(c(1,Xnew),c(1,Xnew),c(1,Xnew))) %*% res$Coefficients)
# which.max over c(0,eta) gives an index in 1:(c+1); subtract 1 for the
# {0,...,c}-valued label, then add 1 to recover the original SRBCT coding
Ypred <- which.max(c(0,eta))-1
Ypred+1
SRBCT$Y[83]
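The diag(t(...) %*% ...) construction above computes one inner product per class; with made-up stand-in numbers (not the SRBCT data or a real mrpls fit), the same linear predictor comes directly from a single crossprod() call:

```r
# Equivalence check with made-up numbers: Coefficients and Xnew below are
# stand-ins, not values returned by mrpls.
set.seed(3)
Coefficients <- matrix(rnorm(5 * 3), 5, 3)   # stand-in (p+1) x c matrix
Xnew <- rnorm(4)                             # stand-in new observation
eta1 <- diag(t(cbind(c(1, Xnew), c(1, Xnew), c(1, Xnew))) %*% Coefficients)
eta2 <- c(crossprod(c(1, Xnew), Coefficients))
all.equal(eta1, eta2)                        # the two forms agree
```

The crossprod() form avoids building the repeated-column matrix and scales to any number of classes without editing the cbind() call.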