dkplsr {rchemo} | R Documentation
Direct KPLSR Models
Description
Direct kernel PLSR (DKPLSR) (Bennett & Embrechts 2003). The method builds a kernel Gram matrix and then runs a usual PLSR algorithm on it. This is faster than (but not equivalent to) the "true" NIPALS KPLSR algorithm described in Rosipal & Trejo (2001).
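The sketch below illustrates the idea only, not the function's internals: it assumes rchemo's krbf() for the Gram matrix and plskern() as the usual PLSR; the data and the gamma value are arbitrary.

## Minimal sketch of the DKPLSR idea (an illustration, not the internals)
library(rchemo)
n <- 8 ; p <- 4
X <- matrix(rnorm(n * p), ncol = p)
y <- rnorm(n)
K <- krbf(X, X, gamma = .8)     # (n, n) kernel Gram matrix
fm <- plskern(K, y, nlv = 2)    # usual PLSR run on the Gram matrix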
Usage
dkplsr(X, Y, weights = NULL, nlv, kern = "krbf", ...)
## S3 method for class 'Dkplsr'
transform(object, X, ..., nlv = NULL)
## S3 method for class 'Dkplsr'
coef(object, ..., nlv = NULL)
## S3 method for class 'Dkplsr'
predict(object, X, ..., nlv = NULL)
Arguments
X: For the main function: Matrix with the training X-data (n, p). For the auxiliary functions: Matrix with new X-data (m, p) to consider.

Y: Matrix with the training Y-data (n, q).

weights: Vector of weights (n) to apply to the training observations. Default to NULL (uniform weights).

nlv: For the main function: The number(s) of LVs to calculate. For the auxiliary functions: The number(s) of LVs to consider.

kern: Name of the function defining the considered kernel for building the Gram matrix. See krbf for syntax and the other available kernel functions.

...: Optional arguments to pass in the kernel function defined in kern (e.g. gamma for krbf, or degree and coef0 for kpol); see the sketch after this list.

object: For the auxiliary functions: A fitted model, output of a call to the main function.
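A short usage sketch (simulated data, arbitrary settings) of how the arguments in ... are forwarded to the kernel function named in kern; the calls mirror the Examples below:

## `gamma` goes to krbf; `degree` and `coef0` go to kpol
n <- 6 ; p <- 4
Xtrain <- matrix(rnorm(n * p), ncol = p)
ytrain <- rnorm(n)
fm1 <- dkplsr(Xtrain, ytrain, nlv = 2, kern = "krbf", gamma = .8)
fm2 <- dkplsr(Xtrain, ytrain, nlv = 2, kern = "kpol", degree = 2, coef0 = 10)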
Value

For dkplsr:

X: Matrix with the training X-data (n, p).

fm: List with the outputs of the PLSR run on the Gram matrix.

K: Kernel Gram matrix (n, n) computed on the training X-data.

kern: Kernel function.

dots: Optional arguments passed in the kernel function.
For transform.Dkplsr: A matrix (m, nlv) with the projection of the new X-data on the X-scores.
For predict.Dkplsr:

pred: A list of matrices (m, q) with the Y predicted values for the new X-data, one matrix per number of LVs given in nlv.

K: Kernel Gram matrix (m, n) computed between the new X-data and the training X-data.
For coef.Dkplsr:

int: Matrix (1, nlv) with the intercepts.

B: Matrix (n, nlv) with the coefficients.
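A hedged check of the shapes documented above (simulated data; the kernel settings are arbitrary, and the "expected" dimensions in the comments follow the descriptions above):

n <- 6 ; p <- 4 ; m <- 3
Xtrain <- matrix(rnorm(n * p), ncol = p)
Ytrain <- cbind(y1 = rnorm(n), y2 = rnorm(n))
Xtest <- matrix(rnorm(m * p), ncol = p)
fm <- dkplsr(Xtrain, Ytrain, nlv = 2, kern = "krbf", gamma = .8)
dim(fm$K)                    # expected (n, n): training Gram matrix
dim(transform(fm, Xtest))    # expected (m, nlv): projections of the new data
dim(predict(fm, Xtest)$K)    # expected (m, n): Gram matrix for the new data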
Note
The second example concerns the fitting of the function sinc(x) = sin(x)/x described in Rosipal & Trejo (2001), pp. 105-106.
References
Bennett, K.P., Embrechts, M.J., 2003. An optimization perspective on kernel partial least squares regression, in: Advances in Learning Theory: Methods, Models and Applications, NATO Science Series III: Computer & Systems Sciences. IOS Press, Amsterdam, pp. 227-250.
Rosipal, R., Trejo, L.J., 2001. Kernel Partial Least Squares Regression in Reproducing Kernel Hilbert Space. Journal of Machine Learning Research 2, 97-123.
Examples
## EXAMPLE 1
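## Small simulated training set; the test set reuses the first m training
## rows, so the predictions can be compared with known responses.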
n <- 6 ; p <- 4
Xtrain <- matrix(rnorm(n * p), ncol = p)
ytrain <- rnorm(n)
Ytrain <- cbind(y1 = ytrain, y2 = 100 * ytrain)
m <- 3
Xtest <- Xtrain[1:m, , drop = FALSE]
Ytest <- Ytrain[1:m, , drop = FALSE] ; ytest <- Ytest[1:m, 1]
nlv <- 2
fm <- dkplsr(Xtrain, Ytrain, nlv = nlv, kern = "krbf", gamma = .8)
transform(fm, Xtest)
transform(fm, Xtest, nlv = 1)
coef(fm)
coef(fm, nlv = 1)
predict(fm, Xtest)
predict(fm, Xtest, nlv = 0:nlv)$pred
pred <- predict(fm, Xtest)$pred
msep(pred, Ytest)
nlv <- 2
fm <- dkplsr(Xtrain, Ytrain, nlv = nlv, kern = "kpol", degree = 2, coef0 = 10)
predict(fm, Xtest, nlv = nlv)
## EXAMPLE 2
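## Noisy sinc data (Rosipal & Trejo 2001): zy is the true curve
## sin(|x|) / |x|; y adds Gaussian noise; x == 0 is shifted slightly
## to avoid a 0/0.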
x <- seq(-10, 10, by = .2)
x[x == 0] <- 1e-5
n <- length(x)
zy <- sin(abs(x)) / abs(x)
y <- zy + rnorm(n, 0, .2)
plot(x, y, type = "p")
lines(x, zy, lty = 2)
X <- matrix(x, ncol = 1)
nlv <- 3
fm <- dkplsr(X, y, nlv = nlv)
pred <- predict(fm, X)$pred
plot(X, y, type = "p")
lines(X, zy, lty = 2)
lines(X, pred, col = "red")