qkgda {qkerntool}    R Documentation
qKernel Generalized Discriminant Analysis
Description
The qkernel Generalized Discriminant Analysis is a method for nonlinear discriminant analysis based on a kernel function operator.
Usage
## S4 method for signature 'matrix'
qkgda(x, label, kernel = "rbfbase", qpar = list(sigma = 0.1, q = 0.9),
features = 0, th = 1e-4, na.action = na.omit, ...)
## S4 method for signature 'cndkernmatrix'
qkgda(x, label, features = 0, th = 1e-4, na.action = na.omit, ...)
## S4 method for signature 'qkernmatrix'
qkgda(x, label, features = 0, th = 1e-4, ...)
Arguments
x
the data matrix indexed by row, or a kernel matrix of class cndkernmatrix or qkernmatrix.

label
The original labels of the samples.

kernel
the kernel function used in training and predicting. This parameter can be set to any function, of class kernel, which computes a kernel function value between two vector arguments. qkerntool provides the most popular kernel functions, which can be used by setting the kernel parameter to the corresponding kernel name string (for example "rbfbase", the default, or "ratibase", used in the Examples below). The kernel parameter can also be set to a user defined function of class kernel by passing the function name as an argument.

qpar
the list of hyper-parameters (kernel parameters). This is a list which contains the parameters to be used with the kernel function; which parameters are valid depends on the chosen kernel, e.g. sigma and q for "rbfbase" (the defaults shown in Usage) or c and q for "ratibase". A minimal call using these defaults is sketched after this argument list. Hyper-parameters for user defined kernels can be passed through the qpar parameter as well.

features
Number of features (principal components) to return. (default: 0, all)

th
the value of the eigenvalue under which principal components are ignored (only valid when features = 0). (default: 0.0001)

na.action
A function to specify the action to be taken if NAs are found. The default action is na.omit, which leads to rejection of cases with missing values on any required variable. An alternative is na.fail, which causes an error if NA cases are found. (NOTE: If given, this argument must be named.)

...
additional parameters
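For concreteness, a minimal call of the matrix method with the default kernel and the hyper-parameter values shown in the Usage section might look like the sketch below (the Examples section further down uses the "ratibase" kernel instead).

library(qkerntool)
x <- as.matrix(iris[, -5])      # feature matrix, one sample per row
y <- as.numeric(iris$Species)   # numeric class labels 1..3
res <- qkgda(x, label = y, kernel = "rbfbase",
             qpar = list(sigma = 0.1, q = 0.9), features = 2)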
Details
The qkernel Generalized Discriminant Analysis method provides a mapping of the input vectors into a high dimensional feature space, generalizing the classical Linear Discriminant Analysis to non-linear discriminant analysis.
The data can be passed to the qkgda function as a matrix; in addition, qkgda also supports input in the form of a kernel matrix of class qkernmatrix or class cndkernmatrix (a minimal sketch of the kernel-matrix interface is given below).
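As a rough illustration of the kernel-matrix interface, the sketch below first builds a qkernmatrix from the data and then passes it to qkgda. It assumes that qkerntool exposes a rbfbase() kernel constructor and a qkernmatrix() function for evaluating it on the data; both names are assumptions here, so check the package's kernel documentation before relying on them.

library(qkerntool)
x <- as.matrix(iris[, -5])      # 150 x 4 feature matrix
y <- as.numeric(iris$Species)   # class labels 1..3

## assumed constructors: rbfbase() builds a qkernel object and
## qkernmatrix() evaluates it on the rows of x
qk <- rbfbase(sigma = 0.1, q = 0.9)
K  <- qkernmatrix(qk, x)

## the qkernmatrix method of qkgda accepts the precomputed kernel matrix
gda <- qkgda(K, label = y, features = 2)
prj(gda)                        # projections, as in the Examples below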
Value
An S4 object containing the eigenvectors and their normalized projections, along with the corresponding eigenvalues and the original kernel function.
prj
The normalized projections on the eigenvectors

eVal
The corresponding eigenvalues

eVec
The corresponding eigenvectors

kcall
The formula of the function call

cndkernf
The kernel function used

xmatrix
The original data matrix
All the slots of the object can be accessed by accessor functions.
Note
The predict function can be used to embed new data into the new discriminant space; a short sketch is given at the end of the Examples section.
Author(s)
Yusen Zhang
yusenzhang@126.com
References
1. Baudat, G. and Anouar, F.:
Generalized discriminant analysis using a kernel approach.
Neural Computation 12(10) (2000), 2385.
2. Deng Cai, Xiaofei He, and Jiawei Han:
Speed Up Kernel Discriminant Analysis.
The VLDB Journal, January 2011, vol. 20, no. 1, 21-33.
See Also
Examples
Iris <- data.frame(rbind(iris3[,,1], iris3[,,2], iris3[,,3]), Sp = rep(c("1","2","3"), rep(50,3)))
testset <- sample(1:150, 20)
train <- as.matrix(Iris[-testset, -5])
test <- as.matrix(Iris[testset, -5])
labels <- as.numeric(Iris$Sp)
trainlabel <- labels[-testset]
testlabel <- labels[testset]
kgda1 <- qkgda(train, label = trainlabel, kernel = "ratibase", qpar = list(c = 1, q = 0.9), features = 2)
prj(kgda1)    # the normalized projections (principal component vectors)
eVal(kgda1)   # the corresponding eigenvalues
eVec(kgda1)   # the corresponding eigenvectors
kcall(kgda1)  # the call that created the object
# xmatrix(kgda1)

# plot the projection of the training data on the first two components
plot(prj(kgda1), col = as.integer(trainlabel),
     xlab = "1st Principal Component", ylab = "2nd Principal Component")
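
# As stated in the Note, predict() can embed new data in the discriminant
# space. The lines below are a sketch assuming the usual
# predict(object, newdata) signature returning the projections of newdata.
testprj <- predict(kgda1, test)
plot(testprj, col = as.integer(testlabel),
     xlab = "1st Principal Component", ylab = "2nd Principal Component")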