kpcca {evclust}    R Documentation

Kernel Pairwise Constrained Component Analysis (KPCCA)

Description

Using must-link and cannot-link constraints, KPCCA (Mignon & Jurie, 2012) learns a projection into a low-dimensional space in which the distances between pairs of data points respect the desired constraints. The method exhibits good generalization properties in the presence of high-dimensional data. This is a kernelized version of pcca.
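
The LaTeX sketch below (not part of the original documentation) recalls the PCCA objective from the paper cited in the References, for orientation only. The symbols follow the paper: A is the d1 x n projection matrix, beta the sharpness parameter, and, in this kernelized setting, k_i is taken to be the i-th column of the Gram matrix K (an assumption based on the description above; the implementation may include details, such as centering, that are not shown).

E(A) = \sum_{n} \ell_{\beta}\!\left( y_n \left( \lVert A (k_{i_n} - k_{j_n}) \rVert^{2} - 1 \right) \right),
\qquad
\ell_{\beta}(x) = \frac{1}{\beta} \log\!\left( 1 + e^{\beta x} \right),

with y_n = +1 for a must-link pair (i_n, j_n) and y_n = -1 for a cannot-link pair.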

Usage

kpcca(K, d1, ML, CL, beta = 1, epsi = 1e-04, etamax = 0.1, disp = TRUE)

Arguments

K

Gram matrix of size n*n

d1

Number of extracted features.

ML

Matrix nbML x 2 of must-link constraints. Each row of ML contains the indices of two objects known to belong to the same class (a hand-built sketch is shown after this argument list).

CL

Matrix nbCL x 2 of cannot-link constraints. Each row of CL contains the indices of objects that belong to different classes.

beta

Sharpness parameter in the loss function (default: 1).

epsi

Minimal rate of change of the cost function (default: 1e-4).

etamax

Maximum step in the line search algorithm (default: 0.1).

disp

If TRUE (default), intermediate results are displayed.
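
For illustration only (not part of the original argument descriptions), ML and CL can be built by hand as two-column index matrices; the indices below refer to a hypothetical data set and are arbitrary. In practice the constraints are usually generated from known labels with create_MLCL, as in the Examples section.

## Hand-built constraint matrices for a hypothetical data set: objects 1, 2
## and 3 are assumed to share a class; objects 4 and 5 to belong elsewhere.
ML <- rbind(c(1, 2),   # objects 1 and 2 must be in the same class
            c(2, 3))   # objects 2 and 3 must be in the same class
CL <- rbind(c(1, 4),   # objects 1 and 4 must be in different classes
            c(3, 5))   # objects 3 and 5 must be in different classes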

Value

A list with three components:

z

The n*d1 matrix of extracted features.

A

The projection matrix of size d1*n.

D

The Euclidean distance matrix in the projected space.
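
As an illustration only (not part of the original documentation), the not-run sketch below shows how the returned components might relate to one another, assuming that D holds the pairwise Euclidean distances between the rows of z and that the features are obtained by applying A to the kernel columns; res.kpcca refers to the fitted object from the Examples section below, and Knew is a hypothetical cross Gram matrix.

## Not run:
## Consistency check (assumption): D holds the pairwise Euclidean distances
## between the rows of z, so this difference should be negligible.
max(abs(res.kpcca$D - as.matrix(dist(res.kpcca$z))))
## Assumption: new points with cross Gram matrix Knew (m x n, new points vs.
## training points) would be projected as:
znew <- Knew %*% t(res.kpcca$A)
## End(Not run)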

Author(s)

Thierry Denoeux.

References

A. Mignon and F. Jurie. PCCA: a new approach for distance learning from sparse pairwise constraints. In 2012 IEEE Conference on Computer Vision and Pattern Recognition, pages 2666-2672, 2012.

See Also

pcca, create_MLCL

Examples

## Not run: 
library(kernlab)
## Generate the 'bananas' toy data set and plot it
data <- bananas(400)
plot(data$x, pch = data$y, col = data$y)
## Randomly generate must-link and cannot-link constraints from the labels
const <- create_MLCL(data$y, 1000)
## Compute the Gram matrix with an RBF kernel
rbf <- rbfdot(sigma = 0.2)
K <- kernelMatrix(rbf, data$x)
## Extract one feature with KPCCA and plot it, colored by the true labels
res.kpcca <- kpcca(K, d1 = 1, ML = const$ML, CL = const$CL, beta = 1)
plot(res.kpcca$z, col = data$y)

## End(Not run)
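
As a hedged follow-up (not part of the original example), the extracted feature could be fed to any standard clustering method; the sketch below continues from the objects created above and uses base R kmeans with two clusters purely as an illustration.

## Not run:
## Continues from the example above: cluster the extracted feature and
## cross-tabulate the result with the true labels.
cl <- kmeans(res.kpcca$z, centers = 2, nstart = 20)
table(cl$cluster, data$y)
## End(Not run)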


[Package evclust version 2.0.3 Index]