pc.choose {choosepc} R Documentation

Choose the number of principal components via reconstruction error

Description

Choose the number of principal components via reconstruction error.

Usage

pc.choose(x, graph = TRUE)

Arguments

x

A numerical matrix with more rows than columns.

graph

Should the plot of the PRESS values appear? Default value is TRUE.

Details

SVD stands for Singular Value Decomposition. It applies to any rectangular matrix, not only a square one, in contrast to the spectral decomposition into eigenvalues and eigenvectors used by principal component analysis (PCA). Suppose we have an n \times p matrix \bf X. Using SVD we can write the matrix as

{\bf X}={\bf UDV}^{T},

where \bf U is an orthonormal matrix containing the eigenvectors of {\bf XX}^T, \bf V is an orthonormal matrix containing the eigenvectors of {\bf X}^T{\bf X}, and \bf D is a p \times p diagonal matrix containing the r non-zero singular values d_1,\ldots,d_r (the square roots of the eigenvalues) of {\bf XX}^T (or {\bf X}^T{\bf X}), the remaining p-r diagonal elements being zero. We remind the reader that the maximum rank of an n \times p matrix is \min\{n,p\}. Using the SVD equation above, each column of \bf X can be written as

{\bf x}_j=\sum_{k=1}^r{\bf u}_kd_k{\bf v}_{jk}.

This means that we can reconstruct the matrix \bf X using fewer columns (if n>p) than it has:

\tilde{{\bf x}}^{m}_j=\sum_{k=1}^m{\bf u}_kd_k{\bf v}_{jk},

where m<r.
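As a minimal illustration, not part of the package itself, the rank-m reconstruction can be computed directly from the output of base R's svd():

x <- as.matrix(iris[, 1:4])
s <- svd(x)
m <- 2  ## number of components to keep, with m < r
xm <- s$u[, 1:m, drop = FALSE] %*% diag(s$d[1:m], m, m) %*% t(s$v[, 1:m, drop = FALSE])
sum( (x - xm)^2 )  ## the discrepancy shrinks as m grows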

The reconstructed matrix will of course show some discrepancy, and it is the level of this discrepancy we are interested in. If we center the matrix \bf X, that is, subtract from each column its mean, and perform the SVD again, we will see that the orthonormal matrix \bf V contains the eigenvectors of the covariance matrix of the original, un-centred, matrix \bf X.
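This claim is easy to verify in base R; the snippet below is only an illustration, and the singular vectors agree with the eigenvectors up to an arbitrary sign flip:

x <- as.matrix(iris[, 1:4])
y <- scale(x, center = TRUE, scale = FALSE)  ## subtract the column means
v <- svd(y)$v                                ## right singular vectors
e <- eigen(cov(x))$vectors                   ## eigenvectors of the covariance matrix
max( abs( abs(v) - abs(e) ) )                ## practically zero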

Coming back to a matrix of n observations and p variables, the question is how many principal components to retain. We answer this by using the SVD to reconstruct the matrix. The steps of the algorithm are described below; a code sketch follows the list.

1. Center the matrix by subtracting from each variable its mean: {\bf Y}={\bf X}-{\bf m}.

2. Perform SVD on the centred matrix \bf Y.

3. Choose a number m from 1 to r (the rank of the matrix) and reconstruct the matrix from the first m components. Denote the reconstructed matrix by \widetilde{{\bf Y}}^{m}.

4. Calculate the sum of squared differences between the reconstructed and the original values:

PRESS\left(m\right)=\sum_{i=1}^n\sum_{j=1}^p\left(\tilde{y}^{m}_{ij}-y_{ij}\right)^2, \quad m=1,\ldots,r.

5. Plot PRESS\left(m\right) for all the values of m and choose graphically the number of principal components.
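A bare-bones sketch of these five steps in base R might look as follows; the package's own implementation may differ in its details.

x <- as.matrix(iris[, 1:4])
y <- scale(x, center = TRUE, scale = FALSE)   ## step 1: centre the matrix
s <- svd(y)                                   ## step 2: SVD of the centred matrix
r <- sum( s$d > sqrt(.Machine$double.eps) )   ## rank of the matrix
press <- numeric(r)
for ( m in 1:r ) {                            ## steps 3 and 4
  ym <- s$u[, 1:m, drop = FALSE] %*% diag(s$d[1:m], m, m) %*% t(s$v[, 1:m, drop = FALSE])
  press[m] <- sum( (ym - y)^2 )
}
plot(1:r, press, type = "b", xlab = "m", ylab = "PRESS(m)")  ## step 5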

The graphical way of choosing the number of principal components is not the best, and there are alternative ways of making the decision (see, for example, Jolliffe (2002)).

Value

A list including:

values

The eigenvalues of the covariance matrix.

cumprop

The cumulative proportion of the eigenvalues of the covariance matrix.

per

The differences in the cumulative proportion of the eigenvalues of the covariance matrix.

press

The reconstruction error \sqrt{\sum_{ij}{(x_{ij}-\hat{x}_{ij})^2}} for each number of eigenvectors.

runtime

The runtime of the algorithm.

Author(s)

Michail Tsagris.

R implementation and documentation: Michail Tsagris mtsagris@uoc.gr.

References

Jolliffe I.T. (2002). Principal Component Analysis, 2nd edition. Springer, New York.

See Also

eigci

Examples

x <- as.matrix(iris[, 1:4])
a <- pc.choose(x, graph = FALSE)
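## Inspect the components of the returned list (documented under Value)
a$press    ## reconstruction error for each number of eigenvectors
a$cumprop  ## cumulative proportion of the eigenvalues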
