tucker {rTensor}    R Documentation
Tucker Decomposition
Description
The Tucker decomposition of a tensor. Approximates a K-Tensor using an n-mode product of a core tensor (with modes specified by ranks) with orthogonal factor matrices. If there is no truncation in one of the modes, then this is the same as the MPCA, mpca. If there is no truncation in any of the modes (i.e. ranks = tnsr@modes), then this is the same as the HOSVD, hosvd. This is an iterative algorithm, with two possible stopping conditions: either the relative error in Frobenius norm falls below tol, or the maximum number of iterations max_iter is reached. For more details on the Tucker decomposition, consult Kolda and Bader (2009).
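In the notation of Kolda and Bader (2009), the rank-(r_1, ..., r_K) Tucker model approximates a K-mode tensor by a core tensor multiplied along each mode by an orthogonal factor matrix. The display below is a sketch of that model; the symbols Z and U^(k) are chosen here to match the components returned by tucker, and the dimensions n_k are those of tnsr.

\mathcal{X} \approx \mathcal{Z} \times_1 U^{(1)} \times_2 U^{(2)} \cdots \times_K U^{(K)},
\qquad \mathcal{Z} \in \mathbb{R}^{r_1 \times \cdots \times r_K},\quad
U^{(k)} \in \mathbb{R}^{n_k \times r_k},\quad U^{(k)\top} U^{(k)} = I.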
Usage
tucker(tnsr, ranks = NULL, max_iter = 25, tol = 1e-05)
Arguments
tnsr
    Tensor with K modes
ranks
    a vector of the modes (dimensions) of the output core Tensor, one entry per mode
max_iter
    maximum number of iterations if the error stays above tol
tol
    relative Frobenius norm error tolerance
Details
Uses the Alternating Least Squares (ALS) estimation procedure, also known as Higher-Order Orthogonal Iteration (HOOI). Initialized using a (truncated) HOSVD. A progress bar is included to help monitor operations on large tensors.
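The following is a minimal sketch of a HOOI iteration written with rTensor primitives (k_unfold, ttl, fnorm). It is illustrative only, not the package's internal implementation; the function name hooi_sketch and its stopping rule (change in relative residual below tol) are assumptions made here.

library(rTensor)

# Illustrative HOOI sketch (not the internal code of tucker()).
hooi_sketch <- function(tnsr, ranks, max_iter = 25, tol = 1e-5) {
  K <- tnsr@num_modes
  # Initialize factors via a truncated HOSVD: leading left singular vectors of each unfolding
  U <- lapply(1:K, function(m) svd(k_unfold(tnsr, m)@data, nu = ranks[m])$u)
  fnorm_tnsr <- fnorm(tnsr)
  all_resids <- numeric(0)
  for (iter in 1:max_iter) {
    for (m in 1:K) {
      # Project onto all factors except mode m, then refresh U[[m]]
      W <- ttl(tnsr, lapply(U[-m], t), ms = (1:K)[-m])
      U[[m]] <- svd(k_unfold(W, m)@data, nu = ranks[m])$u
    }
    Z <- ttl(tnsr, lapply(U, t), ms = 1:K)   # core tensor
    est <- ttl(Z, U, ms = 1:K)               # current approximation
    all_resids <- c(all_resids, fnorm(est - tnsr) / fnorm_tnsr)
    # Stop once the relative residual changes by less than tol
    if (iter > 1 && abs(diff(tail(all_resids, 2))) < tol) break
  }
  list(Z = Z, U = U, est = est, all_resids = all_resids)
}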
Value
a list containing the following:
Z
    the core tensor, with modes specified by ranks
U
    a list of orthogonal factor matrices - one for each mode, with the number of columns of the matrices given by ranks
conv
    whether or not resid < tol by the last iteration
est
    estimate of tnsr after compression
norm_percent
    the percent of Frobenius norm explained by the approximation (see the illustration after this list)
fnorm_resid
    the Frobenius norm of the error fnorm(est - tnsr)
all_resids
    vector containing the Frobenius norm of error for all the iterations
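As a quick illustration of how these components fit together, the estimate can be rebuilt from the core and factors with ttl, and the residual norm relates to norm_percent; the relation 100*(1 - fnorm_resid/fnorm(tnsr)) is an assumption made here about how the percentage is scaled, not a statement of the package's exact formula.

library(rTensor)
tnsr <- rand_tensor(c(4, 4, 4, 4))
tk <- tucker(tnsr, ranks = c(2, 2, 2, 2))
rebuilt <- ttl(tk$Z, tk$U, ms = 1:tnsr@num_modes)  # Z x_1 U[[1]] ... x_K U[[K]]
fnorm(rebuilt - tk$est)                            # ~0: est is the compressed reconstruction
100 * (1 - tk$fnorm_resid / fnorm(tnsr))           # should agree with tk$norm_percent (assumed scaling)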
Note
The length of ranks must match tnsr@num_modes.
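For example, a minimal check written for this note (not part of the package's API):

library(rTensor)
tnsr <- rand_tensor(c(4, 4, 4, 4))
length(c(2, 2, 2, 2)) == tnsr@num_modes  # TRUE: one target rank per mode, so tucker() accepts it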
References
T. Kolda, B. Bader, "Tensor Decompositions and Applications", SIAM Review, 51(3), 2009.
See Also
hosvd, mpca
Examples
tnsr <- rand_tensor(c(4, 4, 4, 4))            # random 4 x 4 x 4 x 4 tensor
tuckerD <- tucker(tnsr, ranks = c(2, 2, 2, 2)) # rank-(2,2,2,2) Tucker decomposition
tuckerD$conv                                   # did the relative residual fall below tol?
tuckerD$norm_percent                           # percent of Frobenius norm explained
plot(tuckerD$all_resids)                       # residuals across the iterations