fill.HardImpute {filling}    R Documentation
HardImpute : Generalized Spectral Regularization
Description
If the assumed underlying model has sufficiently many zeros, the LASSO-type
shrinkage estimator is known to overestimate the number of non-zero coefficients.
fill.HardImpute aims to overcome this difficulty by combining a low-rank assumption
with the hard-thresholding idea, a well-known concept in conventional regression analysis.
Algorithmically, it takes the output of SoftImpute as warm-start matrices
for its iterative estimation process.
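The core update can be pictured as repeatedly truncating the SVD of the current completion to the assumed rank and then restoring the observed entries. The following is a minimal sketch of that idea only, assuming missing entries are flagged as NA; hard_impute_sketch and its variable names are illustrative and not part of the filling package, and the sketch omits the SoftImpute warm start that fill.HardImpute actually uses.

hard_impute_sketch <- function(A, rk = 2, maxiter = 100, tol = 1e-3) {
  obs  <- !is.na(A)                      # mask of observed entries
  Xhat <- A
  Xhat[!obs] <- mean(A[obs])             # crude initial fill instead of a SoftImpute warm start
  for (it in seq_len(maxiter)) {
    sv   <- svd(Xhat, nu = rk, nv = rk)  # truncated SVD of the current estimate
    Xnew <- sv$u %*% diag(sv$d[1:rk], rk, rk) %*% t(sv$v)  # keep only the top-rk singular values
    Xnew[obs] <- A[obs]                  # put the observed entries back
    if (norm(Xnew - Xhat, "F") / max(1, norm(Xhat, "F")) < tol) { Xhat <- Xnew; break }
    Xhat <- Xnew
  }
  Xhat
}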
Usage
fill.HardImpute(
  A,
  lambdas = c(10, 1, 0.1),
  maxiter = 100,
  tol = 0.001,
  rk = (min(dim(A)) - 1)
)
Arguments
| A | an (n\times p) partially observed matrix; missing entries should be flagged as NA. | 
| lambdas | a length-t vector of regularization parameters. | 
| maxiter | maximum number of iterations to be performed. | 
| tol | stopping criterion for incremental progress. | 
| rk | assumed rank of the matrix. | 
Value
a named list containing
- X : an (n\times p\times t) cubic array after completion at each lambda value.
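As a small hedged illustration of the input and output shapes described above (assuming the filling package is loaded; A_small, lams, and out are illustrative names, not package objects):

library(filling)
set.seed(1)
A_small <- tcrossprod(matrix(rnorm(20*3), 20, 3), matrix(rnorm(15*3), 15, 3))  # a 20 x 15 rank-3 matrix
A_small[sample(length(A_small), 30)] <- NA       # flag about 10% of entries as missing (NA)
lams <- c(10, 1, 0.1)                            # a length-3 vector of regularization parameters
out  <- fill.HardImpute(A_small, lambdas = lams, rk = 3)
dim(out$X)                                       # 20 x 15 x 3 : one completed slice per lambda value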
References
Mazumder R, Hastie T, Tibshirani R (2010). “Spectral Regularization Algorithms for Learning Large Incomplete Matrices.” J. Mach. Learn. Res., 11, 2287–2322. ISSN 1532-4435.
See Also
Examples
## load image data of 'lena128'
data(lena128)
## transform 5% of entries into missing
A <- aux.rndmissing(lena128, x=0.05)
## apply the method with 3 rank conditions
fill1 <- fill.HardImpute(A, lambdas=c(500,100,50), rk=10)
fill2 <- fill.HardImpute(A, lambdas=c(500,100,50), rk=50)
fill3 <- fill.HardImpute(A, lambdas=c(500,100,50), rk=100)
## visualize only the last ones from each run
opar <- par(no.readonly=TRUE)
par(mfrow=c(2,2), pty="s")
image(A, col=gray((0:100)/100), axes=FALSE, main="5% missing")
image(fill1$X[,,3], col=gray((0:100)/100), axes=FALSE, main="Rank 10")
image(fill2$X[,,3], col=gray((0:100)/100), axes=FALSE, main="Rank 50")
image(fill3$X[,,3], col=gray((0:100)/100), axes=FALSE, main="Rank 100")
par(opar)