MHFA {HDMFA}    R Documentation
Matrix Huber Factor Analysis
Description
This function fits matrix factor models via the Huber loss. We propose two algorithms for robust factor analysis. One is based on minimizing the Huber loss of the Frobenius norm of the idiosyncratic error, which leads to a weighted iterative projection approach to compute the estimators and is therefore named Robust Matrix Factor Analysis (RMFA). The other is based on minimizing the element-wise Huber loss, which is solved by an iterative Huber regression (IHR) algorithm.
Usage
MHFA(X, W1=NULL, W2=NULL, m1, m2, method, max_iter = 100, ep = 1e-04)
Arguments
X: Input an array of dimension T × p1 × p2, where T is the sample size, p1 is the row dimension and p2 is the column dimension of each matrix observation.

W1: Only if method = "E", the initial value of the row loading matrix. The default is NULL.

W2: Only if method = "E", the initial value of the column loading matrix. The default is NULL.

m1: A positive integer indicating the number of row factors.

m2: A positive integer indicating the number of column factors.

method: Character string, specifying the estimation method to be used. "P": the robust iterative projection approach (RMFA), minimizing the Huber loss of the Frobenius norm of the idiosyncratic error. "E": minimizing the element-wise Huber loss, solved by the iterative Huber regression (IHR) algorithm.

max_iter: Only if method = "E", the maximum number of iterations. The default is 100.

ep: Only if method = "E", the stopping criterion parameter. The default is 1e-04.
Details
For the matrix factor model X_t = R F_t C' + E_t, t = 1, ..., T, He et al. (2023) propose a weighted iterative projection approach to compute the estimators by minimizing the Huber loss of the Frobenius norm of the idiosyncratic error. In each iteration, the Huber loss assigns a weight to every observation, and the estimators of the loading matrices \hat{R} and \hat{C} are calculated as \sqrt{p_1} times the leading m_1 eigenvectors of the weighted, column-projected sample covariance matrix and \sqrt{p_2} times the leading m_2 eigenvectors of the weighted, row-projected sample covariance matrix, respectively. The factor matrices are then estimated by \hat{F}_t = \hat{R}' X_t \hat{C} / (p_1 p_2). For details, see He et al. (2023).
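As a rough illustration, the following sketch performs one weighted projection update of the row loadings given a current column loading estimate Chat. The weight rule min(1, tau/r_t) and the tuning constant tau are assumptions made for this sketch only; the exact weights and scaling used by MHFA follow He et al. (2023).

# One illustrative weighted projection update of the row loadings.
# Chat is a current p2 x m2 column loading estimate with t(Chat) %*% Chat = p2 * I.
weighted_projection_update <- function(X, Chat, m1, tau = 1.345) {
  T <- dim(X)[1]; p1 <- dim(X)[2]; p2 <- dim(X)[3]
  M <- matrix(0, p1, p1)
  for (t in 1:T) {
    Xt <- X[t, , ]
    proj <- Xt %*% Chat / p2                  # project onto the span of Chat
    r_t <- norm(Xt - proj %*% t(Chat), "F")   # Frobenius norm of the residual
    w_t <- min(1, tau / r_t)                  # Huber-type weight (assumed form)
    M <- M + w_t * (proj %*% t(proj))         # weighted, column-projected covariance
  }
  # sqrt(p1) times the leading m1 eigenvectors gives the updated row loadings
  sqrt(p1) * eigen(M / T, symmetric = TRUE)$vectors[, 1:m1, drop = FALSE]
}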
The other algorithm is based on minimizing the element-wise Huber loss over the row loadings, the column loadings, and the factor matrices. Optimizing one of the three at a time, while keeping the other two fixed, reduces each update to a Huber regression, which yields the iterative Huber regression (IHR) algorithm. For details, see He et al. (2023).
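To see how a single IHR-type update reduces to a Huber regression, the sketch below refits the i-th row of the row loading matrix with the column loadings Chat and the factor array Fhat held fixed, using MASS::rlm with the Huber psi function. This is an illustrative reimplementation of one step, not the internal code of MHFA.

library(MASS)  # rlm() performs Huber regression

# Row i of the model is X_t[i, ] = R[i, ] %*% F_t %*% t(C) + noise, so stacking
# the T equations gives a regression with design matrix Chat %*% t(Fhat_t).
update_row_loading <- function(X, Chat, Fhat, i) {
  T <- dim(X)[1]
  y <- as.vector(t(X[, i, ]))                                            # stacked responses
  Z <- do.call(rbind, lapply(1:T, function(t) Chat %*% t(Fhat[t, , ])))  # stacked design
  coef(rlm(y ~ Z - 1, psi = psi.huber))                                  # Huber regression fit
}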
Value
The return value is a list containing the following components:

F: The estimated factor matrices, an array of dimension T × m1 × m2.

R: The estimated row loading matrix of dimension p1 × m1.

C: The estimated column loading matrix of dimension p2 × m2.
Author(s)
Yong He, Changwei Zhao, Ran Zhao.
References
He, Y., Kong, X., Yu, L., Zhang, X., & Zhao, C. (2023). Matrix factor analysis: From least squares to iterative projection. Journal of Business & Economic Statistics, 1-26.
He, Y., Kong, X. B., Liu, D., & Zhao, R. (2023). Robust statistical inference for large-dimensional matrix-valued time series via iterative Huber regression. arXiv:2306.03317.
Examples
library(HDMFA)

set.seed(11111)
# Generate data from the matrix factor model X_t = R F_t C' + E_t
T=20;p1=20;p2=20;k1=3;k2=3
R=matrix(runif(p1*k1,min=-1,max=1),p1,k1)
C=matrix(runif(p2*k2,min=-1,max=1),p2,k2)
X=array(0,c(T,p1,p2))
Y=X;E=Y
F=array(0,c(T,k1,k2))
for(t in 1:T){
F[t,,]=matrix(rnorm(k1*k2),k1,k2)
E[t,,]=matrix(rnorm(p1*p2),p1,p2)
Y[t,,]=R%*%F[t,,]%*%t(C)
}
X=Y+E
#Estimate the factor matrices and loadings by RMFA
fit1=MHFA(X, m1=3, m2=3, method="P")
Rhat1=fit1$R
Chat1=fit1$C
Fhat1=fit1$F
#Estimate the factor matrices and loadings by IHR
fit2=MHFA(X, W1=NULL, W2=NULL, m1=3, m2=3, method="E")
Rhat2=fit2$R
Chat2=fit2$C
Fhat2=fit2$F
#Estimate the common component by RMFA
CC1=array(0,c(T,p1,p2))
for (t in 1:T){
CC1[t,,]=Rhat1%*%Fhat1[t,,]%*%t(Chat1)
}
CC1
#Estimate the common component by IHR
CC2=array(0,c(T,p1,p2))
for (t in 1:T){
CC2[t,,]=Rhat2%*%Fhat2[t,,]%*%t(Chat2)
}
CC2
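As a quick sanity check beyond the original examples, the relative Frobenius-norm errors of the two common component estimates with respect to the true common component Y generated above can be computed as follows.

#Relative errors of the RMFA and IHR common component estimates
sqrt(sum((CC1-Y)^2)/sum(Y^2))
sqrt(sum((CC2-Y)^2)/sum(Y^2))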