alpha_PCA {HDMFA}    R Documentation
Statistical Inference for High-Dimensional Matrix-Variate Factor Model
Description
This function fits the matrix factor model via the \alpha-PCA method by conducting eigen-analysis of a weighted average of the sample mean and the column (row) sample covariance matrix through a hyper-parameter \alpha.
Usage
alpha_PCA(X, m1, m2, alpha = 0)
Arguments
X: Input an array of dimension T \times p_1 \times p_2, whose t-th slice is the observation matrix \bold{X}_t.

m1: A positive integer indicating the number of row factors.

m2: A positive integer indicating the number of column factors.

alpha: A hyper-parameter balancing the information of the first and second moments (\alpha \ge -1). The default is 0.
Details
For matrix factor models, Chen & Fan (2021) propose an estimation procedure, i.e., \alpha-PCA. The method aggregates the information in both first and second moments and extracts it via a spectral method. In detail, for observations \bold{X}_t, t=1,2,\cdots,T, define
\hat{\bold{M}}_R = \frac{1}{p_1 p_2} \left( (1+\alpha) \bar{\bold{X}} \bar{\bold{X}}^\top + \frac{1}{T} \sum_{t=1}^T (\bold{X}_t - \bar{\bold{X}}) (\bold{X}_t - \bar{\bold{X}})^\top \right),
\hat{\bold{M}}_C = \frac{1}{p_1 p_2} \left( (1+\alpha) \bar{\bold{X}}^\top \bar{\bold{X}} + \frac{1}{T} \sum_{t=1}^T (\bold{X}_t - \bar{\bold{X}})^\top (\bold{X}_t - \bar{\bold{X}}) \right),
where \alpha \in [-1,+\infty) and \bar{\bold{X}} = \frac{1}{T} \sum_{t=1}^T \bold{X}_t; the terms \frac{1}{T} \sum_{t=1}^T (\bold{X}_t - \bar{\bold{X}}) (\bold{X}_t - \bar{\bold{X}})^\top and \frac{1}{T} \sum_{t=1}^T (\bold{X}_t - \bar{\bold{X}})^\top (\bold{X}_t - \bar{\bold{X}}) are the sample row and column covariance matrices, respectively. The loading matrices \bold{R} and \bold{C} are estimated as \sqrt{p_1} times the top k_1 eigenvectors of \hat{\bold{M}}_R and \sqrt{p_2} times the top k_2 eigenvectors of \hat{\bold{M}}_C, respectively. For details, see Chen & Fan (2021).
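As an illustration of these formulas (a minimal sketch only, not the internal implementation of alpha_PCA), the estimators can be formed directly in R; the snippet assumes an observation array X of dimension T \times p_1 \times p_2 and chosen factor numbers k1 and k2, e.g. the simulated data from the Examples section below.

alpha <- 0
Xbar <- apply(X, c(2, 3), mean)            # sample mean matrix
M_R <- (1 + alpha) * Xbar %*% t(Xbar)      # weighted first-moment (row) term
M_C <- (1 + alpha) * t(Xbar) %*% Xbar      # weighted first-moment (column) term
for (t in 1:T) {
  D <- X[t, , ] - Xbar
  M_R <- M_R + D %*% t(D) / T              # add row sample covariance
  M_C <- M_C + t(D) %*% D / T              # add column sample covariance
}
M_R <- M_R / (p1 * p2)
M_C <- M_C / (p1 * p2)
Rhat <- sqrt(p1) * eigen(M_R, symmetric = TRUE)$vectors[, 1:k1]   # estimated R
Chat <- sqrt(p2) * eigen(M_C, symmetric = TRUE)$vectors[, 1:k2]   # estimated C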
Value
The return value is a list containing the following components:

F: The estimated factor matrix of dimension T \times m1 \times m2.

R: The estimated row loading matrix of dimension p_1 \times m1.

C: The estimated column loading matrix of dimension p_2 \times m2.
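For example, using the fit object from the Examples section, the dimensions of the returned components can be checked as follows:

dim(fit$F)   # T x m1 x m2
dim(fit$R)   # p1 x m1
dim(fit$C)   # p2 x m2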
Author(s)
Yong He, Changwei Zhao, Ran Zhao.
References
Chen, E. Y., & Fan, J. (2021). Statistical inference for high-dimensional matrix-variate factor models. Journal of the American Statistical Association, 1-18.
Examples
library(HDMFA)

set.seed(11111)
# Dimensions: T observations of p1 x p2 matrices with k1 x k2 factors
T=20;p1=20;p2=20;k1=3;k2=3
# True row and column loading matrices
R=matrix(runif(p1*k1,min=-1,max=1),p1,k1)
C=matrix(runif(p2*k2,min=-1,max=1),p2,k2)
X=array(0,c(T,p1,p2))
Y=X;E=Y
F=array(0,c(T,k1,k2))
# Generate factors, noise, and the common component
for(t in 1:T){
  F[t,,]=matrix(rnorm(k1*k2),k1,k2)
  E[t,,]=matrix(rnorm(p1*p2),p1,p2)
  Y[t,,]=R%*%F[t,,]%*%t(C)
}
# Observations are the common component plus noise
X=Y+E

#Estimate the factor matrices and loadings
fit=alpha_PCA(X, k1, k2, alpha = 0)
Rhat=fit$R
Chat=fit$C
Fhat=fit$F

#Estimate the common component
CC=array(0,c(T,p1,p2))
for (t in 1:T){
  CC[t,,]=Rhat%*%Fhat[t,,]%*%t(Chat)
}
CC