semiorthogonalize {WALS}  R Documentation

Internal function: Semiorthogonal-type transformation of X2 to Z2

Description

Uses the matrix Z2s (called \bar{\Xi} in eq. (9) of De Luca et al. (2018)) to transform \bar{X}_2 to \bar{Z}_2, i.e. to perform \bar{Z}_2 = \bar{X}_2 \bar{\Delta}_2 \bar{\Xi}^{-1/2}. For WALS in the linear regression model, the variables do not have a "bar".

Usage

semiorthogonalize(Z2s, X2, Delta2, SVD = TRUE, postmult = FALSE)

Arguments

Z2s

Matrix whose negative square root Z2s^{-1/2} is used in the transformation X2 * Delta2 * Z2s^{-1/2}.

X2

Design matrix of auxiliary regressors to be transformed to Z2

Delta2

Scaling matrix chosen such that the diagonal of \bar{\Delta}_2 \bar{X}_2^{\top} \bar{M}_1 \bar{X}_2 \bar{\Delta}_2 is one (scaling by n is ignored because it is not needed in the code). See De Luca et al. (2018).

SVD

If TRUE, uses svd to compute the eigendecomposition of Z2s, otherwise uses eigen.

postmult

If TRUE, then it uses Z2s^{-1/2} = T \Lambda^{-1/2} T^{\top}, where T contains the eigenvectors of Z2s in its columns and \Lambda the corresponding eigenvalues on its diagonal. If FALSE, it uses Z2s^{-1/2} = T \Lambda^{-1/2}.
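The computation can be sketched as follows; this is a minimal illustration under the definitions above, not the package's actual implementation (the function name semiortho_sketch and its body are hypothetical):

```r
## Hypothetical sketch of the transformation Z2 = X2 %*% Delta2 %*% Z2s^{-1/2},
## with Z2s^{-1/2} obtained from an eigendecomposition via svd() or eigen().
semiortho_sketch <- function(Z2s, X2, Delta2, SVD = TRUE, postmult = FALSE) {
  if (SVD) {
    dec <- svd(Z2s)                   # for symmetric PSD Z2s, u holds eigenvectors
    Tm <- dec$u; lam <- dec$d
  } else {
    dec <- eigen(Z2s, symmetric = TRUE)
    Tm <- dec$vectors; lam <- dec$values
  }
  halfInv <- Tm %*% diag(1 / sqrt(lam), nrow = length(lam))
  if (postmult) halfInv <- halfInv %*% t(Tm)  # Z2s^{-1/2} = T Lambda^{-1/2} T'
  X2 %*% Delta2 %*% halfInv
}
```

Either choice of postmult makes \bar{M}_1 \bar{Z}_2 semiorthogonal in the linear model; they differ only by an orthogonal rotation of the columns of Z2.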

On the "semiorthogonal-type" transformation

For WALS GLM (and WALS in the linear regression model), the transformation is semiorthogonal (scaling by n is ignored for clarity and because it is not needed in the code) in the sense that \bar{M}_{1} \bar{Z}_{2} is a semiorthogonal matrix, since

\bar{Z}_{2}^{\top} \bar{M}_1 \bar{Z}_{2} = (\bar{Z}_{2}^{\top} \bar{M}_1) (\bar{M}_{1} \bar{Z}_{2}) = I_{k_2},

where \bar{M}_1 is an idempotent matrix.
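In the linear model, M_1 is the annihilator of the focus regressors X_1, which is idempotent; this is what lets the product above collapse to the identity. A small numerical check (the setup is illustrative, not taken from the package):

```r
# M1 = I - X1 (X1'X1)^{-1} X1' is the annihilator of X1 and satisfies
# M1 %*% M1 == M1, so (M1 Z2)'(M1 Z2) = Z2' M1 M1 Z2 = Z2' M1 Z2.
set.seed(42)
n <- 25
X1 <- cbind(1, rnorm(n))
M1 <- diag(n) - X1 %*% solve(crossprod(X1), t(X1))
idempotency_error <- max(abs(M1 %*% M1 - M1))
```
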

For WALS in the NB2 regression model, \bar{M}_{1} \bar{Z}_{2} is no longer semiorthogonal due to a rank-1 perturbation in \bar{M}_1, which causes \bar{M}_1 to no longer be idempotent; see the section "Transformed model" in Huynh (2024a).

On the use of postmult = TRUE

The transformation of the auxiliary regressors Z_2 for linear WALS in eq. (12) of Magnus and De Luca (2016) differs from the transformation for WALS GLM (and WALS NB) in eq. (9) of De Luca et al. (2018):

In Magnus and De Luca (2016) the transformed auxiliary regressors are

Z_{2} = X_2 \Delta_2 T \Lambda^{-1/2},

where T contains the eigenvectors of \Xi = \Delta_2 X_{2}^{\top} M_{1} X_{2} \Delta_2 in its columns and \Lambda the corresponding eigenvalues on its diagonal. This definition is used when postmult = FALSE.

In contrast, De Luca et al. (2018) defines

Z_2 = X_2 \Delta_2 T \Lambda^{-1/2} T^{\top},

where we ignore the scaling by n and omit the "bar" notation for easier comparison. This definition is used when postmult = TRUE and is strongly preferred for walsGLM and walsNB.
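The two definitions differ only by right-multiplication with the orthogonal matrix T^{\top}, so both produce a semiorthogonal M_1 Z_2 in the linear model. A hedged illustration (the setup below is hypothetical, constructed only to compare the two formulas):

```r
# Compare eq. (12) of Magnus & De Luca (2016) with eq. (9) of
# De Luca et al. (2018): the latter equals the former times t(Tm).
set.seed(7)
n <- 30
X1 <- cbind(1, rnorm(n)); X2 <- matrix(rnorm(n * 3), n)
M1 <- diag(n) - X1 %*% solve(crossprod(X1), t(X1))
Delta2 <- diag(1 / sqrt(diag(t(X2) %*% M1 %*% X2)))
Xi <- Delta2 %*% t(X2) %*% M1 %*% X2 %*% Delta2
e <- eigen(Xi, symmetric = TRUE)
Tm <- e$vectors
Z2_mdl <- X2 %*% Delta2 %*% Tm %*% diag(1 / sqrt(e$values))           # postmult = FALSE
Z2_dl  <- X2 %*% Delta2 %*% Tm %*% diag(1 / sqrt(e$values)) %*% t(Tm) # postmult = TRUE
```
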

See Huynh (2024b) for more details.

References

De Luca G, Magnus JR, Peracchi F (2018). “Weighted-average least squares estimation of generalized linear models.” Journal of Econometrics, 204(1), 1–17. doi:10.1016/j.jeconom.2017.12.007.

Huynh K (2024a). “Weighted-Average Least Squares for Negative Binomial Regression.” arXiv 2404.11324, arXiv.org E-Print Archive. doi:10.48550/arXiv.2404.11324.

Huynh K (2024b). “WALS: Weighted-Average Least Squares Model Averaging in R.” University of Basel. Mimeo.

Magnus JR, De Luca G (2016). “Weighted-average least squares (WALS): A survey.” Journal of Economic Surveys, 30(1), 117–148. doi:10.1111/joes.12094.


[Package WALS version 0.2.5 Index]