ddml_plm {ddml}    R Documentation
Estimator for the Partially Linear Model.
Description
Estimator for the partially linear model.
Usage
ddml_plm(
  y,
  D,
  X,
  learners,
  learners_DX = learners,
  sample_folds = 2,
  ensemble_type = "nnls",
  shortstack = FALSE,
  cv_folds = 5,
  custom_ensemble_weights = NULL,
  custom_ensemble_weights_DX = custom_ensemble_weights,
  subsamples = NULL,
  cv_subsamples_list = NULL,
  silent = FALSE
)
Arguments
y
The outcome variable.
D
A matrix of endogenous variables.
X
A (sparse) matrix of control variables.
learners
May take one of two forms, depending on whether a single learner or stacking with multiple learners is used for estimation of the conditional expectation functions. If a single learner is used, learners is a list with two named elements: what, the base learner function, and args, an optional list of arguments passed to the base learner. If stacking with multiple learners is used, learners is a list of lists, each containing a named element fun with the base learner function and an optional element args with arguments passed to that learner. Omission of the args element results in the base learner being called with its default arguments. (See the brief illustration after this argument list and the Examples below.)
learners_DX
Optional argument to allow for different estimators of E[D|X]. Setup is identical to learners. Defaults to learners.
sample_folds
Number of cross-fitting folds.
ensemble_type
Ensemble method used to combine the base learners into a final estimate of the conditional expectation functions (e.g., "nnls", the default, for non-negative least squares). Multiple ensemble types may be passed as a vector of strings.
shortstack
Boolean to use short-stacking.
cv_folds
Number of folds used for cross-validation in ensemble construction.
custom_ensemble_weights
A numerical matrix with user-specified ensemble weights. Each column corresponds to a custom ensemble specification; each row corresponds to a base learner in learners (in the same order). Optional column names are used to label the corresponding estimation results.
custom_ensemble_weights_DX
Optional argument to allow for different custom ensemble weights for learners_DX. Defaults to custom_ensemble_weights.
subsamples
List of vectors with sample indices for cross-fitting.
cv_subsamples_list
List of lists, each corresponding to a subsample and containing vectors with subsample indices for cross-validation.
silent
Boolean to silence estimation updates.
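For reference, a brief illustration of the two accepted forms of learners (a sketch reusing the base learners from the Examples below):

# Single learner: a list with element 'what' and, optionally, 'args'.
learners_single <- list(what = mdl_glmnet, args = list(alpha = 0))
# Stacking: a list of lists, each with element 'fun' and, optionally, 'args'.
learners_stack <- list(list(fun = ols),
                       list(fun = mdl_glmnet),
                       list(fun = mdl_glmnet, args = list(alpha = 0)))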
Details
ddml_plm provides a double/debiased machine learning estimator for the parameter of interest \theta_0 in the partially linear model given by

Y = \theta_0 D + g_0(X) + U,

where (Y, D, X, U) is a random vector such that E[Cov(U, D \vert X)] = 0 and E[Var(D \vert X)] \neq 0, and g_0 is an unknown nuisance function.
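To fix ideas, the following sketch shows the residual-on-residual regression underlying the estimator, using a single ridge fit for the nuisance functions. This is an illustration only, not the procedure ddml_plm implements: ddml_plm additionally cross-fits the conditional expectation estimates and optionally stacks multiple learners. It assumes y, D, X as constructed in the Examples below and that glmnet is installed.

# Naive (non-cross-fitted) orthogonalization: residualize Y and D on X,
# then regress the Y-residuals on the D-residuals to estimate theta_0.
library(glmnet)
yhat <- predict(cv.glmnet(X, y, alpha = 0), newx = X, s = "lambda.min")
Dhat <- predict(cv.glmnet(X, D, alpha = 0), newx = X, s = "lambda.min")
res_y <- as.numeric(y - yhat)
res_D <- as.numeric(D - Dhat)
theta_naive <- coef(lm(res_y ~ res_D))["res_D"]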
Value
ddml_plm returns an object of S3 class ddml_plm. An object of class ddml_plm is a list containing the following components:

coef
A vector with the \theta_0 estimates.
weights
A list of matrices, providing the weight assigned to each base learner (in chronological order) by the ensemble procedure.
mspe
A list of matrices, providing the MSPE of each base learner (in chronological order) computed by the cross-validation step in the ensemble construction.
ols_fit
Object of class lm from the second-stage regression of Y - \hat{E}[Y|X] on D - \hat{E}[D|X].
learners, learners_DX, subsamples, cv_subsamples_list, ensemble_type
Pass-through of selected user-provided arguments. See above.
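For instance, the components of a fitted object can be accessed directly (illustrative; plm_fit as returned in the Examples below):

plm_fit$coef      # estimate(s) of theta_0
plm_fit$weights   # ensemble weights assigned to each base learner
plm_fit$mspe      # cross-validated MSPE of each base learner
plm_fit$ols_fit   # second-stage lm fit of the residualized Y on the residualized D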
References
Ahrens A, Hansen C B, Schaffer M E, Wiemann T (2023). "ddml: Double/debiased machine learning in Stata." https://arxiv.org/abs/2301.09397
Chernozhukov V, Chetverikov D, Demirer M, Duflo E, Hansen C B, Newey W, Robins J (2018). "Double/debiased machine learning for treatment and structural parameters." The Econometrics Journal, 21(1), C1-C68.
Wolpert D H (1992). "Stacked generalization." Neural Networks, 5(2), 241-259.
See Also
Other ddml: ddml_ate(), ddml_fpliv(), ddml_late(), ddml_pliv()
Examples
# Construct variables from the included Angrist & Evans (1998) data
y = AE98[, "worked"]
D = AE98[, "morekids"]
X = AE98[, c("age","agefst","black","hisp","othrace","educ")]
# Estimate the partially linear model using a single base learner, ridge.
plm_fit <- ddml_plm(y, D, X,
                    learners = list(what = mdl_glmnet,
                                    args = list(alpha = 0)),
                    sample_folds = 2,
                    silent = TRUE)
summary(plm_fit)
# Estimate the partially linear model using short-stacking with base learners
# ols, lasso, and ridge. We can also use custom_ensemble_weights
# to obtain estimates from each individual base learner.
weights_everylearner <- diag(1, 3)
colnames(weights_everylearner) <- c("mdl:ols", "mdl:lasso", "mdl:ridge")
plm_fit <- ddml_plm(y, D, X,
                    learners = list(list(fun = ols),
                                    list(fun = mdl_glmnet),
                                    list(fun = mdl_glmnet,
                                         args = list(alpha = 0))),
                    ensemble_type = 'nnls',
                    custom_ensemble_weights = weights_everylearner,
                    shortstack = TRUE,
                    sample_folds = 2,
                    silent = TRUE)
summary(plm_fit)