CV.search.DPDU.regression {changepoints}    R Documentation
Grid search based on cross-validation of dynamic programming for regression change points localisation with l_0 penalisation.
Description
Perform grid search based on cross-validation to select the tuning parameters lambda (for the lasso penalty) and zeta (for the DPDU dynamic programming algorithm) from the supplied candidate grids.
Usage
CV.search.DPDU.regression(y, X, lambda_set, zeta_set, eps = 0.001)
Arguments
y: A numeric vector of response observations.

X: A numeric matrix of covariates.

lambda_set: A numeric vector of candidate tuning parameters for the lasso penalty.

zeta_set: An integer vector of candidate tuning parameters zeta for the DPDU algorithm.

eps: A numeric scalar specifying the precision (convergence) level of the lasso solver; defaults to 0.001.
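A minimal sketch of conforming inputs is shown below; the n-by-p orientation of X (rows indexed by time) and the names y_toy and X_toy are illustrative assumptions rather than part of the package documentation.

n <- 200; p <- 30
y_toy <- rnorm(n)                    # response: one entry per time point
X_toy <- matrix(rnorm(n * p), n, p)  # covariates: assumed n rows (time) by p columns
## a call on these placeholders would look like (not run; pure-noise data):
# CV.search.DPDU.regression(y_toy, X_toy, lambda_set = c(0.1, 1), zeta_set = c(10, 20))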
Value
A list with the following structure:

cpt_hat: A list of vectors of estimated change points, one entry per (zeta, lambda) pair.

K_hat: A list of scalars giving the number of estimated change points, one entry per (zeta, lambda) pair.

test_error: A matrix of test errors, with rows corresponding to the entries of zeta_set and columns corresponding to the entries of lambda_set.

train_error: A matrix of training errors, with the same layout as test_error.

beta_hat: A list of matrices of estimated regression coefficients, one entry per (zeta, lambda) pair.
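The row/column convention above can be illustrated with a toy error matrix; the values below are random placeholders rather than actual cross-validation output.

zeta_grid <- c(10, 15, 20)
lambda_grid <- c(0.01, 0.1, 1, 2)
err <- matrix(runif(length(zeta_grid) * length(lambda_grid)),
              nrow = length(zeta_grid))    # rows index zeta, columns index lambda
idx <- arrayInd(which.min(err), dim(err))  # (row, column) position of the minimum
c(zeta = zeta_grid[idx[1]], lambda = lambda_grid[idx[2]])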
Author(s)
Haotian Xu
References
Xu, Wang, Zhao and Yu (2022) <arXiv:2207.12453>.
Examples
d0 = 5 # number of nonzero regression coefficients
p = 30 # dimension of the covariates
n = 200 # number of observations
cpt_true = 100 # location of the true change point
data = simu.change.regression(d0, cpt_true, p, n, sigma = 1, kappa = 9)
lambda_set = c(0.01, 0.1, 1, 2) # candidate lasso tuning parameters
zeta_set = c(10, 15, 20) # candidate zeta tuning parameters
temp = CV.search.DPDU.regression(y = data$y, X = data$X, lambda_set, zeta_set)
temp$test_error # test error result
# find the indices of lambda_set and zeta_set that minimize the test error
min_idx = as.vector(arrayInd(which.min(temp$test_error), dim(temp$test_error)))
lambda_set[min_idx[2]]
zeta_set[min_idx[1]]
cpt_init = unlist(temp$cpt_hat[min_idx[1], min_idx[2]])
beta_hat = matrix(unlist(temp$beta_hat[min_idx[1], min_idx[2]]), ncol = length(cpt_init)+1)
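# The check below is an illustration added to this help page (base R only,
# not part of the package's own example): it measures the distance from each
# true change point to the nearest estimated one and inspects the layout of
# the fitted coefficient matrix.
if (length(cpt_init) > 0) {
  max(sapply(cpt_true, function(t) min(abs(cpt_init - t)))) # localisation error
} else {
  NA # no change point detected for the selected tuning parameters
}
dim(beta_hat) # one column of estimated coefficients per segment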