fit_sgs_cv {sgs}    R Documentation
Fit an SGS model using k-fold cross-validation.
Description
Function to fit a pathwise solution of sparse-group SLOPE (SGS) models using k-fold cross-validation. Supports both linear and logistic regression, with dense and sparse matrix implementations.
Usage
fit_sgs_cv(
  X,
  y,
  groups,
  type = "linear",
  lambda = "path",
  path_length = 20,
  min_frac = 0.05,
  alpha = 0.95,
  vFDR = 0.1,
  gFDR = 0.1,
  pen_method = 1,
  nfolds = 10,
  backtracking = 0.7,
  max_iter = 5000,
  max_iter_backtracking = 100,
  tol = 1e-05,
  standardise = "l2",
  intercept = TRUE,
  error_criteria = "mse",
  screen = TRUE,
  verbose = FALSE,
  v_weights = NULL,
  w_weights = NULL
)
Arguments
X: Input matrix of dimensions n x p. Can be dense or sparse.
y: Output vector of dimension n.
groups: A grouping structure for the input data. Should take the form of a vector of group indices.
type: The type of regression to perform. Supported values are: "linear" and "logistic".
lambda: The regularisation parameter. Defines the level of sparsity in the model; a higher value leads to a sparser model. The default "path" computes a path of regularisation parameters of length path_length, terminating at the point determined by min_frac (a sketch of the grid construction follows this argument list). Alternatively, a user-specified value or sequence of values can be supplied.
path_length: The number of lambda values to fit along the path.
min_frac: Defines the termination point of the pathwise solution, so that lambda_min = min_frac * lambda_max.
alpha: The value of alpha, which defines the convex balance between the variable (SLOPE) and group (gSLOPE) penalties. Must be between 0 and 1.
vFDR: Defines the desired variable false discovery rate (FDR) level, which determines the shape of the variable penalties. Must be between 0 and 1.
gFDR: Defines the desired group false discovery rate (FDR) level, which determines the shape of the group penalties. Must be between 0 and 1.
pen_method: An integer selecting the type of penalty sequences to use (see Feser and Evangelou (2023)).
nfolds: The number of folds to use in cross-validation.
backtracking: The backtracking parameter used in the ATOS backtracking line search.
max_iter: Maximum number of ATOS iterations to perform.
max_iter_backtracking: Maximum number of backtracking line search iterations to perform per global iteration.
tol: Convergence tolerance for the stopping criteria.
standardise: Type of standardisation to perform on X. The default, "l2", scales each column to have unit l2 norm.
intercept: Logical flag for whether to fit an intercept.
error_criteria: The criterion used to discriminate between models along the path. Supported values are: "mse" (mean squared error) and "mae" (mean absolute error).
screen: Logical flag for whether to apply screening rules (see Feser and Evangelou (2024)). Screening discards irrelevant groups before fitting, greatly improving speed.
verbose: Logical flag for whether to print fitting information.
v_weights: Optional vector of variable penalty weights. Overrides the penalties from pen_method if specified. Custom weights are multiplied internally by lambda and alpha.
w_weights: Optional vector of group penalty weights. Overrides the penalties from pen_method if specified. Custom weights are multiplied internally by lambda and 1 - alpha.
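As an illustration of how lambda, path_length, and min_frac interact, the sketch below builds a log-linear grid of regularisation parameters. The helper make_lambda_path and the value of lambda_max are hypothetical; fit_sgs_cv computes its own starting point and grid internally.

# Hypothetical sketch: a log-linear lambda grid of length path_length,
# running from lambda_max down to min_frac * lambda_max.
make_lambda_path <- function(lambda_max, path_length = 20, min_frac = 0.05) {
  exp(seq(log(lambda_max), log(min_frac * lambda_max), length.out = path_length))
}
make_lambda_path(lambda_max = 1, path_length = 5, min_frac = 0.05)
# approximately: 1.0000 0.4729 0.2236 0.1057 0.0500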
Details
Fits SGS models along a pathwise solution using adaptive three operator splitting (ATOS), selecting the 1se model as the optimum. Warm starts are implemented.
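The 1se rule selects the sparsest model whose cross-validation error lies within one standard error of the minimum. The sketch below illustrates this on made-up numbers; cv_error and cv_se are illustrative vectors, not the columns of the returned errors table.

# Illustrative 1se selection on a 5-model path ordered from sparsest to densest.
cv_error <- c(1.20, 0.90, 0.70, 0.68, 0.69)    # mean CV error per model
cv_se    <- c(0.10, 0.08, 0.05, 0.05, 0.06)    # standard error of each CV error
id_min    <- which.min(cv_error)               # lowest-error model (here, 4)
threshold <- cv_error[id_min] + cv_se[id_min]  # one standard error above the minimum
id_1se    <- min(which(cv_error <= threshold)) # sparsest model under the threshold (here, 3)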
Value
A list containing:
all_models: A list of all the models fitted along the path.
fit: The chosen 1se model, which is a "sgs" object.
best_lambda: The value of lambda for the chosen model.
best_lambda_id: The path index for the chosen model.
errors: A table containing fitting information about the models on the path.
type: Indicates which type of regression was performed.
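For example, after running the code in the Examples section below, the returned components can be inspected directly (component names as documented above):

cv_model$best_lambda     # regularisation value of the chosen 1se model
cv_model$best_lambda_id  # its position along the path
cv_model$errors          # fitting information for every model on the path
cv_model$fit             # the chosen model, a "sgs" object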
References
Feser, F., Evangelou, M. (2023). Sparse-group SLOPE: adaptive bi-level selection with FDR-control, https://arxiv.org/abs/2305.09467
Feser, F., Evangelou, M. (2024). Strong screening rules for group-based SLOPE models, https://arxiv.org/abs/2405.15357
See Also
Other model-selection: as_sgs(), fit_gslope_cv(), scaled_sgs()
Other SGS-methods: as_sgs(), coef.sgs(), fit_sgs(), plot.sgs(), predict.sgs(), print.sgs(), scaled_sgs()
Examples
# specify a grouping structure
groups = c(1, 1, 1, 2, 2, 3, 3, 3, 4, 4)
# generate data
data = gen_toy_data(p = 10, n = 5, groups = groups, seed_id = 3, group_sparsity = 1)
# run SGS with cross-validation (the proximal functions can be found in utils.R)
cv_model = fit_sgs_cv(
  X = data$X, y = data$y, groups = groups, type = "linear",
  path_length = 5, nfolds = 5, alpha = 0.95, vFDR = 0.1, gFDR = 0.1,
  min_frac = 0.05, standardise = "l2", intercept = TRUE, verbose = TRUE
)
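The chosen 1se model can then be passed to the SGS methods listed under See Also. A minimal sketch using only the print and coef generics (see predict.sgs for generating predictions):

# inspect the chosen 1se model (a "sgs" object)
print(cv_model$fit)
# extract its estimated coefficients via the coef method
coef(cv_model$fit)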