cv.aglm {aglm}    R Documentation

Fit an AGLM model with cross-validation for lambda

Description

A fitting function that fits an AGLM model with a given alpha and selects lambda by cross-validation. See aglm-package for more details on alpha and lambda.
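
A rough sketch of this (not from the package documentation): the call below assumes that alpha can be forwarded to cv.glmnet() through the ... argument described below, and that the selected lambda is stored in the lambda.min slot of the result, as in the Examples section.

## Minimal sketch with synthetic data; 'alpha' is assumed to be passed
## through '...' to cv.glmnet(), and lambda is tuned by cross-validation.
library(aglm)
set.seed(1)
x <- data.frame(u = runif(100), v = rnorm(100))         # quantitative predictors
y <- 2 * sin(4 * x$u) + x$v + rnorm(100, sd = 0.3)      # continuous response
model <- cv.aglm(x, y, family = "gaussian", alpha = 1)  # alpha fixed, lambda by CV
model@lambda.min                                        # lambda minimizing the CV error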

Usage

cv.aglm(
  x,
  y,
  qualitative_vars_UD_only = NULL,
  qualitative_vars_both = NULL,
  qualitative_vars_OD_only = NULL,
  quantitative_vars = NULL,
  use_LVar = FALSE,
  extrapolation = "default",
  add_linear_columns = TRUE,
  add_OD_columns_of_qualitatives = TRUE,
  add_interaction_columns = FALSE,
  OD_type_of_quantitatives = "C",
  nbin.max = NULL,
  bins_list = NULL,
  bins_names = NULL,
  family = c("gaussian", "binomial", "poisson"),
  keep = FALSE,
  ...
)

Arguments

x

A design matrix. See aglm for more details.

y

A response variable.

qualitative_vars_UD_only

Same as in aglm.

qualitative_vars_both

Same as in aglm.

qualitative_vars_OD_only

Same as in aglm.

quantitative_vars

Same as in aglm.

use_LVar

Same as in aglm.

extrapolation

Same as in aglm.

add_linear_columns

Same as in aglm.

add_OD_columns_of_qualitatives

Same as in aglm.

add_interaction_columns

Same as in aglm.

OD_type_of_quantitatives

Same as in aglm.

nbin.max

Same as in aglm.

bins_list

Same as in aglm.

bins_names

Same as in aglm.

family

Same as in aglm.

keep

Set to TRUE if you need the fit.preval field in the returned value, as in cv.glmnet().

...

Other arguments are passed directly to cv.glmnet(); see the sketch after this argument list.
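
A brief sketch of the keep and ... arguments above, under stated assumptions: nfolds is a cv.glmnet() argument assumed to be forwarded through ..., and the prevalidated fits are assumed to be stored in a fit.preval slot when keep = TRUE, as the description of keep suggests.

## Sketch only: 'nfolds' is assumed to be forwarded to cv.glmnet(),
## and 'fit.preval' is assumed to be kept because keep = TRUE.
library(aglm)
set.seed(1)
x <- data.frame(u = runif(100), v = rnorm(100))
y <- x$u + 0.5 * x$v + rnorm(100, sd = 0.2)
model <- cv.aglm(x, y, family = "gaussian", keep = TRUE, nfolds = 5)
str(model@fit.preval)   # prevalidated fits, as in cv.glmnet()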

Value

A model object fitted to the data, including cross-validation results. Functions such as predict and plot can be applied to the returned object, just as with the result of aglm(). See AccurateGLM-class for more details.

Author(s)

References

Suguru Fujita, Toyoto Tanaka, Kenji Kondo and Hirokazu Iwasawa (2020). AGLM: A Hybrid Modeling Method of GLM and Data Science Techniques. Actuarial Colloquium Paris 2020.
https://www.institutdesactuaires.com/global/gene/link.php?doc_id=16273&fg=1

Examples


#################### Cross-validation for lambda ####################

library(aglm)
library(faraway)

## Read data
xy <- nes96

## Split data into train and test
n <- nrow(xy) # Sample size.
set.seed(2018) # For reproducibility.
test.id <- sample(n, round(n/5)) # ID numbers for test data.
test <- xy[test.id,] # test is the data.frame for testing.
train <- xy[-test.id,] # train is the data.frame for training.
x <- train[, c("popul", "TVnews", "selfLR", "ClinLR", "DoleLR", "PID", "age", "educ", "income")]
y <- train$vote
newx <- test[, c("popul", "TVnews", "selfLR", "ClinLR", "DoleLR", "PID", "age", "educ", "income")]

# NOTE: The code below may take considerable time to run.


## Fit the model
model <- cv.aglm(x, y, family="binomial")

## Make the confusion matrix
lambda <- model@lambda.min
y_true <- test$vote
y_pred <- levels(y_true)[as.integer(predict(model, newx, s=lambda, type="class"))]

cat(sprintf("Confusion matrix for lambda=%.5f:\n", lambda))
print(table(y_true, y_pred))
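
A short follow-up sketch (not part of the original example): it assumes that predict() accepts type="response" for class probabilities, as in cv.glmnet(), and that a lambda.1se slot holds the more regularized lambda choice.

## Follow-up sketch: probabilities at the assumed lambda.1se value.
prob <- predict(model, newx, s=model@lambda.1se, type="response")
head(prob)   # estimated probabilities for the second level of y_true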


