cv.KLR {calibrateBinary} R Documentation

## K-fold cross-validation for Kernel Logistic Regression

### Description

The function performs K-fold cross-validation for kernel logistic regression to estimate the tuning parameters.

### Usage

cv.KLR(X, y, K = 5, lambda = seq(0.001, 0.2, 0.005), kernel = c("matern",
"exponential"), nu = 1.5, power = 1.95, rho = seq(0.05, 0.5, 0.05))


### Arguments

- `X`: input for KLR.
- `y`: input for KLR.
- `K`: a positive integer specifying the number of folds. The default is 5.
- `lambda`: a vector of `lambda` values at which the CV curve will be computed.
- `kernel`: input for KLR.
- `nu`: input for KLR.
- `power`: input for KLR.
- `rho`: a vector of `rho` values at which the CV curve will be computed.

### Details

This function performs K-fold cross-validation for kernel logistic regression. The CV curve is computed over the grid of tuning-parameter values given by `lambda` and `rho`, and the number of folds is given by `K`.
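As a sketch of how the grids are supplied, the call below uses small illustrative grids for `lambda` and `rho` (these particular values are examples, not recommended defaults):

```r
library(calibrateBinary)

set.seed(1)
x <- seq(0, 1, length.out = 10)
y <- rbinom(10, 1, 0.5)          # illustrative binary responses

# 5-fold CV over user-supplied grids; the returned values are the
# grid points that minimize the CV error.
cv.out <- cv.KLR(x, y, K = 5,
                 lambda = c(0.01, 0.05, 0.1),
                 rho = c(0.1, 0.3, 0.5))
cv.out$lambda
cv.out$rho
```

The selected `lambda` and `rho` can then be passed directly to `KLR`.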

### Value

- `lambda`: the value of `lambda` that gives the minimum CV error.
- `rho`: the value of `rho` that gives the minimum CV error.

### Author(s)

Chih-Li Sung <iamdfchile@gmail.com>

### See Also

KLR for performing a kernel logistic regression with given lambda and rho.

### Examples

library(calibrateBinary)

set.seed(1)
np <- 10
xp <- seq(0,1,length.out = np)
eta_fun <- function(x) exp(exp(-0.5*x)*cos(3.5*pi*x)-1) # true probability function
eta_x <- eta_fun(xp)
yp <- rep(0, np)
for(i in 1:np) yp[i] <- rbinom(1, 1, eta_x[i])   # binary responses

x.test <- seq(0,1,0.001)
etahat <- KLR(xp,yp,x.test)

plot(xp,yp)
curve(eta_fun, col = "blue", lty = 2, add = TRUE)
lines(x.test, etahat, col = 2)

#####      cross-validation with K=5      #####
##### to determine the parameters lambda and rho #####

cv.out <- cv.KLR(xp,yp,K=5)
print(cv.out)

etahat.cv <- KLR(xp,yp,x.test,lambda=cv.out$lambda,rho=cv.out$rho)

plot(xp,yp)
curve(eta_fun, col = "blue", lty = 2, add = TRUE)
lines(x.test, etahat, col = 2)
lines(x.test, etahat.cv, col = 3)



[Package calibrateBinary version 0.1 Index]