ko.sel {kosel}    R Documentation
Variable selection with the knockoffs procedure.
Description
Performs variable selection from an object (vector of statistics W) returned by ko.glm or ko.ordinal.
Usage
ko.sel(W, print = FALSE, method = "stats")
Arguments
W
A vector of length nvars corresponding to the statistics W. Object returned by the functions ko.glm or ko.ordinal.
print
Logical. If TRUE, the positive statistics W are plotted.
method
Can be 'stats' (default), 'gaps' or 'manual'. With 'stats' the W-threshold of Gegout-Petit et al. (2019) is used, with 'gaps' the gaps-threshold; with 'manual', the user chooses the threshold from the graph of the positive statistics W (see the Examples).
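A minimal sketch of the three options for method (illustrative calls, assuming W is a vector of statistics returned by ko.glm or ko.ordinal):

ko.sel(W)                      # default, method = "stats"
ko.sel(W, method = "gaps")     # gaps-based threshold
ko.sel(W, method = "manual")   # choose the threshold from the plot of the positive W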
Value
A list containing two elements:
threshold
A positive real value corresponding to the threshold used.
estimation
A binary vector of length nvars corresponding to the variable selection: 1*(W >= threshold). A value of 1 indicates that the associated covariate belongs to the estimated model.
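For instance, the selected covariates can be recovered from the returned list as follows (a sketch, assuming W has been computed by ko.glm or ko.ordinal):

res = ko.sel(W)
res$threshold                  # threshold that was used
which(res$estimation == 1)     # indices of the selected covariates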
References
Gégout-Petit Anne, Gueudin Aurélie, Karmann Clémence (2019). The revisited knockoffs method for variable selection in L1-penalised regressions. arXiv:1907.03153.
See Also
ko.glm, ko.ordinal
Examples
library(graphics)
# linear Gaussian regression
n = 100
p = 20
set.seed(11)
x = matrix(rnorm(n*p),nrow = n,ncol = p)
beta = c(rep(1,5),rep(0,15))
y = x%*%beta + rnorm(n)
W = ko.glm(x,y)
ko.sel(W, print = TRUE)
# logistic regression
n = 100
p = 20
set.seed(11)
x = matrix(runif(n*p, -1,1),nrow = n,ncol = p)
u = runif(n)
beta = c(c(3:1),rep(0,17))
y = rep(0, n)
a = 1/(1+exp(0.1-x%*%beta))
y = 1*(u>a)
W = ko.glm(x,y, family = 'binomial', nVal = 50)
ko.sel(W, print = TRUE)
# cumulative logit regression
n = 100
p = 10
set.seed(11)
x = matrix(runif(n*p),nrow = n,ncol = p)
u = runif(n)
beta = c(3,rep(0,9))
y = rep(0, n)
a = 1/(1+exp(0.8-x%*%beta))
b = 1/(1+exp(-0.6-x%*%beta))
y = 1*(u<a) + 2*((u>=a) & (u<b)) + 3*(u>=b)
W = ko.ordinal(x,as.factor(y), nVal = 20)
ko.sel(W, print = TRUE)
# adjacent logit regression
n = 100
p = 10
set.seed(11)
x = matrix(rnorm(n*p),nrow = n,ncol = p)
U = runif(n)
beta = c(5,rep(0,9))
alpha = c(-2,1.5)
M = 2
y = rep(0, n)
for(i in 1:n){
eta = alpha + sum(beta*x[i,])
u = U[i]
Prob = rep(1,M+1)
for(j in 1:M){
Prob[j] = exp(sum(eta[j:M]))
}
Prob = Prob/sum(Prob)
C = cumsum(Prob)
C = c(0,C)
j = 1
while((C[j]> u) || (u >= C[j+1])){j = j+1}
y[i] = j
}
W = ko.ordinal(x,as.factor(y), family = 'acat', nVal = 10)
ko.sel(W, method = 'manual')
# with method = 'manual', the threshold is entered interactively at the prompt; here, type:
0.4
# How to use randomness? Repeat the procedure with random knockoffs
# and count how often each covariate is selected.
n = 100
p = 20
set.seed(11)
x = matrix(rnorm(n*p),nrow = n,ncol = p)
beta = c(5:1,rep(0,15))
y = x%*%beta + rnorm(n)
Esti = 0
for(i in 1:100){
W = ko.glm(x,y, random = TRUE)
Esti = Esti + ko.sel(W, method = 'gaps')$estimation
}
Esti
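# Esti contains, for each covariate, the number of runs (out of 100) in which
# it was selected. One illustrative way to obtain a final selection is to keep
# the covariates selected in a majority of runs (this cut-off is an assumption,
# not prescribed by the package):
which(Esti > 50)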