plot.cv.npmr {npmr}    R Documentation
Visualize the regression coefficient matrix fit by cross-validated NPMR
Description
Plots features (in orange) by their weights on the first two latent variables in the singular value decomposition of the regression coefficient matrix, and plots response classes (as blue arrows) by their loadings on the first two latent variables. The plot uses the regression coefficient matrix fit with the value of lambda that minimized the cross-validation error among all values tried.
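For intuition only, the sketch below shows how such a biplot can be built from the singular value decomposition of a coefficient matrix. The matrix B and the helper function are hypothetical stand-ins for illustration; they are not the code used internally by plot.cv.npmr, which may scale and label points differently.

## Illustrative sketch only: assumes a p x K coefficient matrix B
## (features in rows, response classes in columns); not plot.cv.npmr itself
sketch_biplot <- function(B) {
  s <- svd(B)
  U <- s$u[, 1:2, drop = FALSE]   # feature weights on the first two latent variables
  V <- s$v[, 1:2, drop = FALSE]   # class loadings on the first two latent variables
  xlim <- range(c(U[, 1], V[, 1], 0))
  ylim <- range(c(U[, 2], V[, 2], 0))
  plot(U[, 1], U[, 2], col = "orange", pch = 16, xlim = xlim, ylim = ylim,
       xlab = "Latent variable 1", ylab = "Latent variable 2")
  arrows(0, 0, V[, 1], V[, 2], col = "blue", length = 0.1)   # classes drawn as arrows
}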
Usage
## S3 method for class 'cv.npmr'
plot(x, feature.names = TRUE, ...)
Arguments
x
an object of class cv.npmr

feature.names
logical. Should the names of the covariates be used in the plot? If FALSE, a standard plotting symbol is used instead.

...
additional arguments to be passed to the underlying plot call
Author(s)
Scott Powers, Trevor Hastie, Rob Tibshirani
References
Scott Powers, Trevor Hastie and Rob Tibshirani (2016). “Nuclear penalized multinomial regression with an application to predicting at bat outcomes in baseball.” In prep.
See Also
cv.npmr
Examples
# Fit NPMR to simulated data
K = 5
n = 1000
m = 10000
p = 10
r = 2
# Simulated training data
set.seed(8369)
A = matrix(rnorm(p*r), p, r)
C = matrix(rnorm(K*r), K, r)
B = tcrossprod(A, C) # low-rank coefficient matrix
X = matrix(rnorm(n*p), n, p) # covariate matrix with iid Gaussian entries
eta = X %*% B # linear predictor from the low-rank coefficient matrix
P = exp(eta)/rowSums(exp(eta))
Y = t(apply(P, 1, rmultinom, n = 1, size = 1))
fold = sample(rep(1:10, length = nrow(X))) # fold assignments for 10-fold CV
# Simulate test data
Xtest = matrix(rnorm(m*p), m, p)
etatest = Xtest %*% B
Ptest = exp(etatest)/rowSums(exp(etatest))
Ytest = t(apply(Ptest, 1, rmultinom, n = 1, size = 1))
# Fit NPMR with 10-fold cross-validation over a sequence of lambda values:
fit2 = cv.npmr(X, Y, lambda = exp(seq(7, -2)), foldid = fold)
# Produce a biplot:
plot(fit2)
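# As documented in the Arguments section, feature.names = FALSE plots the
# covariates with a standard plotting symbol instead of their names:
plot(fit2, feature.names = FALSE)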