tune {xnet}    R Documentation
Tune the lambda parameters for a tskrr model
Description
This function lets you tune the lambda parameter(s) of a two-step
kernel ridge regression model for optimal performance. You can either
tune a previously fitted tskrr model, or pass the label matrix and
kernel matrices to fit and tune a model in one go.
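A minimal sketch of both entry points, using the drugtarget example data that also appears in the Examples below (the ngrid value is an arbitrary choice):

library(xnet)
data(drugtarget)

# Option 1: tune a previously fitted model
mod   <- tskrr(drugTargetInteraction, targetSim, drugSim)
tuned <- tune(mod, lim = c(1e-04, 1), ngrid = 20)

# Option 2: fit and tune in one go from the label matrix and kernel matrices
tuned2 <- tune(drugTargetInteraction, k = targetSim, g = drugSim,
               lim = c(1e-04, 1), ngrid = 20)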
Usage
## S4 method for signature 'tskrrHomogeneous'
tune(
  x,
  lim = c(1e-04, 1),
  ngrid = 10,
  lambda = NULL,
  fun = loss_mse,
  exclusion = "edges",
  replaceby0 = FALSE,
  onedim = TRUE,
  ...
)

## S4 method for signature 'tskrrHeterogeneous'
tune(
  x,
  lim = c(1e-04, 1),
  ngrid = 10,
  lambda = NULL,
  fun = loss_mse,
  exclusion = "interaction",
  replaceby0 = FALSE,
  onedim = FALSE,
  ...
)

## S4 method for signature 'matrix'
tune(
  x,
  k,
  g = NULL,
  lim = c(1e-04, 1),
  ngrid = 10,
  lambda = NULL,
  fun = loss_mse,
  exclusion = "interaction",
  replaceby0 = FALSE,
  testdim = TRUE,
  testlabels = TRUE,
  symmetry = c("auto", "symmetric", "skewed"),
  keep = FALSE,
  onedim = is.null(g),
  ...
)
Arguments
x: a tskrr model, or a label matrix when fitting and tuning a model in one go (matrix method).

lim: a vector with 2 values that give the boundaries for the domain in which lambda is searched, or possibly a list with 2 elements. See Details.

ngrid: a single numeric value giving the number of points in a single dimension of the grid, or possibly a list with 2 elements. See Details.

lambda: a vector with the lambdas that need checking for homogeneous networks, or possibly a list with two elements for heterogeneous networks. See Details. Defaults to NULL.

fun: a loss function that takes the label matrix Y and the result of the cross-validation LOO as input. The function name can be passed as a character string as well.

exclusion: a character value with possible values "interaction", "row", "column" or "both" for heterogeneous models, and "edges", "vertices", "interaction" or "both" for homogeneous models. Defaults to "interaction"; the homogeneous method defaults to "edges" (see Usage). See Details.

replaceby0: a logical value indicating whether the interaction should be simply removed (FALSE) or replaced by 0 (TRUE).

onedim: a logical value indicating whether the search should be done in a single dimension. See Details.

...: arguments to be passed to the loss function.

k: a kernel matrix for the rows.

g: an optional kernel matrix for the columns.

testdim: a logical value indicating whether symmetry and the dimensions of the kernel(s) should be tested. Defaults to TRUE.

testlabels: a logical value indicating whether the row and column names of the matrices have to be checked for consistency. Defaults to TRUE.

symmetry: a character value with the possibilities "auto", "symmetric" or "skewed". In the case of a homogeneous fit, you can either specify whether the label matrix is symmetric or skewed, or you can let the function decide (option "auto").

keep: a logical value indicating whether the kernel hat matrices should be stored in the model object. Doing so makes the model object considerably larger, but can speed up predictions in some cases. Defaults to FALSE.
Details
This function currently only performs a simple grid search for all
(combinations of) lambda values. If no specific lambda values are
provided, the function uses create_grid to create an evenly spaced
grid (on a logarithmic scale).
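As an illustration (assuming create_grid() accepts the same lim and ngrid arguments shown in the Usage section above), the default grid can be built and inspected directly:

# a logarithmically spaced grid of 10 candidate lambda values
grid <- create_grid(lim = c(1e-04, 1), ngrid = 10)
grid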
In the case of a heterogeneous network, you can specify different values
for the two parameters that need tuning. To do so, provide a list with
the settings for each parameter to the arguments lim, ngrid and/or
lambda. If you try this for a homogeneous network, the function will
return an error.

Alternatively, you can speed up the grid search by searching in a
single dimension. When onedim = TRUE, the search for a heterogeneous
network will only consider cases where both lambda values are equal.
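For example, a sketch of both approaches for the heterogeneous model mod fitted in the sketch under Description (the order of the list elements, row kernel first, is an assumption here):

# separate search settings for the two lambda parameters
tuned_2d <- tune(mod,
                 lim   = list(c(1e-04, 1), c(1e-02, 10)),
                 ngrid = list(10, 5))

# faster one-dimensional search: both lambdas are kept equal
tuned_1d <- tune(mod, lim = c(1e-04, 1), ngrid = 20, onedim = TRUE)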
The arguments exclusion and replaceby0 are used by the function
get_loo_fun to find the correct leave-one-out function.
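For instance, a sketch of alternative leave-one-out schemes for the heterogeneous model mod (see the exclusion and replaceby0 entries under Arguments):

# leave out whole rows instead of single interactions
tuned_row <- tune(mod, exclusion = "row")

# leave out single interactions, but replace them by 0 instead of removing them
tuned_0 <- tune(mod, exclusion = "interaction", replaceby0 = TRUE)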
By default, the function uses the standard mean squared error based on
the cross-validation results as the measure for optimization. However,
you can provide a custom function if needed, as long as it takes two
matrices as input: Y being the observed interactions and LOO being the
result of the chosen cross-validation.
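A minimal sketch of such a custom loss function (mean absolute error; assuming, like loss_mse, that lower values indicate a better fit):

# custom loss: mean absolute error between observations and LOO predictions
loss_mae <- function(Y, LOO, ...) {
  mean(abs(Y - LOO))
}

tuned_mae <- tune(mod, fun = loss_mae)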
Value
A model of class tskrrTune.
See Also
- loo, loo_internal and get_loo_fun for more information on how leave-one-out validation works.
- tskrr for fitting a two-step kernel ridge regression.
- loss_functions for different loss functions.
Examples
data(drugtarget)

mod <- tskrr(drugTargetInteraction, targetSim, drugSim)
tuned <- tune(mod, lim = c(0.1, 1), ngrid = list(5, 10),
              fun = loss_auc)

## Not run:
# Visualize the loss values over the search grid.
# This part can be run safely.
gridvals <- get_grid(tuned)
z <- get_loss_values(tuned)   # loss values

image(gridvals$k, gridvals$g, z, log = 'xy',
      xlab = "lambda k", ylab = "lambda g",
      col = rev(heat.colors(20)))

## End(Not run)