get_pycox_optim {survivalmodels}    R Documentation
Get Pytorch Optimizer
Description
Helper function to return a constructed pytorch optimizer from torch.optim.
Usage
get_pycox_optim(
  optimizer = "adam",
  net,
  rho = 0.9,
  eps = 1e-08,
  lr = 1,
  weight_decay = 0,
  learning_rate = 0.01,
  lr_decay = 0,
  betas = c(0.9, 0.999),
  amsgrad = FALSE,
  lambd = 1e-04,
  alpha = 0.75,
  t0 = 1e+06,
  momentum = 0,
  centered = TRUE,
  etas = c(0.5, 1.2),
  step_sizes = c(1e-06, 50),
  dampening = 0,
  nesterov = FALSE
)
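A minimal usage sketch (not part of this help page): it assumes pytorch is available to reticulate, for example after install_pycox(install_torch = TRUE), and that a network has been built with build_pytorch_net; the n_in/n_out values below are illustrative only.

library(survivalmodels)

# Build a simple network whose parameters will be optimized.
net <- build_pytorch_net(n_in = 10L, n_out = 1L)

# Construct an Adam optimizer over the network's parameters.
opt <- get_pycox_optim(
  optimizer = "adam",
  net = net,
  learning_rate = 0.001,
  weight_decay = 1e-4
)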
Arguments
optimizer |
|
net |
|
rho , lr , lr_decay |
|
eps |
|
weight_decay |
|
learning_rate |
|
betas |
|
amsgrad |
|
lambd , t0 |
|
alpha |
|
momentum |
|
centered |
|
etas , step_sizes |
|
dampening |
|
nesterov |
|
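Only the arguments relevant to the selected method are forwarded to the underlying torch.optim constructor. As a sketch (reusing the net object from the example above), an SGD optimizer with Nesterov momentum uses only learning_rate, momentum, dampening, weight_decay, and nesterov; arguments tied to other methods are ignored.

# SGD with Nesterov momentum; e.g. betas, rho, and alpha are ignored here.
opt_sgd <- get_pycox_optim(
  optimizer = "sgd",
  net = net,
  learning_rate = 0.1,
  momentum = 0.9,
  nesterov = TRUE
)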
Details
Implemented methods (with help pages) are:

  - "adadelta": reticulate::py_help(torch$optim$Adadelta)
  - "adagrad": reticulate::py_help(torch$optim$Adagrad)
  - "adam": reticulate::py_help(torch$optim$Adam)
  - "adamax": reticulate::py_help(torch$optim$Adamax)
  - "adamw": reticulate::py_help(torch$optim$AdamW)
  - "asgd": reticulate::py_help(torch$optim$ASGD)
  - "rmsprop": reticulate::py_help(torch$optim$RMSprop)
  - "rprop": reticulate::py_help(torch$optim$Rprop)
  - "sgd": reticulate::py_help(torch$optim$SGD)
  - "sparse_adam": reticulate::py_help(torch$optim$SparseAdam)
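For example, to open the PyTorch help page for one of these methods from R (assuming the torch Python module can be imported via reticulate):

# Import the Python torch module and display the RMSprop documentation.
torch <- reticulate::import("torch")
reticulate::py_help(torch$optim$RMSprop)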
Value

A constructed pytorch optimizer from torch$optim.