AdaBound {sjSDM}    R Documentation

AdaBound

Description

Adaptive gradient methods with a dynamic bound on the learning rate; see Luo et al. (2019) for details.

Usage

AdaBound(
  betas = c(0.9, 0.999),
  final_lr = 0.1,
  gamma = 0.001,
  eps = 1e-08,
  weight_decay = 0,
  amsbound = TRUE
)

Arguments

betas

exponential decay rates for the first and second moment estimates, e.g. c(0.9, 0.999)

final_lr

final (SGD-like) learning rate that the dynamic bounds converge to

gamma

convergence speed of the bound functions

eps

small constant added to the denominator for numerical stability

weight_decay

weight decay (L2 penalty) strength

amsbound

logical; if TRUE, the AMSBound variant of the algorithm is used

Value

An anonymous function that returns the optimizer when called.
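A minimal usage sketch, assuming the sjSDM package is installed with a working PyTorch backend: the constructor returned by AdaBound() is passed to the model through the control object. The use of simulate_SDM() and sjSDMControl() here reflects the package's documented workflow, but the specific settings are illustrative only.

```r
library(sjSDM)

## simulate a small community dataset (helper shipped with sjSDM)
com <- simulate_SDM(env = 3L, species = 5L, sites = 50L)

## build the optimizer constructor with custom settings
opt <- AdaBound(betas = c(0.9, 0.999), final_lr = 0.1, amsbound = TRUE)

## pass the optimizer to the model via the control argument
model <- sjSDM(Y = com$response, env = com$env_weights,
               control = sjSDMControl(optimizer = opt))
```

Because AdaBound() only returns a constructor, no torch objects are created until the model is actually fit.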

References

Luo, L., Xiong, Y., Liu, Y., & Sun, X. (2019). Adaptive gradient methods with dynamic bound of learning rate. arXiv preprint arXiv:1902.09843.


[Package sjSDM version 1.0.5 Index]