LmNN {deeptrafo} | R Documentation
Deep normal linear regression
Description
Deep normal linear regression
Usage
LmNN(
  formula,
  data,
  response_type = get_response_type(data[[all.vars(formula)[1]]]),
  order = get_order(response_type, data[[all.vars(formula)[1]]]),
  addconst_interaction = 0,
  latent_distr = "normal",
  monitor_metrics = NULL,
  trafo_options = trafo_control(order_bsp = 1L, response_type = response_type,
    y_basis_fun = eval_lin, y_basis_fun_lower = .empty_fun(eval_lin),
    y_basis_fun_prime = eval_lin_prime, basis = "shiftscale"),
  ...
)
Arguments
formula
Formula specifying the response, interacting, and shifting terms
as response | interacting ~ shifting.
data
Named list or data.frame which may contain both structured and
unstructured data.
response_type
Character; type of response. One of "continuous", "survival",
"count", or "ordered".
order
Integer; order of the response basis. Default 10 for Bernstein basis or
number of levels minus one for ordinal responses.
addconst_interaction
Positive constant; a constant added to the additive predictor of the
interaction term. If NULL, terms are left unchanged.
latent_distr
A tfd_distribution or character; the base distribution for the
transformation model. If character, can be "normal", "logistic",
"gumbel", or "gompertz".
monitor_metrics
See deepregression.
trafo_options
Options for transformation models, such as the basis function used;
see trafo_control for more details.
...
Additional arguments passed to deeptrafo.
Value
See the return value of deeptrafo.
Examples
set.seed(1)
df <- data.frame(y = 10 + rnorm(50), x = rnorm(50))
if (reticulate::py_module_available("tensorflow") &&
    reticulate::py_module_available("keras") &&
    reticulate::py_module_available("tensorflow_probability")) {
  m <- LmNN(y ~ 0 + x, data = df)
  optimizer <- optimizer_adam(learning_rate = 0.01, decay = 4e-4)
  m <- LmNN(y ~ 0 + x, data = df, optimizer = optimizer)
  fit(m, epochs = 900L, validation_split = 0)
  library(tram)
  logLik(mm <- Lm(y ~ x, data = df)); logLik(m)
  coef(mm, with_baseline = TRUE)
  unlist(c(coef(m, which = "interacting"), coef(m, which = "shifting")))
}
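
A further sketch, not part of the original page: the latent_distr argument above suggests the base distribution can be swapped at construction time. The snippet below assumes the simulated df from the example and a working TensorFlow setup; the epoch count is illustrative only.

```r
# Hedged sketch: refit the same shift-scale model with a logistic
# base distribution instead of the default standard normal.
if (reticulate::py_module_available("tensorflow") &&
    reticulate::py_module_available("keras") &&
    reticulate::py_module_available("tensorflow_probability")) {
  m_logis <- LmNN(y ~ 0 + x, data = df, latent_distr = "logistic")
  fit(m_logis, epochs = 100L, validation_split = 0)
  logLik(m_logis)  # compare against the normal-base fit above
}
```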