loss_kl_divergence {keras3}    R Documentation

Computes the Kullback-Leibler divergence loss between y_true and y_pred.

Description

Formula:

loss <- y_true * log(y_true / y_pred)
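The elementwise terms of this formula are summed over the last axis, giving one loss value per sample. A minimal base-R sketch of that computation (plain matrices with hand-picked probabilities; this is an illustration only and ignores the clipping of inputs to a small positive range that the backend applies internally):

```r
# Two samples, three classes each; rows are assumed to be probability
# distributions that are strictly positive so log() is well defined.
y_true <- rbind(c(0.1, 0.7, 0.2),
                c(0.4, 0.4, 0.2))
y_pred <- rbind(c(0.2, 0.5, 0.3),
                c(0.3, 0.3, 0.4))

# Sum the per-element terms over the last axis: one value per sample.
rowSums(y_true * log(y_true / y_pred))
```

The result is a length-2 vector, matching the per-sample shape described under Value below.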

Usage

loss_kl_divergence(
  y_true,
  y_pred,
  ...,
  reduction = "sum_over_batch_size",
  name = "kl_divergence"
)

Arguments

y_true

Tensor of true targets.

y_pred

Tensor of predicted targets.

...

For forward/backward compatibility.

reduction

Type of reduction to apply to the loss. In almost all cases this should be "sum_over_batch_size". Supported options are "sum", "sum_over_batch_size", or NULL (no reduction).

name

Optional name for the loss instance.

Value

KL Divergence loss values with shape = [batch_size, d0, .. dN-1].

Examples

y_true <- random_uniform(c(2, 3), 0, 2)
y_pred <- random_uniform(c(2,3))
loss <- loss_kl_divergence(y_true, y_pred)
loss
## tf.Tensor([3.5312676 0.2128672], shape=(2), dtype=float32)

See Also

Other losses:
Loss()
loss_binary_crossentropy()
loss_binary_focal_crossentropy()
loss_categorical_crossentropy()
loss_categorical_focal_crossentropy()
loss_categorical_hinge()
loss_cosine_similarity()
loss_dice()
loss_hinge()
loss_huber()
loss_log_cosh()
loss_mean_absolute_error()
loss_mean_absolute_percentage_error()
loss_mean_squared_error()
loss_mean_squared_logarithmic_error()
loss_poisson()
loss_sparse_categorical_crossentropy()
loss_squared_hinge()
metric_binary_crossentropy()
metric_binary_focal_crossentropy()
metric_categorical_crossentropy()
metric_categorical_focal_crossentropy()
metric_categorical_hinge()
metric_hinge()
metric_huber()
metric_kl_divergence()
metric_log_cosh()
metric_mean_absolute_error()
metric_mean_absolute_percentage_error()
metric_mean_squared_error()
metric_mean_squared_logarithmic_error()
metric_poisson()
metric_sparse_categorical_crossentropy()
metric_squared_hinge()


[Package keras3 version 0.2.0 Index]