kld {mlf}	R Documentation

Kullback-Leibler Divergence

Description

Estimates the Kullback-Leibler divergence between two probability distributions, computed as the difference between the cross-entropy of the two distributions and the entropy of the first (reference) distribution.
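
Details

For a reference distribution p and a comparison distribution q defined over the same bins, the Kullback-Leibler divergence equals the cross-entropy of p relative to q minus the entropy of p. The sketch below illustrates this relationship for a pair of hypothetical two-bin distributions; it is an illustration only, not the package's internal implementation.

p <- c(0.5, 0.5)                   # hypothetical reference distribution
q <- c(0.75, 0.25)                 # hypothetical comparison distribution
entropy <- -sum(p * log(p))        # H(p)
cross_entropy <- -sum(p * log(q))  # H(p, q)
cross_entropy - entropy            # equals sum(p * log(p / q))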

Usage

kld(x, y, bins)

Arguments

x, y

numeric or discrete data vectors

bins

number of bins used to discretise numeric data; not required when x and y are discrete (see Examples)

Examples

# Sample numeric vectors
a <- rnorm(25, 80, 35)
b <- rnorm(25, 90, 35)
mlf::kld(a, b, bins = 2)

# Sample discrete vectors
a <- as.factor(c(1,1,2,2))
b <- as.factor(c(1,1,1,2))
mlf::kld(a, b)
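
# Hand computation of the discrete example above, assuming empirical bin
# frequencies and natural logarithms (the package's exact estimator may differ):
p <- table(a) / length(a)   # frequencies of a: 0.50, 0.50
q <- table(b) / length(b)   # frequencies of b: 0.75, 0.25
sum(p * log(p / q))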

[Package mlf version 1.2.1 Index]