kld_est_kde1 {kldest}    R Documentation

1-D kernel density-based estimation of Kullback-Leibler divergence

Description

This estimation method approximates the densities of the unknown distributions P and Q by kernel density estimates, using the function 'density' from package 'stats'. Only the two-sample problem, not the one-sample problem, is implemented.
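
In density form, the quantity being estimated is

    D_{KL}(P||Q) = \int p(x) \log(p(x)/q(x)) \, dx,

and the plug-in estimator \hat D_{KL}(P||Q) replaces the unknown densities p and q in this integral by their kernel density estimates.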

Usage

kld_est_kde1(X, Y, MC = FALSE, ...)

Arguments

X, Y

Numeric vectors or single-column matrices, representing samples from the true distribution P and the approximate distribution Q, respectively.

MC

A logical: if TRUE, a Monte Carlo approximation is used instead of numerical integration via the trapezoidal rule (default: FALSE); see the sketch in Details below.

...

Further parameters passed on to stats::density (e.g., argument bw; see Examples).
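
Details

By default (MC = FALSE), the plug-in integral is evaluated by numerical integration via the trapezoidal rule; with MC = TRUE, a Monte Carlo approximation is used instead. The following minimal sketch shows one way both strategies can be realized in base R. It is not the package's actual implementation: the function name kld_kde1_sketch, the shared evaluation grid, and averaging over the sample X in the Monte Carlo branch are illustrative assumptions.

kld_kde1_sketch <- function(X, Y, MC = FALSE, ...) {
  lo <- min(X, Y); hi <- max(X, Y)
  dX <- density(X, from = lo, to = hi, ...)  # kernel estimate of p
  dY <- density(Y, from = lo, to = hi, ...)  # kernel estimate of q
  if (MC) {
    # Monte Carlo: average log(p/q) over the sample X drawn from P
    p <- approx(dX$x, dX$y, xout = X)$y
    q <- approx(dY$x, dY$y, xout = X)$y
    mean(log(p / q))
  } else {
    # trapezoidal rule on the common grid; integrand set to 0 where a
    # density estimate vanishes numerically
    x <- dX$x; p <- dX$y; q <- dY$y
    f <- ifelse(p > 0 & q > 0, p * log(p / q), 0)
    sum(diff(x) * (head(f, -1) + tail(f, -1)) / 2)
  }
}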

Value

A scalar, the estimated Kullback-Leibler divergence \hat D_{KL}(P||Q).

Examples

# KL divergence between two samples from 1-D Gaussians:
set.seed(0)
X <- rnorm(100)
Y <- rnorm(100, mean = 1, sd = 2)
# analytical value for reference (kld_gaussian takes variances, hence
# sigma2 = 2^2 for sd = 2):
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)
kld_est_kde1(X, Y)
kld_est_kde1(X, Y, MC = TRUE)
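
# further arguments are forwarded to stats::density; e.g., select the
# bandwidth via the Sheather-Jones rule (illustrative, results will vary):
kld_est_kde1(X, Y, bw = "SJ")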

[Package kldest version 1.0.0]