kld_est_kde {kldest}        R Documentation
Kernel density-based Kullback-Leibler divergence estimation in any dimension
Description
Disclaimer: this function computes kernel density estimates directly, without binning or fast Fourier transform acceleration, and is therefore extremely slow even for moderately sized datasets. For this reason, it is currently not exported.
Usage
kld_est_kde(X, Y, hX = NULL, hY = NULL, rule = c("Silverman", "Scott"))
Arguments
X, Y
n-by-d and m-by-d matrices, representing n samples from the true distribution P and m samples from the approximate distribution Q, both in d dimensions. Vector input is treated as a column matrix.

hX, hY
Positive scalars or length-d vectors, representing bandwidth parameters (possibly different in each component) for the kernel density estimates of P and Q, respectively. If unspecified, they are computed via the heuristic specified in rule.

rule
A heuristic for computing the bandwidths hX and/or hY. The default "Silverman" is Silverman's rule of thumb, h_i = \sigma_i (4/((d+2)n))^{1/(d+4)}. As an alternative, Scott's rule, h_i = \sigma_i n^{-1/(d+4)}, can be used (a sketch of the default rule follows below).
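For illustration, a minimal sketch of Silverman's rule as stated above; silverman_bw is a hypothetical helper, not part of kldest:

# Minimal sketch of Silverman's rule of thumb (hypothetical helper, not in kldest):
silverman_bw <- function(X) {
  X <- as.matrix(X)                 # treat vector input as a column matrix
  n <- nrow(X); d <- ncol(X)
  apply(X, 2, sd) * (4 / ((d + 2) * n))^(1 / (d + 4))  # one bandwidth per component
}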
Details
This estimation method approximates the densities of the unknown distributions P and Q by kernel density estimates, using a sample size- and dimension-dependent bandwidth parameter and a Gaussian kernel. It works for any number of dimensions but is very slow.
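Concretely, the plug-in estimator averages the log density ratio over the sample from P: \hat D_{KL}(P||Q) = 1/n \sum_{i=1}^n \log(\hat p(X_i)/\hat q(X_i)). The following is a minimal sketch of this idea, assuming Gaussian product kernels; kld_kde_sketch is hypothetical, and the package's internal code may differ:

kld_kde_sketch <- function(X, Y, hX, hY) {
  X <- as.matrix(X); Y <- as.matrix(Y)
  d <- ncol(X)
  hX <- rep(hX, length.out = d)  # recycle scalar bandwidths to one per component
  hY <- rep(hY, length.out = d)
  # Gaussian product-kernel density estimate from `data`, evaluated at rows of `x`
  kde <- function(x, data, h) {
    apply(x, 1, function(xi) {
      z <- sweep(sweep(data, 2, xi), 2, h, "/")  # componentwise (data_j - xi) / h
      mean(exp(-0.5 * rowSums(z^2))) / prod(h * sqrt(2 * pi))
    })
  }
  pX <- kde(X, X, hX)  # \hat p evaluated at the sample points of P
  qX <- kde(X, Y, hY)  # \hat q evaluated at the same points
  mean(log(pX / qX))   # plug-in estimate of D_KL(P||Q)
}

Evaluating both density estimates at all n sample points costs O(n (n + m) d) kernel evaluations, which makes the slowness noted in the description explicit.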
Value
A scalar, the estimated Kullback-Leibler divergence \hat D_{KL}(P||Q).
Examples
# KL-D between two samples from 1-D Gaussians:
set.seed(0)
X <- rnorm(100)
Y <- rnorm(100, mean = 1, sd = 2)
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)  # analytical value
kld_est_kde1(X, Y)   # kernel density-based estimate (1-D)
kld_est_nn(X, Y)     # nearest neighbour-based estimate
kld_est_brnn(X, Y)   # bias-reduced nearest neighbour-based estimate
# KL-D between two samples from 2-D Gaussians:
set.seed(0)
X1 <- rnorm(100)
X2 <- rnorm(100)
Y1 <- rnorm(100)
Y2 <- Y1 + rnorm(100)
X <- cbind(X1,X2)
Y <- cbind(Y1,Y2)
kld_gaussian(mu1 = rep(0,2), sigma1 = diag(2),
             mu2 = rep(0,2), sigma2 = matrix(c(1,1,1,2), nrow = 2))  # analytical value
kld_est_kde2(X, Y)   # kernel density-based estimate (2-D)
kld_est_nn(X, Y)     # nearest neighbour-based estimate
kld_est_brnn(X, Y)   # bias-reduced nearest neighbour-based estimate
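With the hypothetical helpers sketched above, a quick plausibility check on the same 2-D samples would read:

kld_kde_sketch(X, Y, hX = silverman_bw(X), hY = silverman_bw(Y))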