kld_est_kde2 {kldest}		R Documentation
2-D kernel density-based estimation of Kullback-Leibler divergence
Description
This estimation method approximates the densities of the unknown bivariate
distributions P and Q by kernel density estimates, using function 'bkde2D'
from package 'KernSmooth'. If 'KernSmooth' is not installed, a message is
issued and the (much) slower function 'kld_est_kde' is used instead.
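The gist of the method can be sketched in a few lines of R. This is a
simplified illustration only, not the package's implementation: the helper
name kld_kde2_sketch, the fixed bandwidth h, the grid size, and the plain
Riemann sum are assumptions made for brevity (the actual function derives
bandwidths from the data and integrates via the trapezoidal rule or Monte
Carlo, per the arguments below).

# Minimal sketch, assuming KernSmooth is installed:
library(KernSmooth)

kld_kde2_sketch <- function(X, Y, h = c(0.5, 0.5), n_grid = 101, eps = 1e-5) {
  # common evaluation grid covering both samples
  rx <- range(X[, 1], Y[, 1])
  ry <- range(X[, 2], Y[, 2])
  fp <- bkde2D(X, bandwidth = h, gridsize = c(n_grid, n_grid),
               range.x = list(rx, ry))
  fq <- bkde2D(Y, bandwidth = h, gridsize = c(n_grid, n_grid),
               range.x = list(rx, ry))
  p <- fp$fhat
  q <- pmax(fq$fhat, eps)            # keep log(p/q) finite where q ~ 0
  integrand <- ifelse(p > 0, p * log(p / q), 0)
  dx <- fp$x1[2] - fp$x1[1]
  dy <- fp$x2[2] - fp$x2[1]
  sum(integrand) * dx * dy           # Riemann sum over the grid
}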
Usage
kld_est_kde2(
  X,
  Y,
  MC = FALSE,
  hX = NULL,
  hY = NULL,
  rule = c("Silverman", "Scott"),
  eps = 1e-05
)
Arguments
X, Y
    Two-column matrices of samples, one sample per row: X from the true
    bivariate distribution P, Y from the approximate distribution Q.

MC
    A boolean: use a Monte Carlo approximation instead of numerical
    integration via the trapezoidal rule (default: FALSE)?

hX, hY
    Bandwidths for the kernel density estimates of P and Q, respectively
    (default: NULL, in which case both are derived from the data via rule).

rule
    A heuristic to derive the bandwidth parameters hX and hY; either
    "Silverman" (the default) or "Scott" (see the sketch after this list).

eps
    A nonnegative scalar; if eps > 0, the density estimate of Q is bounded
    away from zero by this amount so that the log ratio stays finite
    (default: 1e-05).
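As an illustration of how such bandwidth rules of thumb are commonly
defined (the package's exact constants may differ), the following
hypothetical helper computes per-dimension bandwidths:

# Hypothetical helper, assuming the standard multivariate rules of thumb:
bandwidth_rule <- function(X, rule = c("Silverman", "Scott")) {
  rule <- match.arg(rule)
  n <- nrow(X)
  d <- ncol(X)
  s <- apply(X, 2, sd)               # per-dimension sample SD
  switch(rule,
         Silverman = s * (4 / (d + 2))^(1 / (d + 4)) * n^(-1 / (d + 4)),
         Scott     = s * n^(-1 / (d + 4)))
}

Note that for d = 2 the two rules coincide, since (4/(d+2))^(1/(d+4)) = 1.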
Value
A scalar, the estimated Kullback-Leibler divergence \hat D_{KL}(P||Q).
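For reference, the quantity being estimated is the usual density-based form
of the divergence,

    D_{KL}(P||Q) = \int \int p(x,y) \log( p(x,y) / q(x,y) ) \, dx \, dy,

where p and q denote the densities of P and Q; the estimator plugs in the
kernel density estimates for p and q and evaluates the integral numerically
(trapezoidal rule) or by Monte Carlo, depending on MC.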
Examples
# KL-D between two samples from 2-D Gaussians:
set.seed(0)
X1 <- rnorm(1000)
X2 <- rnorm(1000)
Y1 <- rnorm(1000)
Y2 <- Y1 + rnorm(1000)
X <- cbind(X1, X2)
Y <- cbind(Y1, Y2)
# Analytical KL divergence between the two Gaussians, for comparison:
kld_gaussian(mu1 = rep(0, 2), sigma1 = diag(2),
             mu2 = rep(0, 2), sigma2 = matrix(c(1, 1, 1, 2), nrow = 2))
# Kernel density-based estimate from the samples:
kld_est_kde2(X, Y)
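# The MC argument (see Arguments above) switches from trapezoidal
# integration to a Monte Carlo approximation; a sketch of its use:
kld_est_kde2(X, Y, MC = TRUE)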