kld_ci_bootstrap {kldest}    R Documentation
Uncertainty of KL divergence estimate using Efron's bootstrap.
Description
This function computes a confidence interval for KL divergence based on Efron's bootstrap. The approach only works for kernel density-based estimators, since nearest neighbour-based estimators cannot deal with the ties produced when sampling with replacement.
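As a rough illustration of the resampling step (a minimal sketch assuming 1-D samples stored as numeric vectors; this is not the package's internal code), one bootstrap replicate draws observations with replacement from each sample, which inevitably produces tied values:

# Hypothetical helper, for illustration only: one bootstrap replicate.
# Sampling observations with replacement creates duplicated points (ties),
# which kernel density-based estimators tolerate but nearest neighbour-based
# estimators do not.
boot_kld_once <- function(X, Y, estimator = kld_est_kde1) {
  Xb <- X[sample(length(X), replace = TRUE)]
  Yb <- Y[sample(length(Y), replace = TRUE)]
  estimator(Xb, Yb)
}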
Usage
kld_ci_bootstrap(
X,
Y,
estimator = kld_est_kde1,
B = 500L,
alpha = 0.05,
method = c("quantile", "se"),
include.boot = FALSE,
...
)
Arguments
X, Y
Samples from the two distributions being compared (numeric vectors for the default 1-D estimator).

estimator
A function expecting two inputs X and Y, used to estimate the KL divergence. Defaults to kld_est_kde1, a 1-D kernel density-based estimator.

B
Number of bootstrap replicates (default: 500L).

alpha
Error level; the confidence interval has nominal coverage 1 - alpha (default: 0.05).

method
Either "quantile" (the default) or "se", selecting how the confidence interval is built from the bootstrap replicates (a sketch contrasting the two options follows this list).

include.boot
Boolean; if TRUE, the bootstrap replicates of the KL divergence estimate are included in the output (default: FALSE).

...
Arguments passed on to estimator.
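The two choices of method correspond to different ways of turning the B bootstrap replicates into a confidence interval. A minimal sketch (hypothetical helper, not the package's internal code; the package's exact construction may differ):

# Illustration only: two common ways to form a (1 - alpha) interval from
# bootstrap replicates 'boot' and the full-sample estimate 'est'.
ci_from_boot <- function(est, boot, alpha = 0.05, method = c("quantile", "se")) {
  method <- match.arg(method)
  if (method == "quantile") {
    # percentile-type interval: quantiles of the bootstrap distribution
    unname(quantile(boot, probs = c(alpha/2, 1 - alpha/2)))
  } else {
    # normal approximation using the bootstrap standard error
    est + qnorm(c(alpha/2, 1 - alpha/2)) * sd(boot)
  }
}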
Details
Reference:
Efron, "Bootstrap Methods: Another Look at the Jackknife", The Annals of Statistics, Vol. 7, No. 1 (1979).
Value
A list with the following fields:
- "est": the estimated KL divergence,
- "boot": a length B numeric vector with KL divergence estimates on the bootstrap subsamples, only included if include.boot = TRUE,
- "ci": a length 2 vector containing the lower and upper limits of the estimated confidence interval.
Examples
# 1D Gaussian, two samples
set.seed(0)
X <- rnorm(100)
Y <- rnorm(100, mean = 1, sd = 2)
kld_gaussian(mu1 = 0, sigma1 = 1, mu2 = 1, sigma2 = 2^2)  # analytical KL divergence
kld_est_kde1(X, Y)                                        # kernel density-based point estimate
kld_ci_bootstrap(X, Y)                                    # bootstrap confidence interval
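A further usage sketch (not part of the package's examples), keeping the bootstrap replicates and requesting the normal-approximation interval:

res <- kld_ci_bootstrap(X, Y, B = 200L, method = "se", include.boot = TRUE)
res$est   # point estimate of the KL divergence
res$ci    # lower and upper confidence limits
hist(res$boot, main = "Bootstrap KL divergence estimates")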