kldiv {bayesmeta}    R Documentation
Kullback-Leibler divergence of two multivariate normal distributions.
Description
Compute the Kullback-Leibler divergence or symmetrized KL-divergence based on the means and covariances of two normal distributions.
Usage
kldiv(mu1, mu2, sigma1, sigma2, symmetrized=FALSE)
Arguments
mu1, mu2: the two mean vectors.
sigma1, sigma2: the two covariance matrices.
symmetrized: logical; if TRUE, the symmetrized divergence is returned instead of the (asymmetric) Kullback-Leibler divergence.
Details
The Kullback-Leibler divergence (or relative entropy) of two probability distributions $p$ and $q$ is defined as the integral

  $D_{\mathrm{KL}}(p\,\|\,q) \;=\; \int \log\!\Bigl(\frac{p(\theta)}{q(\theta)}\Bigr)\, p(\theta)\, \mathrm{d}\theta.$

In the case of two normal distributions with mean and covariance parameters given by ($\mu_1$, $\Sigma_1$) and ($\mu_2$, $\Sigma_2$), respectively, this results as

  $D_{\mathrm{KL}}\bigl(N(\mu_1,\Sigma_1)\,\|\,N(\mu_2,\Sigma_2)\bigr) \;=\; \frac{1}{2}\Bigl(\mathrm{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr) \,+\, (\mu_1-\mu_2)^\top \Sigma_2^{-1} (\mu_1-\mu_2) \,-\, k \,+\, \log\Bigl(\frac{\det(\Sigma_2)}{\det(\Sigma_1)}\Bigr)\Bigr),$

where $k$ is the dimension. The symmetrized divergence simply results as the sum of both directed divergences,

  $D_{\mathrm{s}}(p, q) \;=\; D_{\mathrm{KL}}(p\,\|\,q) \,+\, D_{\mathrm{KL}}(q\,\|\,p).$
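For illustration, the closed-form expression above may also be coded directly. The following is a minimal sketch (the helper name kl.normal is hypothetical), not the package's internal implementation:

  ## sketch of the closed-form KL divergence of two normals;
  ## 'kl.normal' is an illustrative name, not part of the package:
  kl.normal <- function(m1, m2, S1, S2) {
    k     <- length(m1)             # dimension
    d     <- m2 - m1                # difference of the two mean vectors
    S2inv <- solve(S2)              # inverse of the 2nd covariance matrix
    drop(0.5 * (sum(diag(S2inv %*% S1))      # trace term
                + t(d) %*% S2inv %*% d       # quadratic (Mahalanobis) term
                - k
                + log(det(S2) / det(S1))))   # log-determinant term
  }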
Value
The divergence ($D_{\mathrm{KL}} \geq 0$ or $D_{\mathrm{s}} \geq 0$).
Author(s)
Christian Roever christian.roever@med.uni-goettingen.de
References
S. Kullback. Information theory and statistics. John Wiley and Sons, New York, 1959.
C. Roever, T. Friede. Discrete approximation of a mixture distribution via restricted divergence. Journal of Computational and Graphical Statistics, 26(1):217-222, 2017. doi:10.1080/10618600.2016.1276840.
See Also
bmr.
Examples
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)))
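
# the same comparison, but returning the symmetrized divergence
# (the sum of both directed divergences):
kldiv(mu1=c(0,0), mu2=c(1,1), sigma1=diag(c(2,2)), sigma2=diag(c(3,3)),
      symmetrized=TRUE)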