kld {NetworkToolbox}   R Documentation
Kullback-Leibler Divergence
Description
Estimates the Kullback-Leibler divergence, which measures how much one probability distribution diverges from a reference distribution (equivalent means are assumed). Both inputs must be positive definite inverse covariance (precision) matrices for an accurate measurement. This is a relative metric, so values are only meaningful when comparing models fit to the same data.
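The help page does not spell out the exact estimator, but for two k-dimensional Gaussians with equal means, a standard closed form of the divergence in terms of the precision (inverse covariance) matrices, taking base as the reference distribution (an assumption about the argument roles), is:

D_{\mathrm{KL}}\left(p_{\mathrm{base}} \,\|\, p_{\mathrm{test}}\right) = \frac{1}{2}\left[\operatorname{tr}\left(\Theta_{\mathrm{test}}\,\Theta_{\mathrm{base}}^{-1}\right) - k + \ln\frac{\det\Theta_{\mathrm{base}}}{\det\Theta_{\mathrm{test}}}\right]

where \Theta_{\mathrm{base}} and \Theta_{\mathrm{test}} denote the precision matrices of the base and test models.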
Usage
kld(base, test)
Arguments
base: Full or base model
test: Reduced or testing model
Value
A value greater than or equal to 0. Smaller values suggest that the probability distribution of the reduced (test) model is closer to that of the full (base) model.
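To make this interpretation concrete, here is a minimal sketch of the equal-means Gaussian form shown above, using base R. This is an illustrative assumption, not NetworkToolbox's internal implementation, and kld_sketch is a hypothetical helper:

# Minimal sketch of the equal-means Gaussian KL divergence in terms of
# precision (inverse covariance) matrices; illustrative only, not the
# package's internal code.
kld_sketch <- function(base, test) {
  k <- ncol(base)                # dimension of the distributions
  ratio <- test %*% solve(base)  # Theta_test %*% Theta_base^{-1}
  0.5 * (sum(diag(ratio)) - k + log(det(base) / det(test)))
}

Theta1 <- diag(2)
Theta2 <- matrix(c(1, 0.2, 0.2, 1), 2, 2)
kld_sketch(Theta1, Theta1)  # identical models: exactly 0
kld_sketch(Theta1, Theta2)  # similar models: small positive value (about 0.02)

Identical inputs return 0, and values grow as the test model's distribution drifts from the base model's, matching the interpretation above.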
Author(s)
Alexander Christensen <alexpaulchristensen@gmail.com>
References
Kullback, S., & Leibler, R. A. (1951). On information and sufficiency. The Annals of Mathematical Statistics, 22, 79-86.
Examples
# Inverse covariance (precision) matrix of the full data
A1 <- solve(cov(neoOpen))

## Not run:
# Sparse precision matrix estimated with LoGo
A2 <- LoGo(neoOpen)
# KL divergence between the base (A1) and reduced (A2) models
kld_value <- kld(A1, A2)

## End(Not run)