ComputeKLDs {BayesNetBP}  R Documentation 
Compute signed and symmetric Kullback-Leibler divergence
Description
Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence
Usage
ComputeKLDs(
tree,
var0,
vars,
seq,
pbar = TRUE,
method = "gaussian",
epsilon = 10^-6
)
Arguments
tree 
a ClusterTree object 
var0 
the variable to have evidence absorbed 
vars 
the variables to have divergence computed 
seq 
a vector of evidence values to be absorbed for var0 
pbar 
logical; whether to display a progress bar 
method 
method for divergence computation; defaults to "gaussian" 
epsilon 
a small positive tolerance used in the divergence computation (default 10^-6) 
Details
Compute the signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence. The signed and symmetric Kullback-Leibler divergence is also known as Jeffrey's signed information (JSI) for continuous variables.
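For univariate Gaussian distributions before and after evidence absorption, the symmetric KL divergence has a closed form, and the sign can be taken from the direction of the mean shift. The following is a minimal illustrative sketch of this quantity; the function names kl_gauss and signed_sym_kld are hypothetical helpers for illustration, not part of the BayesNetBP API:

```r
# Closed-form KL divergence KL(N(m0, s0^2) || N(m1, s1^2))
# between two univariate Gaussians (illustrative sketch)
kl_gauss <- function(m0, s0, m1, s1) {
  log(s1 / s0) + (s0^2 + (m0 - m1)^2) / (2 * s1^2) - 0.5
}

# Symmetrized KL divergence, signed by the direction of the
# mean shift (positive when the mean increases after evidence)
signed_sym_kld <- function(m0, s0, m1, s1) {
  sign(m1 - m0) * (kl_gauss(m0, s0, m1, s1) + kl_gauss(m1, s1, m0, s0))
}

signed_sym_kld(0, 1, 1, 1)   # 1: mean shifted up by one SD
signed_sym_kld(0, 1, -1, 1)  # -1: same magnitude, opposite sign
```

The sign makes the divergence informative about the direction of the effect of evidence, which is what distinguishes JSI from an ordinary (nonnegative) symmetric KL divergence.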
Value
a data.frame of the computed divergences
Author(s)
Han Yu
References
Cowell, R. G. (2005). Local propagation in conditional Gaussian Bayesian networks.
Journal of Machine Learning Research, 6(Sep), 1517-1550.
Yu H, Moharil J, Blair RH (2020). BayesNetBP: An R Package for Probabilistic Reasoning in Bayesian
Networks. Journal of Statistical Software, 94(3), 1-31. <doi:10.18637/jss.v094.i03>.
Examples
## Not run:
data(liver)
tree.init.p <- Initializer(dag=liver$dag, data=liver$data,
                           node.class=liver$node.class,
                           propagate = TRUE)
klds <- ComputeKLDs(tree=tree.init.p, var0="Nr1i3",
                    vars=setdiff(tree.init.p@node, "Nr1i3"),
                    seq=seq(-3, 3, 0.5))
head(klds)
## End(Not run)