ComputeKLDs {BayesNetBP}        R Documentation

Compute signed and symmetric Kullback-Leibler divergence

Description

Compute signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence

Usage

ComputeKLDs(
  tree,
  var0,
  vars,
  seq,
  pbar = TRUE,
  method = "gaussian",
  epsilon = 10^-6
)

Arguments

tree

a ClusterTree object

var0

the variable on which evidence is absorbed

vars

the variables for which the divergence is computed

seq

a numeric vector of evidence values to be absorbed

pbar

logical(1), whether to show a progress bar

method

method for divergence computation: gaussian for Gaussian approximation, montecarlo for Monte Carlo integration

epsilon

numeric(1). The KL divergence is undefined if some states of a discrete variable have probability 0. In that case, each such state is assigned the small positive probability epsilon, and the probabilities of the remaining states are shrunk proportionally so that they still sum to 1 (see the sketch below).
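
For illustration, a minimal sketch of this epsilon adjustment (not the package's internal implementation; the function name adjust_probs is hypothetical):

adjust_probs <- function(p, epsilon = 10^-6) {
  # Assign epsilon to zero-probability states, then rescale the
  # remaining states proportionally so the probabilities sum to 1.
  zero <- p == 0
  if (!any(zero)) return(p)
  p[zero] <- epsilon
  p[!zero] <- p[!zero] * (1 - sum(p[zero])) / sum(p[!zero])
  p
}

adjust_probs(c(0.7, 0.3, 0))  # 0.6999993 0.2999997 0.0000010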

Details

Computes the signed and symmetric Kullback-Leibler divergence of variables over a spectrum of evidence. For continuous variables, the signed and symmetric Kullback-Leibler divergence is also known as Jeffreys' signed information (JSI).
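
As a sketch of the quantity involved: the symmetric Kullback-Leibler divergence between two univariate Gaussians has a closed form, and a sign can be attached according to the shift in mean after absorbing evidence. The function below is an illustration under these assumptions, not the package's internal implementation:

signed_sym_kld <- function(mu0, sd0, mu1, sd1) {
  # Symmetric KL divergence KL(P||Q) + KL(Q||P) between
  # P = N(mu0, sd0^2) and Q = N(mu1, sd1^2); the log-variance
  # terms of the two directed divergences cancel.
  sym <- (sd0^2 + (mu0 - mu1)^2) / (2 * sd1^2) +
         (sd1^2 + (mu1 - mu0)^2) / (2 * sd0^2) - 1
  # Sign taken from the direction of the mean shift (assumption;
  # see Yu, Moharil and Blair 2020 for the exact definition of JSI).
  sign(mu1 - mu0) * sym
}

signed_sym_kld(0, 1, 0.5, 1.2)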

Value

a data.frame of the computed divergences

Author(s)

Han Yu

References

Cowell, R. G. (2005). Local propagation in conditional Gaussian Bayesian networks. Journal of Machine Learning Research, 6(Sep), 1517-1550.

Yu H, Moharil J, Blair RH (2020). BayesNetBP: An R Package for Probabilistic Reasoning in Bayesian Networks. Journal of Statistical Software, 94(3), 1-31. <doi:10.18637/jss.v094.i03>.

Examples

## Not run: 
data(liver)
tree.init.p <- Initializer(dag=liver$dag, data=liver$data,
                           node.class=liver$node.class,
                           propagate = TRUE)
klds <- ComputeKLDs(tree=tree.init.p, var0="Nr1i3",
                    vars=setdiff(tree.init.p@node, "Nr1i3"),
                    seq=seq(-3,3,0.5))
head(klds)
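
# A possible follow-up, assuming the returned data.frame has a
# column 'x' of evidence values and one column per variable in
# 'vars' (column selection here is illustrative):
plot(klds$x, klds[, 2], type = "l",
     xlab = "Evidence absorbed on Nr1i3",
     ylab = "Signed symmetric KLD")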

## End(Not run)
