KL {bnlearn}                                                R Documentation

Compute the distance between two fitted Bayesian networks

Description

Compute the Kullback-Leibler divergence between two fitted Bayesian networks.
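For reference, the standard definition of the Kullback-Leibler divergence between two discrete distributions P and Q (a textbook definition, reproduced here for convenience rather than taken from this man page) is

$$ \mathrm{KL}(P \parallel Q) = \sum_{x} P(x) \log \frac{P(x)}{Q(x)}, $$

where the sum runs over all configurations x of the variables in the network. The divergence is zero when P and Q coincide, and it is not symmetric in its arguments.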

Usage

KL(P, Q)

Arguments

P, Q

two objects of class bn.fit.

Value

KL() returns a numeric value.

Note

KL() only supports discrete (bn.fit.dnet) and Gaussian (bn.fit.gnet) networks. Note that in the case of Gaussian networks the divergence can be negative. Regardless of the network type, if at least one of the two networks is singular the divergence can be +Inf.
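As background, the closed form of the divergence between two nondegenerate multivariate normal distributions (a standard result, not specific to bnlearn) is

$$ \mathrm{KL}(P \parallel Q) = \frac{1}{2} \left[ \operatorname{tr}\!\left(\Sigma_Q^{-1} \Sigma_P\right) + (\mu_Q - \mu_P)^\top \Sigma_Q^{-1} (\mu_Q - \mu_P) - d + \ln \frac{\det \Sigma_Q}{\det \Sigma_P} \right], $$

where d is the number of variables. A singular network has a covariance matrix with a zero determinant, so the log-determinant term (or the inverse of Sigma_Q) blows up, which is consistent with the +Inf divergence mentioned above.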

If any of the parameters of the two networks are NA, the divergence is NA as well.

Author(s)

Marco Scutari

Examples

# discrete networks
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
fitted1 = bn.fit(dag, learning.test, method = "mle")
fitted2 = bn.fit(dag, learning.test, method = "bayes", iss = 20)

# the divergence of a network from itself is zero.
KL(fitted1, fitted1)
KL(fitted2, fitted2)
# the divergence between different parameter estimates is positive.
KL(fitted1, fitted2)
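# KL() is not symmetric in its arguments, so reversing them
# generally yields a different value.
KL(fitted2, fitted1)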

# continuous, singular networks.
dag = model2network("[A][B][E][G][C|A:B][D|B][F|A:D:E:G]")
singular = fitted1 = bn.fit(dag, gaussian.test)
# a zero residual standard deviation makes the distribution of A singular.
singular$A = list(coef = coef(fitted1[["A"]]) + runif(1), sd = 0)

# a singular network can make the divergence infinite in either direction.
KL(singular, fitted1)
KL(fitted1, singular)
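# a sketch of the NA behaviour described in the Note section; it assumes
# that maximum likelihood estimation on very few observations leaves the
# conditional probabilities of unobserved parent configurations as NaN.
dag = model2network("[A][C][F][B|A][D|A:C][E|B:F]")
sparse = bn.fit(dag, learning.test[1:10, ], method = "mle")
full = bn.fit(dag, learning.test, method = "mle")
KL(sparse, full)  # NA when any parameter estimate is missing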

[Package bnlearn version 4.9.3 Index]