KL.GBN {bnmonitor} | R Documentation

## KL Divergence for `GBN`

### Description

`KL.GBN` returns the Kullback-Leibler (KL) divergence between an object of class `GBN` and its update after a standard parameter variation.

### Usage

```
## S3 method for class 'GBN'
KL(x, where, entry, delta, ...)
```

### Arguments

`x` | an object of class `GBN`.

`where` | character string: either `"mean"` or `"covariance"`, indicating whether the mean vector or the covariance matrix is varied.

`entry` | if `where == "mean"`, the index of the entry of the mean vector to vary; if `where == "covariance"`, a vector of length two giving the entry of the covariance matrix to vary (e.g. `c(3,3)`, as in the examples below).

`delta` | numeric vector including the variation parameters that act additively.

`...` | additional arguments for compatibility.

### Details

Computation of the KL divergence between a Gaussian Bayesian network and its additively perturbed version, where the perturbation applies either to the mean vector or to the covariance matrix.
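For two multivariate Gaussians this divergence has a standard closed form (not stated explicitly in this help page): writing N(μ₀, Σ₀) for the joint distribution of the original network and N(μ₁, Σ₁) for the perturbed one, with d the dimension,

```
KL(N_0 \parallel N_1) = \frac{1}{2}\Big[\operatorname{tr}\!\big(\Sigma_1^{-1}\Sigma_0\big) + (\mu_1-\mu_0)^\top \Sigma_1^{-1} (\mu_1-\mu_0) - d + \ln\frac{\det\Sigma_1}{\det\Sigma_0}\Big]
```

For an additive variation of a single mean entry, only the quadratic term changes; for a covariance variation, the trace and log-determinant terms change as well.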

### Value

A data frame whose first column reports the variations performed and whose second column reports the corresponding KL divergences.

### References

Gómez-Villegas, M. A., Maín, P., & Susi, R. (2007). Sensitivity analysis in Gaussian Bayesian networks using a divergence measure. Communications in Statistics—Theory and Methods, 36(3), 523-539.

Gómez-Villegas, M. A., Main, P., & Susi, R. (2013). The effect of block parameter perturbations in Gaussian Bayesian networks: Sensitivity and robustness. Information Sciences, 222, 439-458.

### See Also

`KL.CI`, `Fro.CI`, `Fro.GBN`, `Jeffreys.GBN`, `Jeffreys.CI`

### Examples

```
## Additively vary the second entry of the mean vector
KL(synthetic_gbn, "mean", 2, seq(-1, 1, 0.1))

## Additively vary the (3, 3) entry of the covariance matrix
KL(synthetic_gbn, "covariance", c(3, 3), seq(-1, 1, 0.1))
```
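For intuition, the divergence can also be computed directly from the closed form for two multivariate Gaussians. The sketch below is package-independent: `kl_gauss` is a hypothetical helper written for illustration, not part of *bnmonitor*.

```r
## Sketch (assumption: not a bnmonitor function): KL divergence between
## N(mu0, Sigma0) and N(mu1, Sigma1) via the standard closed form.
kl_gauss <- function(mu0, Sigma0, mu1, Sigma1) {
  d    <- length(mu0)
  inv1 <- solve(Sigma1)                       # Sigma_1^{-1}
  diff <- mu1 - mu0
  as.numeric(0.5 * (sum(diag(inv1 %*% Sigma0)) +   # trace term
                    t(diff) %*% inv1 %*% diff -    # quadratic term
                    d +
                    log(det(Sigma1) / det(Sigma0))))
}

## Shifting one mean entry of a standard bivariate Gaussian by 1
## gives KL = 0.5 * 1^2 = 0.5.
kl_gauss(c(0, 0), diag(2), c(1, 0), diag(2))
```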

*bnmonitor* version 0.1.4