kldcauchy {mcauchyd}	R Documentation

Kullback-Leibler Divergence between Centered Multivariate Cauchy Distributions

Description

Computes the Kullback-Leibler divergence between two random vectors distributed according to multivariate Cauchy distributions (MCD) with zero location vector.

Usage

kldcauchy(Sigma1, Sigma2, eps = 1e-06)

Arguments

Sigma1

symmetric, positive-definite matrix. The scatter matrix of the first distribution.

Sigma2

symmetric, positive-definite matrix. The scatter matrix of the second distribution.

eps

numeric. Precision for the computation of the partial derivative of the Lauricella $D$-hypergeometric function (see Details). Default: 1e-06.

Details

Let $X_1$ be a random vector of $\mathbb{R}^p$ distributed according to the MCD with parameters $(0, \Sigma_1)$, and $X_2$ a random vector of $\mathbb{R}^p$ distributed according to the MCD with parameters $(0, \Sigma_2)$.

Let $\lambda_1, \dots, \lambda_p$ be the eigenvalues of the square matrix $\Sigma_1 \Sigma_2^{-1}$, sorted in increasing order:

$$\lambda_1 < \dots < \lambda_{p-1} < \lambda_p$$
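For instance, these eigenvalues can be computed directly in R (here with the scatter matrices used in the first example below):

Sigma1 <- matrix(c(1, 0.6, 0.2, 0.6, 1, 0.3, 0.2, 0.3, 1), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
# Eigenvalues of Sigma1 %*% solve(Sigma2), sorted in increasing order
sort(eigen(Sigma1 %*% solve(Sigma2), only.values = TRUE)$values)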

Depending on the values of these eigenvalues, the Kullback-Leibler divergence of $X_1$ from $X_2$ is expressed in terms of the Lauricella $D$-hypergeometric function $F_D^{(p)}$ and of the partial derivative of this function with respect to its first parameter; the explicit formula for each case is given in the reference below.

The Lauricella $D$-hypergeometric function is defined for $p$ variables by:

$$F_D^{(p)}\left(a; b_1, \dots, b_p; g; x_1, \dots, x_p\right) = \sum_{m_1 \geq 0} \cdots \sum_{m_p \geq 0} \frac{(a)_{m_1+\dots+m_p} (b_1)_{m_1} \cdots (b_p)_{m_p}}{(g)_{m_1+\dots+m_p}} \, \frac{x_1^{m_1}}{m_1!} \cdots \frac{x_p^{m_p}}{m_p!}$$

where $(a)_k = a (a+1) \cdots (a+k-1)$ denotes the Pochhammer symbol.
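As an illustration, this series can be evaluated by brute-force truncation in a few lines of R. This is a minimal sketch, valid for $|x_j| < 1$, and is not the package's internal routine, which evaluates the series and its partial derivative to the precision controlled by eps:

lauricella_fd <- function(a, b, g, x, kmax = 20) {
  # Brute-force truncation of the Lauricella F_D^{(p)} series:
  # sums over all multi-indices (m_1, ..., m_p) with 0 <= m_j <= kmax.
  # Converges for |x_j| < 1; cost grows as (kmax + 1)^p.
  stopifnot(length(b) == length(x))
  p <- length(x)
  poch <- function(q, n) if (n == 0) 1 else prod(q + seq_len(n) - 1)  # (q)_n
  grid <- as.matrix(expand.grid(rep(list(0:kmax), p)))
  total <- 0
  for (i in seq_len(nrow(grid))) {
    m <- grid[i, ]
    s <- sum(m)
    term <- poch(a, s) / poch(g, s)
    for (j in seq_len(p)) {
      term <- term * poch(b[j], m[j]) * x[j]^m[j] / factorial(m[j])
    }
    total <- total + term
  }
  total
}

# Sanity check: for p = 1 and a = g, the series reduces to the binomial
# series, so F_D^{(1)}(a; b; a; x) = (1 - x)^(-b).
lauricella_fd(2, 0.5, 2, 0.3)  # should be close to (1 - 0.3)^(-0.5)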

Value

A numeric value: the Kullback-Leibler divergence between the two distributions, with two attributes: attr(, "epsilon") (precision of the partial derivative of the Lauricella $D$-hypergeometric function, see Details) and attr(, "k") (number of iterations).
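For example, the returned value and its attributes can be accessed as follows (with Sigma1 and Sigma2 as in the Examples section):

kld <- kldcauchy(Sigma1, Sigma2)
as.numeric(kld)       # the divergence itself, without attributes
attr(kld, "epsilon")  # precision of the partial derivative computation
attr(kld, "k")        # number of iterations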

Author(s)

Pierre Santagostini, Nizar Bouhlel

References

N. Bouhlel, D. Rousseau, A Generic Formula and Some Special Cases for the Kullback–Leibler Divergence between Central Multivariate Cauchy Distributions. Entropy, 24, 838, July 2022. doi:10.3390/e24060838

Examples


Sigma1 <- matrix(c(1, 0.6, 0.2, 0.6, 1, 0.3, 0.2, 0.3, 1), nrow = 3)
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)
kldcauchy(Sigma1, Sigma2)
kldcauchy(Sigma2, Sigma1)

Sigma1 <- matrix(c(0.5, 0, 0, 0, 0.4, 0, 0, 0, 0.3), nrow = 3)
Sigma2 <- diag(1, 3)
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are < 1
kldcauchy(Sigma1, Sigma2)
# Case when all eigenvalues of Sigma1 %*% solve(Sigma2) are > 1
kldcauchy(Sigma2, Sigma1)
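
# The precision of the underlying computation can be controlled through
# the eps argument; the "epsilon" and "k" attributes of the result can
# then be inspected (see Value)
kld <- kldcauchy(Sigma1, Sigma2, eps = 1e-08)
attr(kld, "epsilon")
attr(kld, "k")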


