Kullback-Leibler Divergence between Centered Multivariate t Distributions
Description
Computes the Kullback-Leibler divergence between two random vectors distributed
according to multivariate t distributions (MTD) with zero location vector.
Usage
kldstudent(nu1, Sigma1, nu2, Sigma2, eps = 1e-06)
Arguments
nu1
numeric. The degrees of freedom of the first distribution.
Sigma1
symmetric, positive-definite matrix. The scatter matrix of the first distribution.
nu2
numeric. The degrees of freedom of the second distribution.
Sigma2
symmetric, positive-definite matrix. The scatter matrix of the second distribution.
eps
numeric. Precision for the computation of the partial derivative of the Lauricella D-hypergeometric function (see Details). Default: 1e-06.
Details
Let X1 be a random vector of R^p distributed according to the centered MTD
with parameters (ν1, 0, Σ1),
and X2 a random vector of R^p distributed according to the centered MTD
with parameters (ν2, 0, Σ2).
Let λ1, …, λp be the eigenvalues of the square matrix Σ1 Σ2^(-1),
sorted in increasing order:
λ1 ≤ ⋯ ≤ λp−1 ≤ λp
The Kullback-Leibler divergence of X1 from X2 has a closed-form expression in terms of these eigenvalues, involving a partial derivative of the Lauricella D-hypergeometric function; see Bouhlel and Rousseau (2023) in References for the exact formula. This partial derivative is evaluated numerically (see the eps argument), which yields the "epsilon" and "k" attributes of the returned value.
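As a small illustration of the quantities above, the eigenvalues λ1, …, λp can be computed in R as sketched below; the scatter matrices are arbitrary, hypothetical values chosen only for illustration.

## Hypothetical 2 x 2 scatter matrices (symmetric, positive-definite)
Sigma1 <- matrix(c(2, 0.5, 0.5, 1), nrow = 2)
Sigma2 <- diag(2)
## Eigenvalues of Sigma1 %*% solve(Sigma2), sorted in increasing order.
## Re() drops the negligible imaginary parts that eigen() may return,
## since the product is not symmetric even though its eigenvalues are real.
lambda <- sort(Re(eigen(Sigma1 %*% solve(Sigma2))$values))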
Value
A numeric value: the Kullback-Leibler divergence between the two distributions,
with two attributes: attr(, "epsilon") (precision of the computation of the partial derivative of the Lauricella D-hypergeometric function; see Details)
and attr(, "k") (number of iterations).
Author(s)
Pierre Santagostini, Nizar Bouhlel
References
N. Bouhlel and D. Rousseau (2023), Exact Rényi and Kullback-Leibler Divergences Between Multivariate t-Distributions, IEEE Signal Processing Letters.
doi:10.1109/LSP.2023.3324594
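Examples
A minimal usage sketch, following the signature shown in Usage; the degrees of freedom and scatter matrices below are arbitrary illustrative values, not taken from the reference.

nu1 <- 2
Sigma1 <- matrix(c(2, 1.2, 0.4, 1.2, 2, 0.6, 0.4, 0.6, 2), nrow = 3)
nu2 <- 4
Sigma2 <- matrix(c(1, 0.3, 0.1, 0.3, 1, 0.4, 0.1, 0.4, 1), nrow = 3)

## Kullback-Leibler divergence of X1 from X2
kl <- kldstudent(nu1, Sigma1, nu2, Sigma2)
kl
## Attributes returned along with the divergence (see Value)
attr(kl, "epsilon")
attr(kl, "k")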