Distance/Divergence between Centered Multivariate t Distributions
Description
Computes the distance or divergence (Rényi divergence, Bhattacharyya
distance or Hellinger distance) between two random vectors distributed
according to multivariate $t$ distributions (MTD) with zero mean vector.
Arguments
nu1
numeric. The degrees of freedom of the first distribution.
Sigma1
symmetric, positive-definite matrix. The correlation matrix of the first distribution.
nu2
numeric. The degrees of freedom of the second distribution.
Sigma2
symmetric, positive-definite matrix. The correlation matrix of the second distribution.
dist
character. The distance or divergence used.
One of "renyi" (default), "bhattacharyya" or "hellinger".
bet
numeric, positive and not equal to 1. Order of the Renyi divergence.
Ignored if dist="bhattacharyya" or dist="hellinger".
eps
numeric. Precision for the computation of the partial derivative of the Lauricella D-hypergeometric function (see Details). Default: 1e-06.
Details
Given X1, a random vector of R^p distributed according to the MTD
with parameters (ν1, 0, Σ1),
and X2, a random vector of R^p distributed according to the MTD
with parameters (ν2, 0, Σ2),
let δ1 = (ν1 + p)/2 · β and δ2 = (ν2 + p)/2 · (1 − β),
and let λ1, …, λp be the eigenvalues of the square matrix Σ1 Σ2^{-1},
sorted in increasing order.
The Rényi divergence is then expressed in terms of a partial derivative
of the Lauricella D-hypergeometric function, computed to precision eps
(see References).
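As a concrete illustration of the intermediate quantities above, the sketch below computes δ1, δ2 and the sorted eigenvalues of Σ1 Σ2^{-1} for a 2×2 example in plain Python. This is not the package's implementation; the values of nu1, nu2, beta and the matrices are made up for illustration, and the δ formulas follow the reconstruction given in Details.

```python
def eigvals_2x2(m):
    """Eigenvalues of a 2x2 matrix via the characteristic polynomial."""
    (a, b), (c, d) = m
    tr, det = a + d, a * d - b * c
    disc = (tr * tr / 4 - det) ** 0.5
    return sorted([tr / 2 - disc, tr / 2 + disc])

def inv_2x2(m):
    """Inverse of a 2x2 matrix."""
    (a, b), (c, d) = m
    det = a * d - b * c
    return [[d / det, -b / det], [-c / det, a / det]]

def matmul_2x2(m1, m2):
    """Product of two 2x2 matrices."""
    return [[sum(m1[i][k] * m2[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

p = 2                      # dimension (illustrative)
nu1, nu2 = 3.0, 4.0        # degrees of freedom (illustrative)
beta = 0.25                # order of the Renyi divergence (> 0, != 1)
Sigma1 = [[1.0, 0.5], [0.5, 1.0]]
Sigma2 = [[1.0, 0.0], [0.0, 1.0]]

# delta1, delta2 as defined in the Details section
delta1 = (nu1 + p) / 2 * beta
delta2 = (nu2 + p) / 2 * (1 - beta)

# eigenvalues of Sigma1 Sigma2^{-1}, sorted in increasing order
lam = eigvals_2x2(matmul_2x2(Sigma1, inv_2x2(Sigma2)))

print(delta1, delta2)      # 0.625 2.25
print(lam)                 # [0.5, 1.5]
```

These quantities feed the Lauricella D-hypergeometric series whose partial derivative the function evaluates to precision eps.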
Value
A numeric value: the Rényi divergence between the two distributions
(or the Bhattacharyya or Hellinger distance, according to dist),
with two attributes: attr(, "epsilon") (the precision of the result of the Lauricella D-hypergeometric function, see Details)
and attr(, "k") (the number of iterations).
Author(s)
Pierre Santagostini, Nizar Bouhlel
References
N. Bouhlel and D. Rousseau (2023), Exact Rényi and Kullback-Leibler Divergences Between Multivariate t-Distributions, IEEE Signal Processing Letters.
doi:10.1109/LSP.2023.3324594