jeffreyspar {dad}    R Documentation
Jeffreys measure between Gaussian densities given their parameters
Description
Jeffreys measure (or symmetrised Kullback-Leibler divergence) between two multivariate (p > 1) or univariate (p = 1) Gaussian densities, given their parameters: mean vectors and covariance matrices if they are multivariate, means and variances if they are univariate (see Details).
Usage
jeffreyspar(mean1, var1, mean2, var2, check = FALSE)
Arguments
mean1
the mean vector (if p > 1) or the mean (if p = 1) of the first Gaussian density.
var1
the covariance matrix (if p > 1) or the variance (if p = 1) of the first Gaussian density.
mean2
the mean vector or mean of the second Gaussian density.
var2
the covariance matrix or variance of the second Gaussian density.
check
logical. When TRUE (the default is FALSE), the function checks that the covariance matrices are not degenerate (multivariate case) or that the variances are not zero (univariate case).
Details
Let m1 and m2 be the mean vectors and v1 and v2 the covariance matrices. The Jeffreys measure of the two Gaussian densities is equal to:

(1/2) t(m1 - m2) (v1^{-1} + v2^{-1}) (m1 - m2) - (1/2) tr( (v1 - v2) (v1^{-1} - v2^{-1}) )

If p = 1, the means and variances are numbers and the formula is the same, ignoring the operators t (transpose of a matrix or vector) and tr (trace of a square matrix).
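For illustration, the formula above can be transcribed directly in R. The helper name jeffreys_formula is hypothetical and not part of the dad package; this is only a sketch of the expression, not the package's internal implementation.

## Direct transcription of the formula above (illustrative sketch only).
jeffreys_formula <- function(m1, v1, m2, v2) {
  d   <- m1 - m2
  iv1 <- solve(v1)   # v1^{-1}
  iv2 <- solve(v2)   # v2^{-1}
  0.5 * as.numeric(t(d) %*% (iv1 + iv2) %*% d) -
    0.5 * sum(diag((v1 - v2) %*% (iv1 - iv2)))
}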
Value
Jeffreys measure between two Gaussian densities.
Be careful! If check = FALSE and one covariance matrix is degenerate (multivariate case) or one variance is zero (univariate case), the returned value is not valid and should not be used.
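As an illustrative sketch, a call involving a singular covariance matrix can be guarded with check = TRUE. How jeffreyspar reacts in that case (an error or a special return value) is assumed here, which is why the call is wrapped in tryCatch:

## v_singular has rank 1 (zero determinant), so it is degenerate.
v_singular <- matrix(c(1, 1, 1, 1), ncol = 2)
tryCatch(
  jeffreyspar(c(1, 1), matrix(c(4, 1, 1, 9), ncol = 2),
              c(0, 1), v_singular, check = TRUE),
  error = function(e) message("Degenerate covariance matrix: ", conditionMessage(e))
)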
Author(s)
Rachid Boumaza, Pierre Santagostini, Smail Yousfi, Gilles Hunault, Sabine Demotes-Mainard
References
McLachlan, G.J. (1992). Discriminant analysis and statistical pattern recognition. John Wiley & Sons, New York.
Thabane, L., Safiul Haq, M. (1999). On Bayesian selection of the best population using the Kullback-Leibler divergence measure. Statistica Neerlandica, 53(3): 342-360.
See Also
jeffreys: Jeffreys measure of two parametrically estimated Gaussian densities, given samples.
Examples
m1 <- c(1, 1)
v1 <- matrix(c(4, 1, 1, 9), ncol = 2)
m2 <- c(0, 1)
v2 <- matrix(c(1, 0, 0, 1), ncol = 2)
jeffreyspar(m1, v1, m2, v2)
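A univariate call (p = 1), with the means and variances given as plain numbers as stated in the Description:

## Univariate densities N(0, 1) and N(1, 4)
jeffreyspar(0, 1, 1, 4)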