normdiff {gaussDiff}    R Documentation
Difference measures for multivariate Gaussian pdfs
Description
Several difference measures for multivariate Gaussian pdfs are implemented: the Euclidean distance between the means, the Mahalanobis distance, the Kullback-Leibler divergence, the J-coefficient, the Minkowski L2-distance, the Chi-square divergence, and the Hellinger coefficient, which is a similarity rather than a difference measure.
Usage
normdiff(mu1, sigma1 = NULL, mu2, sigma2 = sigma1, inv = FALSE, s = 0.5,
         method = c("Mahalanobis", "KL", "J", "Chisq",
                    "Hellinger", "L2", "Euclidean"))
Arguments
mu1      mean value of pdf 1, a vector
sigma1   covariance matrix of pdf 1, default: NULL
mu2      mean value of pdf 2, a vector
sigma2   covariance matrix of pdf 2, default: sigma1
method   difference measure to be used, see Details
inv      if TRUE, 1 - Hellinger coefficient is reported, default: FALSE
s        exponent for the Hellinger coefficient, default: 0.5
Details
Equations for the implemented measures can be found in H.-H. Bock, Analysis of Symbolic Data, chapter "Dissimilarity Measures for Probability Distributions".
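As one concrete instance of these equations (stated here for orientation; the exact parameterisation used by normdiff, e.g. the base of the logarithm, is not documented here and should be checked against the package source), the Kullback-Leibler divergence between two d-dimensional Gaussian densities has the standard closed form

\[
D_{\mathrm{KL}}\bigl(N(\mu_1,\Sigma_1)\,\|\,N(\mu_2,\Sigma_2)\bigr)
  = \tfrac{1}{2}\Bigl[\operatorname{tr}\bigl(\Sigma_2^{-1}\Sigma_1\bigr)
  + (\mu_2-\mu_1)^{\top}\Sigma_2^{-1}(\mu_2-\mu_1)
  - d + \ln\frac{\det\Sigma_2}{\det\Sigma_1}\Bigr].
\]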
Value
A scalar object of class "normdiff" reporting the distance (or, for method = "Hellinger", the similarity).
Author(s)
Henning Rust, henning.rust@met.fu-berlin.de
References
H.-H. Bock, Analysis of Symbolic Data, chapter "Dissimilarity Measures for Probability Distributions".
Examples
library(gaussDiff)
mu1 <- c(0,0,0)
sig1 <- diag(c(1,1,1))
mu2 <- c(1,1,1)
sig2 <- diag(c(0.5,0.5,0.5))
## Euclidean distance
normdiff(mu1=mu1,mu2=mu2,method="Euclidean")
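## cross-check (not part of the original examples): the Euclidean distance
## between the means should equal sqrt(sum((mu1 - mu2)^2)) = sqrt(3),
## assuming the unsquared distance is reported
sqrt(sum((mu1 - mu2)^2))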
## Mahalanobis distance
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,method="Mahalanobis")
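## cross-check (not part of the original examples): stats::mahalanobis()
## returns the squared Mahalanobis distance; normdiff() may report the
## squared or the unsquared form, so compare against both
d2 <- mahalanobis(mu1, center = mu2, cov = sig1)
c(squared = d2, unsquared = sqrt(d2))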
## Kullback-Leibler divergence
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="KL")
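## cross-check (not part of the original examples): closed-form KL divergence
## between two Gaussians as sketched in Details; assumes natural logarithms
## and the usual factor 1/2, which may differ from the convention in normdiff()
d <- length(mu1)
delta <- mu2 - mu1
0.5 * (sum(diag(solve(sig2) %*% sig1)) + sum(delta * solve(sig2, delta)) -
       d + log(det(sig2) / det(sig1)))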
## J-Coefficient
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="J")
## Chi-square divergence
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Chisq")
## Minkowski L2 distance
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="L2")
## Hellinger coefficient
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Hellinger")
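## illustration of the inv and s arguments documented above (calls added for
## illustration, not taken from the original examples):
## report 1 - Hellinger coefficient as a dissimilarity
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Hellinger",inv=TRUE)
## Hellinger coefficient with a non-default exponent s (default s = 0.5)
normdiff(mu1=mu1,sigma1=sig1,mu2=mu2,sigma2=sig2,method="Hellinger",s=0.25)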