KL.z {EntropyEstimation}    R Documentation

KL.z

Description

Returns the Z estimator of Kullback-Leibler divergence, which has exponentially decaying bias. See Zhang and Grabchak (2014b) for details.

Usage

KL.z(x, y)

Arguments

x

Vector of counts from the first distribution. Must be integer valued. Each entry represents the number of observations of a distinct letter.

y

Vector of counts from the second distribution. Must be integer valued and the same length as x, with each entry counting observations of the same letter as the corresponding entry of x.

Author(s)

Lijuan Cao and Michael Grabchak

References

Z. Zhang and M. Grabchak (2014b). Nonparametric Estimation of Kullback-Leibler Divergence. Neural Computation, 26(11): 2570-2593.

Examples

 x = c(1,3,7,4,8) 
 y = c(2,5,1,3,6) 
 KL.z(x,y)  
 KL.z(y,x)  
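For comparison, a minimal sketch of the naive plug-in estimator of Kullback-Leibler divergence, which KL.z improves upon: it normalizes each count vector into an empirical distribution and evaluates the divergence formula directly. The function name kl_plugin is illustrative and not part of the EntropyEstimation package; unlike KL.z, the plug-in estimator has slowly decaying bias and is undefined when the second sample has a zero count where the first does not.

```r
# Naive plug-in KL divergence estimator (illustrative; not from the package).
kl_plugin <- function(x, y) {
  p <- x / sum(x)   # empirical distribution of the first sample
  q <- y / sum(y)   # empirical distribution of the second sample
  sum(p * log(p / q))
}

x <- c(1, 3, 7, 4, 8)
y <- c(2, 5, 1, 3, 6)
kl_plugin(x, y)   # plug-in estimate, for comparison with KL.z(x, y)
```

Like the true divergence, the plug-in estimate is nonnegative and equals zero when both samples have identical empirical distributions.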

[Package EntropyEstimation version 1.2 Index]