KL.dist {FNN}    R Documentation

Kullback-Leibler Divergence

Description

Compute the symmetric Kullback-Leibler distance between two data sets.

Usage

  KL.dist(X, Y, k = 10, algorithm=c("kd_tree", "cover_tree", "brute"))
  KLx.dist(X, Y, k = 10, algorithm="kd_tree")    

Arguments

X

An input data matrix.

Y

An input data matrix.

k

The maximum number of nearest neighbors to search. The default value is set to 10.

algorithm

Nearest neighbor search algorithm.

Details

The Kullback-Leibler distance is the symmetrized divergence: the sum of the divergence of q(x) from p(x) and the divergence of p(x) from q(x), i.e. KL(p || q) + KL(q || p).

The KL.* versions return distances from the C code to R, whereas the KLx.* versions do not.
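
A minimal sketch of this relationship, assuming KL.divergence (see the See Also section) provides the corresponding one-directional estimate; on simulated data, the symmetric distance should agree with the sum of the two directed estimates up to estimation noise:

    library(FNN)

    set.seed(1)
    p <- matrix(rnorm(2000), ncol = 2)            # sample drawn from p(x)
    q <- matrix(rnorm(2000, mean = 1), ncol = 2)  # sample drawn from q(x)

    # symmetric distance vs. sum of the two directed divergence estimates
    KL.dist(p, q, k = 5)
    KL.divergence(p, q, k = 5) + KL.divergence(q, p, k = 5)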

Value

Returns the Kullback-Leibler distance between X and Y.

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com

References

S. Boltz, E. Debreuve and M. Barlaud (2007). “kNN-based high-dimensional Kullback-Leibler distance for tracking”. In: Eighth International Workshop on Image Analysis for Multimedia Interactive Services (WIAMIS '07).

S. Boltz, E. Debreuve and M. Barlaud (2009). “High-dimensional statistical measure for region-of-interest tracking”. IEEE Transactions on Image Processing, 18(6), 1266–1283.

See Also

KL.divergence.

Examples

    set.seed(1000)
    X <- rexp(10000, rate = 0.2)
    Y <- rexp(10000, rate = 0.4)

    KL.dist(X, Y, k = 5)
    KLx.dist(X, Y, k = 5)
    # theoretical distance = (0.2-0.4)^2/(0.2*0.4) = 0.5
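    # Derivation of the theoretical value: for exponentials with rates a and b,
    # KL(Exp(a) || Exp(b)) = log(a/b) + b/a - 1, so the symmetric distance is
    # (log(a/b) + b/a - 1) + (log(b/a) + a/b - 1) = a/b + b/a - 2 = (a-b)^2/(a*b).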
    
