entropy {FNN}    R Documentation

Shannon Entropy

Description

K-nearest neighbor (KNN) estimators of Shannon entropy.

Usage

  entropy(X, k = 10, algorithm = c("kd_tree", "brute"))

Arguments

X

an input data matrix.

k

the maximum number of nearest neighbors to search. The default is 10.

algorithm

nearest neighbor search algorithm.
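
Details

The papers in the References describe Kozachenko-Leonenko-type nearest neighbor estimators. As a rough illustration only (a sketch, not necessarily the exact formula implemented by entropy(), which may differ in constants and in how multiple values of k are handled), one such estimate for a single k can be written in plain R using FNN::knn.dist:

  library(FNN)

  kl.entropy <- function(X, k)
  {
    n <- nrow(X)
    d <- ncol(X)
    ## distance from each point to its k-th nearest neighbor
    r.k <- knn.dist(X, k = k)[, k]
    ## log volume of the unit ball in d dimensions
    log.Vd <- (d / 2) * log(pi) - lgamma(d / 2 + 1)
    ## Kozachenko-Leonenko form: psi(n) - psi(k) + log V_d + (d/n) * sum(log r_i)
    digamma(n) - digamma(k) + log.Vd + d * mean(log(r.k))
  }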

Value

a vector of length k whose i-th element is the entropy estimate based on the i nearest neighbors, for i = 1, ..., k.

Author(s)

Shengqiao Li. To report any bugs or suggestions please email: lishengqiao@yahoo.com

References

H. Singh, N. Misra, V. Hnizdo, A. Fedorowicz and E. Demchuk (2003). “Nearest neighbor estimates of entropy”. American Journal of Mathematical and Management Sciences, 23, 301–321.

M.N. Goria, N.N. Leonenko, V.V. Mergel and P.L. Novi Inverardi (2005). “A new class of random vector entropy estimators and its applications in testing statistical hypotheses”. Journal of Nonparametric Statistics, 17:3, 277–297.

R.M. Mnatsakanov, N. Misra, S. Li and E.J. Harner (2008). “K_n-nearest neighbor estimators of entropy”. Mathematical Methods of Statistics, 17:3, 261–277.
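
Examples

A minimal usage sketch (the simulated Gaussian data and the closed-form comparison value are illustrative, not part of the package documentation):

  library(FNN)

  set.seed(1)
  ## 1000 draws from a bivariate standard normal distribution
  X <- matrix(rnorm(2000), ncol = 2)

  ## entropy estimates using 1, 2, ..., 10 nearest neighbors
  H <- entropy(X, k = 10, algorithm = "kd_tree")
  H

  ## the true differential entropy of a d-dimensional standard normal is
  ## (d/2) * log(2 * pi * e); for d = 2 this is about 2.838 nats
  log(2 * pi * exp(1))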

