H {philentropy} | R Documentation
Shannon's Entropy H(X)
Description
Computes Shannon's entropy H(X) = - \sum P(X) * log2(P(X)) for a given probability vector P(X).
Usage
H(x, unit = "log2")
Arguments
x |
a numeric probability vector |
unit |
a character string specifying the logarithm unit used to compute the entropy. Defaults to "log2", which returns the entropy in bits. |
Details
This function can be used to quickly compute Shannon's entropy for any given probability vector.
Value
a numeric value representing Shannon's entropy in bits (for the default unit = "log2").
Author(s)
Hajk-Georg Drost
References
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.
See Also
Examples
H(1:10/sum(1:10))
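The result of the example above can be verified directly from the definition in the Description, using only base R (a minimal sketch; the variable names `p` and `H_manual` are illustrative, not part of the package):

```r
# Shannon's entropy computed directly from the definition:
# H(X) = - sum(P(X) * log2(P(X)))
p <- 1:10 / sum(1:10)          # a valid probability vector (sums to 1)
H_manual <- -sum(p * log2(p))  # entropy in bits
H_manual
```

For a probability vector over 10 outcomes, the entropy is bounded above by log2(10) (the uniform distribution), so the value printed here lies strictly between 0 and log2(10).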
[Package philentropy version 0.8.0 Index]