JE {philentropy} | R Documentation
Shannon's Joint-Entropy H(X,Y)
Description
This function computes Shannon's Joint-Entropy H(X,Y) = - \sum \sum P(X,Y) * log2(P(X,Y)) based on a given joint-probability vector P(X,Y).
Usage
JE(x, unit = "log2")
Arguments
x	a numeric joint-probability vector.
unit	a character string specifying the logarithm unit used in the entropy computation (default: "log2").
Value
a numeric value representing Shannon's Joint-Entropy, in bits when unit = "log2".
Author(s)
Hajk-Georg Drost
References
Shannon, Claude E. 1948. "A Mathematical Theory of Communication". Bell System Technical Journal 27 (3): 379-423.
See Also
H
, CE
, KL
, JSD
, gJSD
, distance
Examples
JE(1:100/sum(1:100))
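The joint entropy can also be computed directly from the formula in the Description; the following is a minimal sketch (the helper name joint_entropy is illustrative, not part of philentropy) that should agree with JE() on the same input:

```r
# Sketch: joint entropy from a joint-probability vector,
# computed directly as -sum(P * log(P)).
joint_entropy <- function(p, unit = "log2") {
  logf <- switch(unit, log2 = log2, log10 = log10, log = log)
  p <- p[p > 0]          # treat 0 * log(0) as 0
  -sum(p * logf(p))
}

# A uniform joint distribution over 4 cells has entropy 2 bits.
joint_entropy(rep(0.25, 4))
```

For instance, joint_entropy(1:100/sum(1:100)) should match JE(1:100/sum(1:100)) up to floating-point error.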
[Package philentropy version 0.8.0 Index]