entropy {cooltools} | R Documentation
Information entropy
Description
Computes the information entropy H = -sum(p*log_b(p)), also known as Shannon entropy, of a probability vector p.
Usage
entropy(p, b = exp(1), normalize = TRUE)
Arguments
p
vector of probabilities; typically normalized, such that sum(p) = 1.
b
base of the logarithm (default is exp(1), i.e. the natural logarithm)
normalize
logical flag. If TRUE (default), the vector p is automatically normalized to sum to 1.
Value
Returns the information entropy in units that depend on b: bits if b = 2, nats if b = exp(1) (the default), and dits (also known as hartleys) if b = 10.
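To illustrate the documented behaviour, here is a minimal sketch in Python (not the package's R code): it reimplements H = -sum(p*log_b(p)) with the same `b` and `normalize` semantics described above, treating zero probabilities as contributing nothing.

```python
import math

def entropy(p, b=math.e, normalize=True):
    """Shannon entropy of a probability vector p, in units set by base b.

    A hypothetical reimplementation of the documented formula,
    not cooltools' actual code. Terms with p_i = 0 are skipped,
    following the convention 0 * log(0) = 0.
    """
    if normalize:
        total = sum(p)
        p = [x / total for x in p]
    return -sum(x * math.log(x, b) for x in p if x > 0)

# A fair coin has 1 bit of entropy when b = 2;
# an unnormalized count vector is rescaled first when normalize=True.
h_coin = entropy([0.5, 0.5], b=2)
h_counts = entropy([10, 10, 10, 10], b=2)
```

With b = 2 the fair coin gives 1 bit, and the four equal counts (normalized to 0.25 each) give 2 bits, matching the unit conventions listed under Value.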
Author(s)
Danail Obreschkow
[Package cooltools version 2.4 Index]