entropy.MillerMadow {entropy}                                R Documentation

Miller-Madow Entropy Estimator

Description

entropy.MillerMadow estimates the Shannon entropy H of the random variable Y from the corresponding observed counts y, using the Miller-Madow correction to the empirical entropy.

Usage

entropy.MillerMadow(y, unit=c("log", "log2", "log10"))

Arguments

y

vector of counts.

unit

the unit in which entropy is measured. The default is "log" (natural logarithm), so entropy is reported in nats. For computing entropy in bits set unit="log2"; for base-10 units set unit="log10".

Details

The Miller-Madow entropy estimator (Miller, 1955) is the bias-corrected version of the empirical entropy estimate.

Note that the Miller-Madow estimator is not a plug-in estimator, hence there are no explicit underlying bin frequencies.
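The estimate corresponds to the empirical entropy (in nats) plus the correction (m - 1)/(2n), where m is the number of non-empty bins and n is the total number of counts; the last example below sketches this relation.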

Value

entropy.MillerMadow returns an estimate of the Shannon entropy.

Author(s)

Korbinian Strimmer (https://strimmerlab.github.io).

References

Miller, G. 1955. Note on the bias of information estimates. In: Information Theory in Psychology: Problems and Methods, II-B, pp. 95-100.

See Also

entropy.empirical

Examples

# load entropy library 
library("entropy")

# observed counts for each bin
y = c(4, 2, 3, 0, 2, 4, 0, 0, 2, 1, 1)  

# estimate entropy using Miller-Madow method
entropy.MillerMadow(y)

# compare to empirical estimate
entropy.empirical(y)
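
# illustrative sketch (not the package source): the Miller-Madow estimate
# should equal the empirical entropy (in nats) plus the correction (m-1)/(2n),
# where m is the number of non-empty bins and n is the total count
m = sum(y > 0)
n = sum(y)
entropy.empirical(y) + (m - 1)/(2*n)

# entropy in bits rather than nats
entropy.MillerMadow(y, unit="log2")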

[Package entropy version 1.3.1]