entropy.estimate {vsgoftest}    R Documentation
Vasicek estimate of differential Shannon entropy
Description
Computes the Vasicek estimate of differential Shannon entropy from a numeric sample.
Usage
entropy.estimate(x, window)
Arguments
x
a numeric vector: the sample from which the entropy is estimated.
window
an integer smaller than half the sample size: the window size m of the estimator.
Details
The Vasicek estimator of Shannon entropy is defined, for a random sample X_1, \dots, X_n, by
\frac{1}{n}\sum_{i=1}^{n} \log\left(\frac{n}{2m}\left[X_{(i+m)}-X_{(i-m)}\right]\right),
where X_{(i)} is the i-th order statistic, m < n/2 is the window size, and X_{(i)} = X_{(1)} for i < 1 and X_{(i)} = X_{(n)} for i > n.
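As a point of reference, the formula above can be transcribed almost literally into base R. The sketch below only illustrates the definition; it is not the package's actual implementation, and the function name vasicek_sketch is made up for this purpose.

vasicek_sketch <- function(x, window) {
  n  <- length(x)
  xs <- sort(x)                      # order statistics X_(1), ..., X_(n)
  m  <- window
  hi <- pmin(seq_len(n) + m, n)      # X_(i+m), clamped to X_(n) when i+m > n
  lo <- pmax(seq_len(n) - m, 1)      # X_(i-m), clamped to X_(1) when i-m < 1
  mean(log(n / (2 * m) * (xs[hi] - xs[lo])))
}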
Value
A single numeric value, the Vasicek estimate of the entropy of the sample.
Author(s)
J. Lequesne justine.lequesne@unicaen.fr
References
Vasicek, O. (1976). A test for normality based on sample entropy. Journal of the Royal Statistical Society, Series B, 38(1), 54-59.
See Also
vs.test, which performs Vasicek-Song goodness-of-fit tests for the specified maximum entropy distribution family.
Examples
set.seed(2)
samp <- rnorm(100, mean = 0, sd = 1)
entropy.estimate(x = samp, window = 8)
log(2*pi*exp(1))/2 # true entropy of the standard normal distribution
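## Not in the original example: as an informal check, one can compare the
## estimate for several window sizes against the true entropy above
## (the window values below are arbitrary illustrations).
sapply(c(2, 4, 8, 16), function(m) entropy.estimate(x = samp, window = m))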