EntropyGMM {mclustAddons}    R Documentation
Gaussian mixture-based estimation of entropy
Description
Compute an estimate of the (differential) entropy from a Gaussian Mixture Model (GMM) fitted using the mclust package.
Usage
EntropyGMM(object, ...)
## S3 method for class 'densityMclust'
EntropyGMM(object, ...)
## S3 method for class 'Mclust'
EntropyGMM(object, ...)
## S3 method for class 'densityMclustBounded'
EntropyGMM(object, ...)
## S3 method for class 'matrix'
EntropyGMM(object, ...)
## S3 method for class 'data.frame'
EntropyGMM(object, ...)
EntropyGauss(sigma)
nats2bits(x)
bits2nats(x)
Arguments
object
An object of class 'densityMclust', 'Mclust', or 'densityMclustBounded' obtained from a Gaussian mixture fit via the mclust package, or a matrix or data.frame of data values.
sigma
A symmetric covariance matrix.
x
A vector of values.
...
Further arguments passed to or from other methods.
Value
EntropyGMM()
returns an estimate of the entropy based on a Gaussian mixture model (GMM) estimated using the mclust package. If a matrix or data frame of data values is provided, a GMM is first fitted to the data and the entropy is then computed from the fitted model.
EntropyGauss()
returns the entropy for a multivariate Gaussian distribution with covariance matrix sigma.
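As a quick sanity check, the value returned by EntropyGauss() can be compared with the closed-form differential entropy of a d-dimensional Gaussian, 0.5*log(det(2*pi*e*Sigma)), assuming the result is expressed in nats:

Sigma = diag(2)                          # bivariate identity covariance
EntropyGauss(Sigma)
0.5 * log(det(2 * pi * exp(1) * Sigma))  # closed-form value, should agree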
nats2bits() and bits2nats()
convert input values in nats to the equivalent values in bits, and vice versa. Information-theoretic quantities have different units depending on the base of the logarithm used: natural (base-e) logarithms give values in nats, whereas base-2 logarithms give values in bits.
Author(s)
Luca Scrucca
References
Robin S. and Scrucca L. (2023) Mixture-based estimation of entropy. Computational Statistics & Data Analysis, 177, 107582. https://doi.org/10.1016/j.csda.2022.107582
See Also
Examples
library(mclustAddons)

X = iris[,1:4]
mod = densityMclust(X, plot = FALSE)  # fit a GMM-based density estimate
h = EntropyGMM(mod)                   # entropy estimate (in nats)
h
nats2bits(h)                          # the same estimate expressed in bits
EntropyGMM(X)                         # fit a GMM to the data and compute the entropy