multiinformation {infotheo} | R Documentation
multiinformation computation
Description
multiinformation takes a dataset as input and computes the multiinformation (also called total correlation) among the random variables in the dataset. The value is returned in nats, using the entropy estimator specified by the method argument.
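Equivalently, the multiinformation is the sum of the marginal entropies of the variables minus their joint entropy. The sketch below illustrates this identity; it assumes the entropy function from the same package (not described on this page) and discretized data:

library(infotheo)
data(USArrests)
dat <- discretize(USArrests)
marginals <- sum(sapply(dat, entropy))  # sum of single-variable entropies (nats)
joint <- entropy(dat)                   # joint entropy of all variables (nats)
marginals - joint                       # should agree with multiinformation(dat)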
Usage
multiinformation(X, method = "emp")
Arguments
X: data.frame containing a set of random variables where columns contain variables/features and rows contain outcomes/samples.
method: The name of the entropy estimator. The package implements four estimators: "emp", "mm", "shrink", "sg" (default: "emp") - see Details. These estimators require discrete data values - see discretize.
Details
"emp" : This estimator computes the entropy of the empirical probability distribution.
"mm" : This is the Miller-Madow asymptotic bias corrected empirical estimator.
"shrink" : This is a shrinkage estimate of the entropy of a Dirichlet probability distribution.
"sg" : This is the Schurmann-Grassberger estimate of the entropy of a Dirichlet probability distribution.
Value
multiinformation returns the multiinformation (also called total correlation) among the variables in the dataset, in nats.
Author(s)
Patrick E. Meyer
References
Meyer, P. E. (2008). Information-Theoretic Variable Selection and Network Inference from Microarray Data. PhD thesis, Université Libre de Bruxelles.
Studeny, M. and Vejnarova, J. (1998). The multiinformation function as a tool for measuring stochastic dependence. In Proceedings of the NATO Advanced Study Institute on Learning in Graphical Models.
See Also
condinformation, mutinformation, interinformation, natstobits
Examples
data(USArrests)
# the estimators require discrete data, so discretize the continuous variables first
dat <- discretize(USArrests)
M <- multiinformation(dat)
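The result M is in nats; natstobits, listed above in See Also, converts it to bits:

natstobits(M)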