mi.plugin {entropy}        R Documentation

Plug-In Estimator of Mutual Information and of the Chi-Squared Statistic of Independence

Description

mi.plugin computes the mutual information of two discrete random variables from the specified joint probability mass function.

chi2indep.plugin computes the chi-squared divergence of independence from the same joint probability mass function.

Usage

mi.plugin(freqs2d, unit=c("log", "log2", "log10"))
chi2indep.plugin(freqs2d, unit=c("log", "log2", "log10"))

Arguments

freqs2d

matrix of joint bin frequencies (joint probability mass function).

unit

the unit in which entropy is measured. The default unit="log" gives entropy in nats (natural units); for entropy in bits set unit="log2".

Details

The mutual information of two random variables X and Y is the Kullback-Leibler divergence between their joint density/probability mass function and the product of the two marginal densities, i.e. the corresponding independence model.
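
This identity can be checked numerically with KL.plugin. A minimal sketch, assuming the entropy package is loaded (the names p.xy and p.ind are illustrative only):

p.xy  = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )  # joint pmf
p.ind = rowSums(p.xy) %o% colSums(p.xy)                # product of the marginals
KL.plugin(as.vector(p.xy), as.vector(p.ind))           # same value as mi.plugin(p.xy)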

It can also be defined in terms of entropies as MI = H(X) + H(Y) - H(X, Y).

Similarly, the chi-squared divergence of independence is the chi-squared divergence between the joint density and the product of the marginal densities. It is a second-order approximation of twice the mutual information.
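
Written out for a joint probability mass function p with marginals p.x and p.y, the divergence is sum_ij (p[i,j] - p.x[i]*p.y[j])^2 / (p.x[i]*p.y[j]). A minimal sketch of the direct computation (again with illustrative names, assuming the entropy package is loaded):

p.xy  = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )
p.ind = rowSums(p.xy) %o% colSums(p.xy)
sum( (p.xy - p.ind)^2 / p.ind )   # matches chi2indep.plugin(p.xy)
2*mi.plugin(p.xy)                 # close, illustrating the second-order approximation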

Value

mi.plugin returns the mutual information.

chi2indep.plugin returns the chi-squared divergence of independence.

Author(s)

Korbinian Strimmer (https://strimmerlab.github.io).

See Also

mi.Dirichlet, mi.shrink, mi.empirical, KL.plugin, discretize2d.

Examples

# load entropy library 
library("entropy")

# joint distribution of two discrete variables
freqs2d = rbind( c(0.2, 0.1, 0.15), c(0.1, 0.2, 0.25) )  

# corresponding mutual information
mi.plugin(freqs2d)

# the same mutual information computed via the entropy identity
H1 = entropy.plugin(rowSums(freqs2d))
H2 = entropy.plugin(colSums(freqs2d))
H12 = entropy.plugin(freqs2d)
H1+H2-H12

# half the chi-squared divergence of independence approximates the MI
0.5*chi2indep.plugin(freqs2d)
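
# check of the unit argument: nats vs. bits
mi.plugin(freqs2d, unit="log2")   # MI in bits
mi.plugin(freqs2d)/log(2)         # same value: nats divided by log(2)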


[Package entropy version 1.3.1]