information_gain {msu}    R Documentation
Estimating information gain between two categorical variables.
Description
Information gain (also called mutual information) is a measure of the mutual dependence between two variables (see https://en.wikipedia.org/wiki/Mutual_information).
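For discrete variables, this quantity is the expected log-ratio between the joint distribution of x and y and the product of their marginals. The base-R sketch below illustrates that computation from a joint frequency table; it is for intuition only (the function name mutual_information and the choice of log base 2 are assumptions here, not part of msu):

mutual_information <- function(x, y) {
  pxy <- table(x, y) / length(x)      # joint probabilities
  px  <- rowSums(pxy)                 # marginal distribution of x
  py  <- colSums(pxy)                 # marginal distribution of y
  terms <- pxy * log2(pxy / outer(px, py))
  sum(terms[pxy > 0])                 # empty cells contribute 0
}

mutual_information(factor(c(0, 1)), factor(c(1, 0)))  # fully dependent: 1 bit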
Usage
information_gain(x, y)
IG(x, y)
Arguments
x
A factor representing a categorical variable.

y
A factor representing a categorical variable.
Value
Information gain estimate based on Shannon entropy for variables x and y.
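Equivalently, the estimate can be read as H(x) + H(y) - H(x, y), where H denotes Shannon entropy. A minimal base-R illustration of that identity (shannon_entropy and ig_from_entropies are illustrative names, and log base 2 is again an assumption, so the unit may differ from the package's output):

shannon_entropy <- function(f) {
  p <- table(f) / length(f)
  p <- p[p > 0]                       # drop empty levels to avoid 0 * log(0)
  -sum(p * log2(p))
}

ig_from_entropies <- function(x, y) {
  shannon_entropy(x) + shannon_entropy(y) -
    shannon_entropy(interaction(x, y, drop = TRUE))  # joint variable
}

ig_from_entropies(factor(c(0, 0, 1, 1)), factor(c(0, 1, 0, 1)))  # independent: 0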
Examples
information_gain(factor(c(0,1)), factor(c(1,0)))
information_gain(factor(c(0,0,1,1)), factor(c(0,1,1,1)))
information_gain(factor(c(0,0,1,1)), factor(c(0,1,0,1)))
## Not run:
information_gain(c(0,1), c(1,0))
## End(Not run)
[Package msu version 0.0.1 Index]