mi {fastmit}    R Documentation
kNN Mutual Information Estimators
Description
Estimate mutual information based on the distribution of nearest-neighbor distances. The kNN method is described in Kraskov et al. (2004).
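For concreteness, below is a minimal base-R sketch of the first Kraskov-Stögbauer-Grassberger (KSG) estimator for two univariate samples. The function name ksg_mi is illustrative only and is not part of fastmit; the mi() implementation may differ in details such as neighbor search and tie handling.

# Minimal sketch of KSG estimator 1 (Kraskov et al. 2004) for two
# univariate samples; ksg_mi is illustrative and not part of fastmit.
ksg_mi <- function(x, y, k = 5) {
  n <- length(x)
  dx <- abs(outer(x, x, "-"))   # pairwise distances in X
  dy <- abs(outer(y, y, "-"))   # pairwise distances in Y
  dz <- pmax(dx, dy)            # max-norm distances in the joint (X, Y) space
  s <- 0
  for (i in seq_len(n)) {
    eps <- sort(dz[i, -i])[k]   # distance to the k-th nearest neighbour of point i
    nx <- sum(dx[i, -i] < eps)  # points strictly closer than eps in the X marginal
    ny <- sum(dy[i, -i] < eps)  # points strictly closer than eps in the Y marginal
    s <- s + digamma(nx + 1) + digamma(ny + 1)
  }
  digamma(k) + digamma(n) - s / n
}

On continuous data without ties, its output should be comparable to mi(x, y, k) in the Examples below.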
Usage
mi(x, y, k = 5, distance = FALSE)
Arguments
x: A numeric vector, matrix, data.frame, or dist object.
y: A numeric vector, matrix, data.frame, or dist object.
k: Order of the neighborhood to be used in the kNN method.
distance: Logical flag. If distance = TRUE, x and y are treated as distances (e.g. dist objects, as in the Examples); otherwise they are treated as data. Default: distance = FALSE.
Details
If two samples are passed to arguments x and y, the sample sizes (i.e. the number of rows of the matrix or the length of the vector) must agree. Moreover, the data passed to x and y must not contain missing or infinite values.
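For example, non-finite values can be removed from paired univariate samples before calling mi(); the snippet below is a minimal sketch of such pre-processing.

library(fastmit)
set.seed(1)
x <- c(rnorm(99), NA)                 # one missing value
y <- c(head(x, 99) + rnorm(99), Inf)  # one infinite value
ok <- is.finite(x) & is.finite(y)     # keep only pairs where both values are finite
mi(x[ok], y[ok], k = 5)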
Value
mi: The estimated mutual information.
References
Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information. Physical Review E, 69(6), 066138.
Examples
library(fastmit)
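# Estimate MI from raw numeric vectors (distance = FALSE)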
set.seed(1)
x <- rnorm(100)
y <- x + rnorm(100)
mi(x, y, k = 5, distance = FALSE)
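# Estimate MI from precomputed distance objects (distance = TRUE)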
set.seed(1)
x <- rnorm(100)
y <- 100 * x + rnorm(100)
distx <- dist(x)
disty <- dist(y)
mi(distx, disty, k = 5, distance = TRUE)