copent {copent}    R Documentation

Estimating copula entropy

Description

Estimating copula entropy nonparametrically.

Usage

copent(x, k = 3, dt = 2)

Arguments

x

the data with each row as a sample.

k

kth nearest neighbour, default = 3.

dt

the type of distance between samples: 1 for Euclidean distance; 2 for maximum distance (default).

Details

This function estimates copula entropy from data nonparametrically, using the method proposed in Ma and Sun (2008, 2011).

The algorithm consists of two simple steps: estimating the empirical copula by rank statistics using construct_empirical_copula, and then estimating the copula entropy with the kNN method of Kraskov et al. (2004) using entknn.
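For illustration, the following is a minimal base-R sketch of these two steps (rank-based empirical copula followed by the Kraskov et al. (2004) kNN entropy estimate under the maximum distance). It is only a sketch, not the package's exact implementation of construct_empirical_copula and entknn.

## Step 1: empirical copula via rank statistics, mapping each margin to (0, 1]
emp_copula <- function(x) apply(x, 2, rank) / nrow(x)

## Step 2: Kraskov et al. (2004) kNN entropy estimate with the maximum
## distance, assuming continuous data without ties
knn_entropy <- function(u, k = 3) {
  n <- nrow(u); d <- ncol(u)
  dmat <- as.matrix(dist(u, method = "maximum"))
  ## distance from each sample to its kth nearest neighbour
  ## (position 1 of the sorted row is the sample itself, at distance 0)
  eps_k <- apply(dmat, 1, function(r) sort(r)[k + 1])
  digamma(n) - digamma(k) + d * mean(log(2 * eps_k))
}

## e.g., for independent variables the negative copula entropy is near 0
set.seed(1)
x0 <- cbind(rnorm(300), rnorm(300))
-knn_entropy(emp_copula(x0), k = 3)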

The argument x is the data, with each row a sample from the random variables. The arguments k and dt are used by the kNN entropy estimator: k is the order of the nearest neighbour (default = 3), and dt is the type of distance between samples, which currently has two options (1 for Euclidean distance; 2, the default, for maximum distance).
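For example, for a numeric matrix x with one sample per row:

copent(x)                   # defaults: k = 3, dt = 2 (maximum distance)
copent(x, k = 5, dt = 1)    # 5th nearest neighbour, Euclidean distance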

Copula entropy is proven to be equal to negative mutual information, so this function can also be used to estimate multivariate mutual information.
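For example (an illustrative sketch; the correlation values below are arbitrary), the estimate can be compared with the analytic mutual information of a multivariate Gaussian, which is -0.5 * log(det(R)) for a correlation matrix R:

library(mnormt)
rmat <- matrix(c(1.0, 0.6, 0.3,
                 0.6, 1.0, 0.5,
                 0.3, 0.5, 1.0), 3, 3)
x3 <- rmnorm(1000, rep(0, 3), rmat)
copent(x3)                # estimated multivariate mutual information
-0.5 * log(det(rmat))     # analytic Gaussian value, about 0.367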

Value

The function returns the negative value of the copula entropy of the data x, i.e., the estimated (multivariate) mutual information.

References

Ma, J., & Sun, Z. (2011). Mutual information is copula entropy. Tsinghua Science & Technology, 16(1), 51-54. See also arXiv preprint arXiv:0808.0845, 2008.

Kraskov, A., Stögbauer, H., & Grassberger, P. (2004). Estimating mutual information. Physical Review E, 69(6), 066138.

Examples


library(mnormt)
rho <- 0.5
sigma <- matrix(c(1, rho, rho, 1), 2, 2)
x <- rmnorm(500, c(0, 0), sigma)   # 500 samples from a bivariate Gaussian
ce1 <- copent(x, 3, 2)             # negative copula entropy, i.e. estimated mutual information
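## For comparison (not part of the original example): analytic mutual
## information of a bivariate Gaussian with correlation rho
-0.5 * log(1 - rho^2)   # about 0.144 for rho = 0.5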

