get_entropy {EMMIXSSL} | R Documentation
Shannon entropy
Description
Computes the Shannon entropy of the posterior class-membership probabilities of each observation under a g-component multivariate normal mixture model.
Usage
get_entropy(dat, n, p, g, pi, mu, sigma, ncov = 2)
Arguments
dat
An n x p matrix where each row is an individual observation vector.
n
Number of observations.
p
Dimension of the observation vector.
g
Number of multivariate normal classes.
pi
A g-dimensional vector of the mixing proportions.
mu
A p x g matrix with each column the mean vector of one component.
sigma
A p x p common covariance matrix if ncov = 1, or a p x p x g array of component covariance matrices if ncov = 2.
ncov
Options for the structure of the sigma matrix; the default value is 2. ncov = 1 specifies a common covariance matrix for all components; ncov = 2 specifies unequal covariance matrices.
Details
The concept of information entropy was introduced by Shannon (1948).
The entropy of the observation y_j is formally defined as
e_j(y_j; \theta) = -\sum_{i=1}^{g} \tau_i(y_j; \theta) \log \tau_i(y_j; \theta),
where \tau_i(y_j; \theta) denotes the posterior probability that y_j belongs to the i-th class.
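For illustration (this snippet is not part of the package), the entropy of a single observation follows directly from its vector of posterior probabilities; the values in tau below are made-up numbers:
# tau: hypothetical posterior probabilities of one observation over g = 4 classes
tau <- c(0.70, 0.15, 0.10, 0.05)
e_j <- -sum(tau * log(tau))  # Shannon entropy of this observation
An observation assigned almost certainly to one class has entropy near 0, while equal posterior probabilities across the g classes maximise it at log(g).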
Value
clusprobs
The posterior probabilities of the i-th entity belonging to the j-th group.
Examples
n <- 150
# Equal mixing proportions for g = 4 components
pi <- c(0.25, 0.25, 0.25, 0.25)
# Component covariance matrices stored as a p x p x g array (ncov = 2)
sigma <- array(0, dim = c(3, 3, 4))
sigma[, , 1] <- diag(1, 3)
sigma[, , 2] <- diag(2, 3)
sigma[, , 3] <- diag(3, 3)
sigma[, , 4] <- diag(4, 3)
# Component means as a p x g matrix (each column is a mean vector)
mu <- matrix(c(0.2, 0.3, 0.4, 0.2, 0.7, 0.6, 0.1, 0.7, 1.6, 0.2, 1.7, 0.6), 3, 4)
# Simulate from the mixture, then compute the entropy of each observation
dat <- rmix(n = n, pi = pi, mu = mu, sigma = sigma, ncov = 2)
en <- get_entropy(dat = dat$Y, n = 150, p = 3, g = 4, mu = mu, sigma = sigma, pi = pi, ncov = 2)
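As a cross-check (an illustrative sketch rather than package code; it assumes the mvtnorm package for the component densities), the same per-observation entropies can be recomputed from the definition in Details:
library(mvtnorm)
# Posterior probabilities tau_i(y_j; theta) for each observation and class
dens <- sapply(1:4, function(i) pi[i] * dmvnorm(dat$Y, mean = mu[, i], sigma = sigma[, , i]))
tau <- dens / rowSums(dens)
e <- -rowSums(tau * log(tau))  # one Shannon entropy value per observation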