EMGauss {AdaptGauss}    R Documentation
EM Algorithm for GMM
Description
Expectation-Maximization (EM) algorithm to estimate the optimal Gaussian Mixture Model (GMM) for given one-dimensional data.
Usage
EMGauss(Data, K, Means, SDs, Weights, MaxNumberofIterations, fast)
Arguments
Data
vector of data points
K
estimated number of Gaussian kernels
Means
vector(1:L), means of the Gaussians; L == number of Gaussians
SDs
vector(1:L), estimated standard deviations of the Gaussian kernels
Weights
optional; relative number of points in each Gaussian (prior probabilities): sum(Weights) == 1; default weight is 1/L
MaxNumberofIterations
optional; number of iterations, default = 10
fast
optional; default: FALSE, using mclust's EM (see the mclust package)
Details
No Gaussian kernels are added or removed. The number of Gaussians has to be set via the length of the vectors Means, SDs and Weights.
This EM is only for univariate data. For multivariate data see the mclust package.
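A minimal usage sketch follows; the simulated data, K = 2 and the starting values are illustrative assumptions, not part of the package.

library(AdaptGauss)

# Simulated univariate data from two Gaussians (illustrative assumption)
set.seed(1)
Data <- c(rnorm(300, mean = 0, sd = 1),
          rnorm(200, mean = 5, sd = 0.5))

# Starting values: one entry per Gaussian (here L = 2)
gmm <- EMGauss(Data,
               K = 2,
               Means = c(-1, 4),
               SDs = c(1, 1),
               Weights = c(0.5, 0.5),
               MaxNumberofIterations = 10)

gmm$Means    # means returned by the EM algorithm
gmm$SDs      # standard deviations
gmm$Weights  # prior probabilities, summing to 1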
Value
List with
Means
means of the GMM generated by the EM algorithm
SDs
standard deviations of the GMM generated by the EM algorithm
Weights
prior probabilities of the Gaussians
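The returned components define the mixture density sum(Weights[i] * dnorm(x, Means[i], SDs[i])). A small sketch of evaluating it in base R, assuming a fitted result gmm as in the sketch under Details:

# Evaluate the fitted mixture density on a grid (base R, no extra packages)
x <- seq(-4, 8, length.out = 200)
dens <- rowSums(sapply(seq_along(gmm$Means), function(i)
  gmm$Weights[i] * dnorm(x, mean = gmm$Means[i], sd = gmm$SDs[i])))
plot(x, dens, type = "l")   # visual check of the fitted GMM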
Author(s)
Onno Hansen-Goos, Michael Thrun, Florian Lerch
References
Bishop, Christopher M.: Pattern Recognition and Machine Learning, Springer, 2006, pp. 435 ff.