entPlot {mclust}    R Documentation
Plot Entropy Plots
Description
Plot "entropy plots" to help select the number of classes from a hierarchy of combined clusterings.
Usage
entPlot(z, combiM, abc = c("standard", "normalized"), reg = 2, ...)
Arguments
z
A matrix whose [i,k]th entry is the probability that observation i in the data belongs to the kth class, for the initial solution (i.e. before any combining); typically the z matrix from the Mclust output stored in the clustCombi result.
combiM
A list of "combining matrices" (as provided by clustCombi).
abc
Choose one or more of "standard" and "normalized" to specify whether or not the number of observations involved in each combining step should be taken into account to scale the plots.
reg
The number of parts of the piecewise linear regression for the entropy plots. Choose one or more of: 2 (for 1 change-point), 3 (for 2 change-points).
...
Other graphical arguments to be passed to the plot functions.
Details
Please see the article cited in the References section for more details. A clear elbow in the entropy plot suggests that the user consider the corresponding number(s) of classes.
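For intuition, the quantity plotted at each step is the classification entropy of the combined solution, computed from its posterior probability matrix. The following is a minimal sketch of that quantity under the usual -sum(z * log(z)) definition from the cited article; the helper name softEntropy is hypothetical and not part of mclust:
# classification entropy of a soft clustering, given a posterior probability
# matrix z (rows = observations, columns = classes); 0 * log(0) is taken as 0
softEntropy <- function(z) -sum(ifelse(z > 0, z * log(z), 0))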
Value
if abc = "standard"
, plots the entropy against the number of clusters and the difference between the entropy of successive combined solutions against the number of clusters.
if abc = "normalized"
, plots the entropy against the cumulated number of observations involved in the successive combining steps and the difference between the entropy of successive combined solutions divided by the number of observations involved in the corresponding combining step against the number of clusters.
Author(s)
J.-P. Baudry, A. E. Raftery, L. Scrucca
References
J.-P. Baudry, A. E. Raftery, G. Celeux, K. Lo and R. Gottardo (2010). Combining mixture components for clustering. Journal of Computational and Graphical Statistics, 19(2):332-353.
See Also
plot.clustCombi, combiPlot, clustCombi
Examples
data(Baudry_etal_2010_JCGS_examples)
# clustCombi runs Mclust internally; the Mclust fit is stored in MclustOutput
output <- clustCombi(data = ex4.2, modelNames = "VII")
entPlot(output$MclustOutput$z, output$combiM, reg = c(2,3))
# legend: in red, the single-change-point piecewise linear regression;
# in blue, the two-change-point piecewise linear regression.
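As a further illustration (reusing the output object from the example above; this variant is only a sketch based on the documented arguments), the abc argument can be used to draw a single plot:
# show only the normalized entropy plot, with a single change-point fit
entPlot(output$MclustOutput$z, output$combiM, abc = "normalized", reg = 2)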