SplitEntropy {TreeDist}    R Documentation

Entropy of two splits

Description

Calculate the entropy, joint entropy, entropy distance and information content of two splits, treating each split as a division of n leaves into two groups. Further details are available in the package vignette, in MacKay (2003), and in Meila (2007).
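
In outline, following the standard definitions in MacKay (2003): a split that divides n leaves into groups of size a and b = n - a has entropy H = -(a/n) log2(a/n) - (b/n) log2(b/n). Writing H1 and H2 for the entropies of the two splits and H12 for the joint entropy of the two-way table of joint leaf memberships, the mutual information is I = H1 + H2 - H12, and the entropy distance (Meila's variation of information) is Hd = H12 - I = 2 H12 - H1 - H2.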

Usage

SplitEntropy(split1, split2 = split1)

Arguments

split1, split2

Logical vectors listing leaves in a consistent order, identifying each leaf as a member of the ingroup (TRUE) or outgroup (FALSE) of the split in question.
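
For instance, on six leaves taken in the order A to F, the split AB|CDEF could be encoded as below (an illustrative sketch; the leaf labels are arbitrary and the element names are optional):

split2 <- c(A = TRUE, B = TRUE, C = FALSE, D = FALSE, E = FALSE, F = FALSE)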

Value

A numeric vector listing, in bits: the entropy of each split; the joint entropy of the two splits; their mutual information; and their entropy distance (variation of information).

Author(s)

Martin R. Smith (martin.smith@durham.ac.uk)

References

MacKay DJC (2003). Information Theory, Inference, and Learning Algorithms. Cambridge University Press, Cambridge. https://www.inference.org.uk/itprnn/book.pdf.

Meila M (2007). “Comparing clusterings—an information based distance.” Journal of Multivariate Analysis, 98(5), 873–895. doi:10.1016/j.jmva.2006.11.013.

See Also

Other information functions: SplitSharedInformation(), TreeInfo

Examples

# Two splits of six leaves: ABC|DEF and AB|CDEF
A <- TRUE
B <- FALSE
SplitEntropy(c(A, A, A, B, B, B), c(A, A, B, B, B, B))
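
# As a cross-check, the same quantities can be reproduced from first
# principles in base R. This is an illustrative sketch, not the package's
# implementation; the helper name .Entropy is hypothetical.
.Entropy <- function(counts) {
  # Shannon entropy, in bits, of a vector or table of counts
  p <- counts[counts > 0] / sum(counts)
  -sum(p * log2(p))
}
split1 <- c(TRUE, TRUE, TRUE, FALSE, FALSE, FALSE)   # ABC|DEF
split2 <- c(TRUE, TRUE, FALSE, FALSE, FALSE, FALSE)  # AB|CDEF
h1 <- .Entropy(table(split1))            # entropy of split 1: 1 bit
h2 <- .Entropy(table(split2))            # entropy of split 2: ~0.918 bits
h12 <- .Entropy(table(split1, split2))   # joint entropy: ~1.459 bits
h1 + h2 - h12                            # mutual information: ~0.459 bits
2 * h12 - h1 - h2                        # entropy distance: 1 bit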

[Package TreeDist version 2.7.0]