classification_metrics {Momocs} | R Documentation |
Calculate classification metrics on a confusion matrix
Description
In some cases, the overall proportion of correctly classified individuals is not enough, so this function provides more detailed metrics for classification results.
Usage
classification_metrics(x)
Arguments
x: an LDA object, such as returned by LDA()
Value
a list with the following components is returned:

- accuracy: the fraction of instances that are correctly classified
- macro_prf: data.frame containing precision (the fraction of correct predictions for a certain class), recall (the fraction of instances of a class that were correctly predicted), and f1 (the harmonic mean, or weighted average, of precision and recall)
- macro_avg: the average of the three macro_prf indices
- ova: a list of one-vs-all confusion matrices, one for each class
- ova_sum: a single matrix, the sum of all ova matrices
- kappa: a measure of agreement between the predictions and the actual labels
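The metrics above can be sketched in base R from a plain confusion matrix. This is an illustrative sketch, not the Momocs implementation; it assumes actual classes in rows and predicted classes in columns, and the object names are made up for the example:

```r
# Illustrative sketch, not the Momocs implementation.
# Confusion matrix: actual classes in rows, predicted classes in columns.
cm <- matrix(c(50, 10,  5,
                8, 40,  2,
                4,  6, 45),
             nrow = 3, byrow = TRUE,
             dimnames = list(actual    = c("A", "B", "C"),
                             predicted = c("A", "B", "C")))

n <- sum(cm)
accuracy  <- sum(diag(cm)) / n        # fraction correctly classified
precision <- diag(cm) / colSums(cm)   # per class: TP / (TP + FP)
recall    <- diag(cm) / rowSums(cm)   # per class: TP / (TP + FN)
f1 <- 2 * precision * recall / (precision + recall)  # harmonic mean

# macro averages: unweighted means of the per-class indices
macro_avg <- c(precision = mean(precision),
               recall    = mean(recall),
               f1        = mean(f1))

# Cohen's kappa: observed agreement corrected for chance agreement
p_observed <- accuracy
p_expected <- sum(rowSums(cm) * colSums(cm)) / n^2
kappa <- (p_observed - p_expected) / (1 - p_expected)
```

A kappa near 1 indicates predictions that agree with the labels far beyond chance; a kappa near 0 indicates agreement no better than chance.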
See Also
The page below is of great interest to understand these metrics. The code used is partly derived from the Revolution Analytics blog post (with their authorization). Thanks to them!
-
https://blog.revolutionanalytics.com/2016/03/com_class_eval_metrics_r.html
Other multivariate: CLUST(), KMEANS(), KMEDOIDS(), LDA(), MANOVA_PW(), MANOVA(), MDS(), MSHAPES(), NMDS(), PCA()
Examples
# some morphometrics on 'hearts'
hearts %>%
  fgProcrustes(tol = 1) %>%
  coo_slide(ldk = 1) %>%
  efourier(norm = FALSE) %>%
  PCA() %>%
  # now the LDA and its summary
  LDA(~aut) %>%
  classification_metrics()
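The one-vs-all (ova) matrices returned in the Value list can also be sketched directly: each class is collapsed against all others into a 2x2 matrix, and summing those gives a single matrix. This is an illustrative sketch under that assumption, not the Momocs code; the helper name one_vs_all is made up:

```r
# Illustrative sketch of one-vs-all collapsing; not the Momocs implementation.
cm <- matrix(c(50, 10,  5,
                8, 40,  2,
                4,  6, 45),
             nrow = 3, byrow = TRUE,
             dimnames = list(actual    = c("A", "B", "C"),
                             predicted = c("A", "B", "C")))

# collapse class k against all other classes into a 2x2 confusion matrix
one_vs_all <- function(cm, k) {
  tp <- cm[k, k]          # true positives for class k
  fn <- sum(cm[k, -k])    # class k instances predicted as something else
  fp <- sum(cm[-k, k])    # other instances predicted as class k
  tn <- sum(cm[-k, -k])   # other instances predicted as not class k
  matrix(c(tp, fn, fp, tn), nrow = 2, byrow = TRUE,
         dimnames = list(actual    = c("class", "other"),
                         predicted = c("class", "other")))
}

ova <- lapply(seq_len(nrow(cm)), function(k) one_vs_all(cm, k))
ova_sum <- Reduce(`+`, ova)  # a single matrix: the sum of all ova matrices
```

Each cell of ova_sum pools the counts over all classes, so its total is the number of classes times the total of the original matrix.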