performance.metrics {gecko}    R Documentation
Performance of model predictions
Description
Calculate the performance of a model through a comparison between predicted and observed labels. Available metrics are accuracy, F1 and TSS.
Usage
performance.metrics(actual, predicted, metric)
Arguments
actual
dataframe. Same formatting as predicted.
predicted
dataframe. Same formatting as actual.
metric
character. String specifying the metric used, one of "accuracy", "F1" or "TSS".
Details
The F-score or F-measure (F1) is:

F1 = 2 * (precision * recall) / (precision + recall),

with precision = TP / (TP + FP) and recall = TP / (TP + FN).

Accuracy is:

accuracy = (TP + TN) / (TP + TN + FP + FN).

Peirce's skill score (PSS), Bookmaker's Informedness (BM) or True Skill Statistic (TSS) is:

TSS = TPR + TNR - 1,

with TPR being the True Positive Rate, the rate of positives correctly labelled as such, and TNR the True Negative Rate, the rate of negatives correctly labelled, such that:

TPR = TP / (TP + FN) and TNR = TN / (FP + TN),

where TP, TN, FP and FN are the counts of true positives, true negatives, false positives and false negatives, respectively.
Note that the F1 score is not a robust metric for datasets with class imbalance.
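As a minimal sketch of how these formulas translate into code, the helper below (manual.metrics(), a hypothetical name, not part of gecko and not the internal implementation of performance.metrics()) computes all three metrics from "TRUE"/"FALSE" character labels formatted as in the Examples section.

# Hypothetical helper, for illustration only: computes accuracy, F1 and TSS
# from "TRUE"/"FALSE" character labels.
manual.metrics <- function(actual, predicted) {
  tp <- sum(actual == "TRUE"  & predicted == "TRUE")   # true positives
  tn <- sum(actual == "FALSE" & predicted == "FALSE")  # true negatives
  fp <- sum(actual == "FALSE" & predicted == "TRUE")   # false positives
  fn <- sum(actual == "TRUE"  & predicted == "FALSE")  # false negatives
  precision <- tp / (tp + fp)
  recall    <- tp / (tp + fn)   # True Positive Rate (TPR)
  tnr       <- tn / (fp + tn)   # True Negative Rate (TNR)
  c(accuracy = (tp + tn) / (tp + tn + fp + fn),
    F1       = 2 * (precision * recall) / (precision + recall),
    TSS      = recall + tnr - 1)
}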
Value
numeric.
References
PSS: Peirce, C. S. (1884). The numerical measure of the success of predictions. Science, 4, 453–454.
Examples
# observed and predicted presence/absence labels
observed <- c("FALSE", "TRUE", "FALSE", "TRUE", "TRUE")
predicted <- c("TRUE", "TRUE", "TRUE", "TRUE", "TRUE")
performance.metrics(observed, predicted, "TSS")
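For this particular example the confusion matrix can be checked by hand against the formulas in Details: TP = 3, FP = 2, TN = 0, FN = 0, so TPR = 1, TNR = 0 and the expected TSS is 1 + 0 - 1 = 0. The hypothetical manual.metrics() sketch from the Details section reproduces this value:

# Hand check with the hypothetical manual.metrics() sketch (not part of gecko):
# TP = 3, FP = 2, TN = 0, FN = 0, so TSS = 1 + 0 - 1 = 0.
manual.metrics(observed, predicted)["TSS"]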