gpr {mlr3measures}    R Documentation
Geometric Mean of Precision and Recall
Description
Measure to compare true observed labels with predicted labels in binary classification tasks.
Usage
gpr(truth, response, positive, na_value = NaN, ...)
Arguments
truth
(factor())
True (observed) labels. Must have exactly the same two levels and the same length as response.

response
(factor())
Predicted response labels. Must have exactly the same two levels and the same length as truth.

positive
(character(1))
Name of the positive class.

na_value
(numeric(1))
Value to return if the measure is undefined for the input (see Details). Default is NaN.

...
(any)
Additional arguments. Currently ignored.
Details
Calculates the geometric mean of precision() P and recall() R as
\sqrt{\mathrm{P} \mathrm{R}}.
This measure is undefined if precision or recall is undefined, i.e. if TP + FP = 0 or if TP + FN = 0.
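The following minimal sketch writes the computation out by hand; the confusion-matrix counts below are made up for illustration and are not part of the package.

# Hand computation of the measure from made-up confusion-matrix counts.
tp = 4; fp = 1; fn = 2
P = tp / (tp + fp)  # precision = TP / (TP + FP)
R = tp / (tp + fn)  # recall    = TP / (TP + FN)
sqrt(P * R)         # geometric mean of precision and recall
# If TP + FP = 0 there are no positive predictions, P is 0/0, and
# gpr() returns na_value (NaN by default) instead.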
Value
Performance value as numeric(1).
Meta Information
Type: "binary"
Range: [0, 1]
Minimize: FALSE
Required prediction: response
References
He H, Garcia EA (2009). “Learning from Imbalanced Data.” IEEE Transactions on Knowledge and Data Engineering, 21(9), 1263–1284. doi:10.1109/TKDE.2008.239.
See Also
Other Binary Classification Measures:
auc(),
bbrier(),
dor(),
fbeta(),
fdr(),
fn(),
fnr(),
fomr(),
fp(),
fpr(),
gmean(),
npv(),
ppv(),
prauc(),
tn(),
tnr(),
tp(),
tpr()
Examples
set.seed(1)
lvls = c("a", "b")
# Simulated observed and predicted labels for a binary classification task.
truth = factor(sample(lvls, 10, replace = TRUE), levels = lvls)
response = factor(sample(lvls, 10, replace = TRUE), levels = lvls)
gpr(truth, response, positive = "a")
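As a cross-check (a sketch, not part of the documented example), the same value can be recovered from the confusion matrix, and na_value is returned when no observation is predicted as the positive class:

# Cross-check against a hand computation from the confusion matrix.
cm = table(truth, response)
tp = cm["a", "a"]; fp = cm["b", "a"]; fn = cm["a", "b"]
sqrt((tp / (tp + fp)) * (tp / (tp + fn)))

# Undefined case: TP + FP = 0, so the default na_value (NaN) is returned.
gpr(truth, factor(rep("b", 10), levels = lvls), positive = "a")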