prauc {mlr3measures} | R Documentation
Area Under the Precision-Recall Curve
Description
Measure to compare true observed labels with predicted probabilities in binary classification tasks.
Usage
prauc(truth, prob, positive, na_value = NaN, ...)
Arguments
truth |
( |
prob |
( |
positive |
( |
na_value |
( |
... |
( |
Details
Computes the area under the Precision-Recall curve (PRC). The PRC plots precision against recall (sensitivity) across classification thresholds, and is considered a more informative measure than the ROC curve for unbalanced datasets. The PRC is computed by integrating the resulting piecewise function.
This measure is undefined if the true values are either all positive or all negative.
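As a rough illustration of the integration (a sketch, not the package's internal code): sort observations by decreasing probability, trace precision and recall at each cutoff, and sum precision over the recall increments. The helper name pr_auc_sketch is hypothetical, ties in prob are ignored, and the package's exact interpolation may differ.

# Hypothetical sketch of the PR-AUC computation; assumes no ties in prob
pr_auc_sketch = function(truth, prob, positive) {
  ord = order(prob, decreasing = TRUE)
  pos = truth[ord] == positive
  tp = cumsum(pos)                  # true positives at each cutoff
  precision = tp / seq_along(tp)    # tp / (tp + fp)
  recall = tp / sum(pos)            # tp / (tp + fn)
  # step-function integration of precision over the recall increments
  sum(diff(c(0, recall)) * precision)
}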
Value
Performance value as numeric(1).
Meta Information
Type: "binary"
Range: [0, 1]
Minimize: FALSE
Required prediction: prob
References
Davis J, Goadrich M (2006). “The relationship between precision-recall and ROC curves.” In Proceedings of the 23rd International Conference on Machine Learning. ISBN 9781595933836.
See Also
Other Binary Classification Measures:
auc(), bbrier(), dor(), fbeta(), fdr(), fn(), fnr(), fomr(), fp(), fpr(), gmean(), gpr(), npv(), ppv(), tn(), tnr(), tp(), tpr()
Examples
library(mlr3measures)

truth = factor(c("a", "a", "a", "b"))
prob = c(.6, .7, .1, .4)
prauc(truth, prob, "a")
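As noted in the Details, the measure is undefined when the observed labels are all positive or all negative; in that case na_value is returned. A quick illustration (a constructed example, not from the original page):

# truth with only one observed class: the measure is undefined
all_pos = factor(c("a", "a", "a"), levels = c("a", "b"))
prauc(all_pos, c(.9, .8, .7), "a")                # NaN, the default na_value
prauc(all_pos, c(.9, .8, .7), "a", na_value = 0)  # returns 0 instead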