AutoScore_testing {AutoScore} | R Documentation |
AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)
Description
AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)
Usage
AutoScore_testing(
test_set,
final_variables,
cut_vec,
scoring_table,
threshold = "best",
with_label = TRUE,
metrics_ci = TRUE
)
Arguments
test_set |
A processed data set used for testing; it should have the same variable names and format as the training set. |
final_variables |
A vector containing the selected variables from STEP(ii) |
cut_vec |
Generated from STEP(iii) |
scoring_table |
The final scoring table after fine-tuning, generated from STEP(iv) |
threshold |
Score threshold for the ROC analysis to generate sensitivity, specificity, etc. If set to "best", the optimal threshold will be calculated (Default: "best"). |
with_label |
Set to TRUE if the test_set contains labels, so that performance is evaluated accordingly (Default: TRUE). Set to FALSE if there is no "label" column in the test_set; the output will then be the final predicted scores without performance evaluation. |
metrics_ci |
Whether to calculate confidence intervals for metrics such as sensitivity and specificity. |
Value
A data frame with the predicted scores and outcomes for downstream visualization.
References
Xie F, Chakraborty B, Ong MEH, Goldstein BA, Liu N. AutoScore: A Machine Learning-Based Automatic Clinical Score Generator and Its Application to Mortality Prediction Using Electronic Health Records. JMIR Medical Informatics 2020;8(10):e21798
See Also
AutoScore_rank
, AutoScore_parsimony
, AutoScore_weighting
, AutoScore_fine_tuning
, print_roc_performance
, Run vignette("Guide_book", package = "AutoScore")
to see the full guidebook.
Examples
## Please see the guidebook or vignettes
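A minimal sketch of how AutoScore_testing is typically called at the end of the pipeline. The objects `test_set`, `final_variables`, `cut_vec`, and `scoring_table` are assumed placeholders produced by the earlier STEPs (i)-(iv); they are not defined here, so the snippet is illustrative only.

```r
## Not run:
## Assumes `test_set`, `final_variables`, `cut_vec`, and `scoring_table`
## were generated by the preceding AutoScore STEPs (i)-(iv).
pred_score <- AutoScore_testing(
  test_set = test_set,
  final_variables = final_variables,
  cut_vec = cut_vec,
  scoring_table = scoring_table,
  threshold = "best",   # compute the optimal ROC threshold
  with_label = TRUE,    # test_set contains a "label" column
  metrics_ci = TRUE     # report confidence intervals for the metrics
)
head(pred_score)        # data frame of predicted scores and outcomes
## End(Not run)
```

The returned data frame can be passed to downstream visualization helpers such as plot_roc_curve or print_roc_performance.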