AutoScore_testing {AutoScore}    R Documentation

AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)

Description

AutoScore STEP(v): Evaluate the final score with ROC analysis (AutoScore Module 6)

Usage

AutoScore_testing(
  test_set,
  final_variables,
  cut_vec,
  scoring_table,
  threshold = "best",
  with_label = TRUE,
  metrics_ci = TRUE
)

Arguments

test_set

A processed data.frame containing the data used for testing. This data.frame must have the same format as train_set (same variable names and outcome).

final_variables

A vector of the selected variables, obtained from Step(ii) AutoScore_parsimony. Run vignette("Guide_book", package = "AutoScore") to see the guidebook or vignette.

cut_vec

Cut vector generated from STEP(iii) AutoScore_weighting. Please follow the guidebook.

scoring_table

The final scoring table after fine-tuning, generated from STEP(iv) AutoScore_fine_tuning. Please follow the guidebook.

threshold

Score threshold for the ROC analysis to generate sensitivity, specificity, etc. If set to "best", the optimal threshold will be calculated (Default: "best").

with_label

Set to TRUE if the test_set contains outcome labels, in which case performance is evaluated accordingly (Default: TRUE). Set to FALSE if the test_set has no label; in that case the output is the final predicted scores only, without performance evaluation.

metrics_ci

Whether to calculate confidence intervals for the metrics of sensitivity, specificity, etc. (Default: TRUE).

Value

A data frame with predicted score and the outcome for downstream visualization.

See Also

AutoScore_rank, AutoScore_parsimony, AutoScore_weighting, AutoScore_fine_tuning, print_roc_performance. Run vignette("Guide_book", package = "AutoScore") to see the guidebook or vignette.

Examples

## Please see the guidebook or vignettes
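The following is a minimal sketch of how this final testing step might be called, assuming the objects final_variables, cut_vec, and scoring_table were already produced by the earlier AutoScore steps named above, and that test_set is a processed data.frame; the object names here mirror the Usage section but the surrounding workflow is illustrative, not a substitute for the guidebook:

```r
## Not run: illustrative sketch; requires the AutoScore package and
## objects produced by the preceding steps (see the guidebook).
library(AutoScore)

# Assumed to exist from earlier steps:
#   final_variables  -- from Step(ii)  AutoScore_parsimony
#   cut_vec          -- from STEP(iii) AutoScore_weighting
#   scoring_table    -- from STEP(iv)  AutoScore_fine_tuning
#   test_set         -- processed data.frame in the same format as train_set

pred_score <- AutoScore_testing(
  test_set        = test_set,
  final_variables = final_variables,
  cut_vec         = cut_vec,
  scoring_table   = scoring_table,
  threshold       = "best",  # compute the optimal score threshold
  with_label      = TRUE,    # test_set contains the outcome label
  metrics_ci      = TRUE     # report confidence intervals for the metrics
)

# pred_score is a data frame with the predicted score and the outcome,
# suitable for downstream visualization.
head(pred_score)
## End(Not run)
```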

[Package AutoScore version 1.0.0 Index]