acc_parity {fairness}                                        R Documentation

Accuracy parity

Description

This function computes the Accuracy parity metric.

Formula: (TP + TN) / (TP + FP + TN + FN)

Usage

acc_parity(
  data,
  outcome,
  group,
  probs = NULL,
  preds = NULL,
  outcome_base = NULL,
  cutoff = 0.5,
  base = NULL,
  group_breaks = NULL
)

Arguments

data

Data.frame that contains the necessary columns.

outcome

Column name indicating the binary outcome variable (character).

group

Column name indicating the sensitive group (character).

probs

Column name or vector with the predicted probabilities (numeric, between 0 and 1). Either probs or preds needs to be supplied.

preds

Column name or vector with the predicted binary outcome (0 or 1). Either probs or preds needs to be supplied.

outcome_base

Base level of the outcome variable (i.e., negative class). Default is the first level of the outcome variable.

cutoff

Cutoff used to generate predicted outcomes from predicted probabilities. Defaults to 0.5.

base

Base level of the sensitive group (character).

group_breaks

If group is continuous (e.g., age): either a numeric vector of two or more unique cut points, or a single number >= 2 giving the number of intervals into which the group feature is to be cut.
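The cutting behavior described for group_breaks matches base R's cut(); a minimal illustration with a hypothetical age vector (not a column of the compas data):

```r
# Hypothetical continuous group variable (assumed for illustration only)
age <- c(19, 23, 31, 45, 52, 67)

# A numeric vector of cut points yields intervals at those boundaries
cut(age, breaks = c(18, 30, 50, 70))

# A single number >= 2 yields that many equal-width intervals
cut(age, breaks = 3)
```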

Details

This function computes the Accuracy parity metric as described by Friedler et al. (2018). Accuracy is calculated by dividing the number of correctly predicted observations (the sum of all true positives and true negatives) by the total number of predictions. In the returned named vector, the reference (base) group is assigned the value 1, and every other group is assigned its accuracy divided by that of the reference group. Values lower than 1 therefore indicate WORSE predictions for that subgroup.
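The standardization described above can be sketched in base R (a minimal illustration with toy data; this is not the package's internal code, and the outcomes, predictions, and group labels below are made up):

```r
# Toy data: true binary outcomes, binary predictions, and a sensitive group
outcome <- c(1, 0, 1, 1, 0, 0, 1, 0)
preds   <- c(1, 0, 0, 1, 1, 1, 1, 0)
group   <- c("A", "A", "A", "A", "B", "B", "B", "B")

# Raw accuracy per group: (TP + TN) / (TP + FP + TN + FN),
# i.e., the share of correct predictions within each group
acc <- tapply(outcome == preds, group, mean)

# Standardize so the base group ("A" here) equals 1;
# values below 1 mean worse accuracy for that subgroup
parity <- acc / acc[["A"]]
```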

Value

Metric

Raw accuracy metrics for all groups, along with metrics standardized for the base group (the accuracy parity metric). Values lower than 1 indicate lower accuracy in the corresponding subgroup than in the reference group.

Metric_plot

Bar plot of the Accuracy parity metric.

Probability_plot

Density plot of predicted probabilities per subgroup. Only plotted if probabilities are supplied.

Examples

data(compas)
compas$Two_yr_Recidivism_01 <- ifelse(compas$Two_yr_Recidivism == 'yes', 1, 0)
acc_parity(data = compas, outcome = 'Two_yr_Recidivism_01', group = 'ethnicity',
           probs = 'probability', cutoff = 0.4, base = 'Caucasian')
acc_parity(data = compas, outcome = 'Two_yr_Recidivism_01', group = 'ethnicity',
           preds = 'predicted', cutoff = 0.5, base = 'Hispanic')


[Package fairness version 1.2.2 Index]