Average odds difference metric

The average odds difference metric measures the difference in false positive rates and true positive rates between monitored and reference groups.

Metric details

Average odds difference is a fairness evaluation metric that can help determine whether your asset produces biased outcomes.

Scope

The average odds difference metric evaluates generative AI assets and machine learning models.

  • Types of AI assets:
    • Prompt templates
    • Machine learning models
  • Generative AI tasks: Text classification
  • Machine learning problem type: Binary classification

Scores and values

The average odds difference metric score indicates the difference in false positive and true positive rates between the monitored and reference groups.

  • Range of values: -1.0 to 1.0
  • Best possible score: 0.0
  • Interpretation:
    • At 0: Both groups have equal odds
    • Under 0: Indicates biased outcomes that disadvantage the monitored group (see the worked example after this list)
    • Over 0: Indicates biased outcomes that disadvantage the reference group
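As a hypothetical illustration, suppose the monitored group has a false positive rate of 0.2 and a true positive rate of 0.5, while the reference group has a false positive rate of 0.3 and a true positive rate of 0.9. Applying the formula from the Do the math section gives 0.5 × ((0.2 - 0.3) + (0.5 - 0.9)) = -0.25, a score under 0 that reflects less favorable outcomes for the monitored group.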

Do the math

The following formula is used for calculating false positive rate (FPR):

FPR = FP / (FP + TN)

where FP is the number of false positives and TN is the number of true negatives.

The following formula is used for calculating true positive rate (TPR):

TPR = TP / (TP + FN)

where TP is the number of true positives and FN is the number of false negatives.

The following formula is used for calculating average odds difference:

Average odds difference = 0.5 × [(FPR_monitored - FPR_reference) + (TPR_monitored - TPR_reference)]
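The following Python sketch shows one way the calculation can be reproduced from raw binary predictions. The function names, the sign convention (monitored group rates minus reference group rates), and the toy data are illustrative assumptions, not part of any product implementation.

```python
import numpy as np

def rates(y_true, y_pred):
    """Return (false positive rate, true positive rate) for binary 0/1 labels."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    fp = np.sum((y_pred == 1) & (y_true == 0))  # false positives
    tn = np.sum((y_pred == 0) & (y_true == 0))  # true negatives
    tp = np.sum((y_pred == 1) & (y_true == 1))  # true positives
    fn = np.sum((y_pred == 0) & (y_true == 1))  # false negatives
    fpr = fp / (fp + tn) if (fp + tn) else 0.0
    tpr = tp / (tp + fn) if (tp + fn) else 0.0
    return fpr, tpr

def average_odds_difference(y_true, y_pred, group):
    """Average odds difference: monitored group (group == 1) minus reference group (group == 0)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    group = np.asarray(group)
    fpr_mon, tpr_mon = rates(y_true[group == 1], y_pred[group == 1])
    fpr_ref, tpr_ref = rates(y_true[group == 0], y_pred[group == 0])
    return 0.5 * ((fpr_mon - fpr_ref) + (tpr_mon - tpr_ref))

# Toy data (hypothetical): the monitored group receives the favorable
# prediction less often than the reference group.
y_true = [1, 0, 1, 0, 1, 0, 1, 0]
y_pred = [1, 0, 0, 0, 1, 1, 1, 0]
group  = [1, 1, 1, 1, 0, 0, 0, 0]  # 1 = monitored group, 0 = reference group

print(average_odds_difference(y_true, y_pred, group))  # -0.5
```

With this toy data the score is -0.5, which falls in the "Under 0" range described in Scores and values.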