

Annika Abraham
Class of 2026
Sunnyvale, California
About
Projects
- "Bias and Fairness Evaluation in Predictive Models of Recidivism" with mentor Luyuan Jennifer (Sept. 9, 2025)
Annika's Symposium Presentation
Project Portfolio
Bias and Fairness Evaluation in Predictive Models of Recidivism
Started July 9, 2024
Abstract or project description
As machine learning models are increasingly relied on in high-stakes domains like criminal justice, concerns over their fairness across demographic groups have grown. This study evaluates the fairness and performance of three machine learning models (Random Forest, XGBoost, and Logistic Regression) using data from the Georgia Department of Corrections. The dataset contains detailed demographic and criminal history records and tracks recidivism over a three-year post-release period.
The models were evaluated using performance and fairness metrics, including demographic parity, equalized odds, and disparate impact. Fairness checks were conducted across racial, gender, and age-based subgroups. Decision trees and feature-importance charts were generated to assess the influence of demographic variables on the models' predictions.
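To illustrate two of the metrics named above, here is a minimal sketch (not the study's actual code) of how demographic parity difference and disparate impact can be computed from a model's predicted labels and a sensitive-attribute column; the data values are hypothetical.

```python
import numpy as np

def demographic_parity_difference(y_pred, groups):
    """Largest gap in positive-prediction rate between any two groups."""
    rates = [y_pred[groups == g].mean() for g in np.unique(groups)]
    return max(rates) - min(rates)

def disparate_impact_ratio(y_pred, groups):
    """Ratio of the lowest to the highest group positive-prediction rate;
    values below 0.8 are commonly flagged under the 'four-fifths rule'."""
    rates = [y_pred[groups == g].mean() for g in np.unique(groups)]
    return min(rates) / max(rates)

# Hypothetical predictions for two age-based subgroups
y_pred = np.array([1, 0, 1, 1, 0, 1, 0, 0])
groups = np.array(["under 30"] * 4 + ["30 and over"] * 4)

print(demographic_parity_difference(y_pred, groups))  # 0.75 - 0.25 = 0.5
print(disparate_impact_ratio(y_pred, groups))         # 0.25 / 0.75 ≈ 0.33
```

A perfectly "fair" model under demographic parity would score 0.0 on the difference and 1.0 on the ratio; in practice these metrics trade off against accuracy, which is what the evaluation below weighs.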
Of the three demographic variables analyzed, age and gender showed the largest disparities across their subgroups. In terms of performance and accuracy, the XGBoost model best balanced predictive performance with fairness, followed by the Random Forest and Logistic Regression models. In exploring the risk of perpetuating bias when machine learning models are used in America's criminal justice system, this study found that the models' predictions were most influenced by age. In future work, I hope to explore fairness mitigation strategies and whether artificial intelligence will help or hurt the criminal justice system.