Cleansing Data and Bias within Predictive Policing Algorithms

dc.contributor.advisor: Taghi Hajiaghayi, Mohammad
dc.contributor.author: Arellano, Trina
dc.contributor.author: Chen, Alex
dc.contributor.author: Du, Allen
dc.contributor.author: Eichstadt, Andrea
dc.contributor.author: Lin, Aaron
dc.contributor.author: Samuels, Nicole
dc.contributor.author: Tao, Grace
dc.contributor.author: Tasneem, Zoya
dc.contributor.author: Versace, Rios
dc.date.accessioned: 2025-06-10T18:40:42Z
dc.date.issued: 2025-05
dc.description: Gemstone Team GAHSP
dc.description.abstract: Hot spots policing—the allocation of police resources toward high-crime areas—has been revolutionized by machine learning. Instead of relying on historical crime hot spots, predictive policing algorithms allow departments to allocate officers to where crime is expected to occur next. This has led to their increasing adoption, especially by large police departments, as well as to modest reductions in crime. However, predictive policing algorithms have been shown to exhibit biases similar to those of traditional policing methods. A vast literature has shown that nonwhite areas are more frequently policed, and that laws are disproportionately enforced against nonwhite residents of these communities. This creates a problem for predictive policing: because these algorithms are trained on historical crime data that reflects these racial biases, their predictions perpetuate racial bias into the future. Our team therefore built a new predictive algorithm that not only uses more contemporary machine learning techniques but also directly accounts for demographic fairness in its predictions. Using real crime data, we then tested our model against PredPol, a state-of-the-art predictive policing software, comparing the two on predictive accuracy and on racial bias in their predictions. Our model outperformed PredPol on both measures, demonstrating that it is possible to make policing more equitable without sacrificing the predictive accuracy of these algorithms.
dc.identifier: https://doi.org/10.13016/iuph-sigu
dc.identifier.uri: http://hdl.handle.net/1903/33917
dc.language.iso: en_US
dc.relation.isAvailableAt: Digital Repository at the University of Maryland
dc.relation.isAvailableAt: Gemstone Program, University of Maryland (College Park, Md)
dc.subject: Gemstone Team GAHSP (Generating an Algorithm for Hot Spots Policing)
dc.subject: policing
dc.title: Cleansing Data and Bias within Predictive Policing Algorithms
dc.type: Thesis

Files

Original bundle

Name: Gemstone_Final_Thesis_2025_GAHSP.pdf
Size: 622.51 KB
Format: Adobe Portable Document Format