Predictive Policing Software Shown to Entrench Bias, Not Address It

(Photo: Scott Rodgerson/Unsplash)

Automation can be the key to unlocking efficiency, but what happens when the algorithms at the core of an automated process perpetuate racial and economic biases? A new analysis of popular predictive policing software by Gizmodo has concluded that such software disproportionately targets poor communities and communities of color. Gizmodo’s analysis