Beyond GIGO: how "predictive policing" launders racism, corruption and bias to make them seem empirical
A new report details the Los Angeles Police Department's use of algorithms to identify "hot spots" and "chronic offenders" and target them for surveillance.
Should police gather statistical information from a variety of sources and then use computer algorithms to try to predict who is likely to be involved in violent crime? Just such an attempt has been underway in New Orleans, as The Verge reported on Feb. 27 and the New Orleans Times-Picayune described in a follow-up report on March 1.
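The "beyond GIGO" critique is that these systems don't just reflect biased data, they entrench it: a model trained on past arrests sends more patrols to the most-arrested neighborhoods, which produces more arrests there, which the next round of training reads as confirmation. A minimal sketch, using entirely made-up numbers and a deliberately simplified allocation rule (not any vendor's actual algorithm), shows how an initial skew in arrest records can persist indefinitely even when the true crime rates are identical:

```python
# Hypothetical feedback-loop sketch: two neighborhoods with the SAME
# underlying crime rate, but historical arrest data skewed toward area 0.
true_crime_rate = [0.10, 0.10]   # identical underlying rates
recorded_arrests = [60, 40]      # historical record already skewed 60/40

for year in range(5):
    total = sum(recorded_arrests)
    # "Predictive" step: allocate 100 patrols in proportion to past arrests.
    patrols = [100 * a / total for a in recorded_arrests]
    # Observation step: more patrols means more of the same crime gets recorded.
    new_arrests = [p * r for p, r in zip(patrols, true_crime_rate)]
    recorded_arrests = [old + new for old, new in zip(recorded_arrests, new_arrests)]

share_area0 = recorded_arrests[0] / sum(recorded_arrests)
print(f"share of recorded arrests in area 0 after 5 rounds: {share_area0:.2f}")
# → 0.60: the original skew never washes out, so the data keeps "proving"
#   that area 0 is the hot spot, even though the true rates are equal.
```

The point of the sketch is that the output looks empirical (it's produced by arithmetic on real records) while the 60/40 disparity it reports is an artifact of where patrols were sent, not of where crime occurs.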
A machine learning system that predicts where white collar crimes will occur throughout the US.
Yesterday I learned about Campaign Zero, a grassroots plan to end police violence. The first step in the plan is ending Broken Windows policing. Here's their argument: a decades-long focus …
There’s software used across the country to predict future criminals. And it’s biased against blacks.