
Lawmakers seek to stop DOJ use of algorithms and ‘dirty policing’

26th April 2021

By Stacy M. Brown
Contributing Writer

(NNPA Newswire) — A study recently published in the New York University Law Review found that law enforcement agencies “are increasingly using predictive policing systems to forecast criminal activity and allocate police resources.”

Yet in numerous jurisdictions, these systems are built on data produced during documented periods of flawed, racially biased, and sometimes unlawful practices and policies, or “dirty policing,” noted the report, titled “Dirty Data, Bad Predictions: How Civil Rights Violations Impact Police Data, Predictive Policing Systems, and Justice.”

Eight Democratic lawmakers have taken exception to the use of algorithms that automate policing decisions, raising their concerns with the U.S. Department of Justice this week.

U.S. Reps. Yvette D. Clarke, D-N.Y., and Sheila Jackson Lee, D-Texas, and Sens. Ron Wyden, D-Ore., Elizabeth Warren, D-Mass., Edward Markey, D-Mass., Jeff Merkley, D-Ore., Alex Padilla, D-Calif., and Raphael Warnock, D-Ga., wrote a letter asking the DOJ to help ensure that any predictive policing algorithms in use are fully documented.

They also asked the agency to ensure that the algorithms are subjected to ongoing, independent audits by experts and that a system of due process exists for those affected by them.

“If the DOJ cannot ensure this, DOJ should halt any funding it is providing to develop and deploy these unproven tools,” the lawmakers wrote.

According to www.nextgov.com, predictive policing involves law enforcement officials using mathematical models, predictive analytics, and other technology-based techniques to pinpoint potential crimes.

In their letter, the lawmakers said such methods are used in two primary ways: to predict locations where crimes could occur within a particular window, and to predict which individuals might be involved in future illegal acts.

“Algorithms draw from historical crime data, and at times other data elements like weather patterns or gunfire detection, to produce the forecasts,” they noted.

“But, when predictive policing systems have been exposed to scrutiny, auditors have found major problems with their effectiveness and reliability,” the letter continued.

Nextgov.com reported that the lawmakers pointed to specific reviews that sparked worry and a police department’s 2020 strategic plan that mentioned implementing such technologies with Justice Department funds.

They also referenced the New York University Law Review study, which found that nine of the 13 law enforcement departments assessed used what is deemed “dirty data,” or information collected from illegal policing practices, to inform the algorithms used in this work.

“When datasets filled with inaccuracies influenced by historical and systemic biases are used without corrections, these algorithms end up perpetuating such biases and facilitate discriminatory policing against marginalized groups, especially Black Americans,” the lawmakers wrote.

They requested a range of detailed information from the federal department.

That information includes whether officials have analyzed whether the technology’s use complies with relevant civil rights laws.

They demanded to know the names of each jurisdiction that has operated predictive policing algorithms funded by the agency and the actual software used.

The lawmakers also asked for a detailed annual accounting of all federal funding the DOJ distributed for developing and implementing predictive policing algorithms at the federal, state, and local levels for fiscal years 2010 through 2020, among other details.

“Deploying predictive policing systems in jurisdictions with extensive histories of unlawful police practices presents elevated risks that dirty data will lead to flawed or unlawful predictions, which in turn risk perpetuating additional harm via feedback loops throughout the criminal justice system,” New York University Law Review researchers wrote.

“The use of predictive policing must be treated with high levels of caution, and mechanisms for the public to know, assess, and reject such systems are imperative.”

This article originally published in the April 26, 2021 print edition of The Louisiana Weekly newspaper.
