
Autonomous police units could deepen racial inequality

What’s happening? The New York Police Department (NYPD) has cancelled a contract worth around $94,000 with Boston Dynamics for its robot Digidog after public criticism over the high cost and fears of police militarisation. The 32 kg Spot robot was unveiled in December to “save lives, protect people, and protect officers”, according to the NYPD. NY Mayor Bill de Blasio dubbed it “creepy”, while Congresswoman Alexandria Ocasio-Cortez asked why such “world class” technology wasn’t funneled toward education, housing or health care. The contract with Boston Dynamics was terminated in April, four months ahead of schedule. (NPR)

Why does this matter? Issues surrounding US police conduct have been thrust into the spotlight after public outcry following the deaths of George Floyd, Breonna Taylor and several other Black Americans at the hands of police officers. Derek Chauvin was found guilty of the murder of George Floyd, prompting the launch of a US Justice Department investigation into Minneapolis policing.

With confidence in the police at a record low in the US, autonomous units could be seen as a potential solution, removing human involvement from encounters. Biased datasets used to train policing algorithms, however, show that the human element remains, producing racist predictive outputs that could have severe social repercussions if left unchecked.

Well-documented issue – Prominent figures within AI ethics have voiced their concerns about the application of AI by large organisations. Timnit Gebru was fired from Google after suggesting the company was willing to ignore the full scope of AI’s social flaws. Despite the evidence, real-world applications of policing AI are already in use and producing biased results. To address this issue, alternative algorithmic training has been implemented in some cases, using victim reports instead of arrest data. Bias, however, remained even when choosing this approach.

Policing examples – Facial-recognition technology in New Jersey misidentified a Black man, leading to his false imprisonment for 10 days. Police, however, continue using AI and have integrated facial recognition with autonomous robots in California to detect blacklisted passersby. The robots’ effectiveness with mundane policing tasks has also been questionable, with one citizen reporting that when she attempted to report a brawl to a robot, it told her to “step away” before launching into song.

Allocation of funds – The NYPD signed a $94,200 contract for the robot, prompting debate about the use of public money. The unit was originally conceived for work on industrial sites and oil fields, not police work. Purchasing such an expensive device to monitor low-income communities while they are struggling the most during the Covid-19 pandemic will likely fuel calls to defund the police and redirect money into communities.

This article first appeared in our weekly newsletter, Sustt.
