Autonomous police units could deepen racial inequality

What’s happening? The New York Police Department (NYPD) has cancelled a $94,200 contract with Boston Dynamics for its robot Digidog after public criticism over the high cost and fears of police militarisation. The 32 kg Spot robot was unveiled in December to “save lives, protect people, and protect officers”, according to the NYPD. New York Mayor Bill de Blasio dubbed it “creepy”, while Congresswoman Alexandria Ocasio-Cortez asked why such “world class” technology wasn’t funnelled towards education, housing or health care. The contract with Boston Dynamics was terminated in April, four months ahead of schedule. (NPR)

Why does this matter? Issues surrounding US police conduct have been thrust into the spotlight following public outcry over the deaths of George Floyd, Breonna Taylor and several other Black Americans at the hands of police officers. Derek Chauvin was found guilty of the murder of George Floyd, prompting the US Justice Department to launch an investigation into Minneapolis policing.

With confidence in the police at a record low in the US, autonomous units could be seen as a potential solution because they remove human involvement. The biased datasets used to train policing algorithms, however, show that the human element remains, producing racially skewed predictions that could have severe social repercussions if left unchecked.

Well-documented issue – Prominent figures within AI ethics have voiced concerns about how large organisations apply AI. Timnit Gebru was fired from Google after suggesting the company was willing to ignore the full scope of AI’s social flaws. Despite the evidence, real-world policing applications of AI are already in use and producing biased results. To address this, some systems have been trained on alternative data, using victim reports instead of arrest records. Bias, however, remained even with this approach.

Policing examples – Facial-recognition technology in New Jersey misidentified a Black man, leading to his wrongful imprisonment for 10 days. Police, however, continue to use AI and have integrated facial recognition with autonomous robots in California to detect blacklisted passersby. The robots’ effectiveness at mundane policing tasks has also been questionable: one citizen reported that when she tried to alert a robot to a brawl, it told her to “step away” before launching into song.

Allocation of funds – The NYPD signed a $94,200 contract for the robot, prompting debate about the use of public money. The unit was originally conceived for work on industrial sites and oil fields, not policing. Purchasing such an expensive device to monitor low-income communities while they are struggling most during the Covid-19 pandemic will likely fuel calls to defund the police and redirect money into communities.

This article first appeared in our weekly newsletter, Sustt.
