
Autonomous police units could deepen racial inequality

What’s happening? The New York Police Department (NYPD) has cancelled a $94,200 contract with Boston Dynamics for its robot Digidog after public criticism over the high cost and fears of police militarisation. The 32 kg Spot robot was unveiled in December to “save lives, protect people, and protect officers”, according to the NYPD. New York Mayor Bill de Blasio dubbed it “creepy”, while Congresswoman Alexandria Ocasio-Cortez asked why such “world class” technology wasn’t funneled toward education, housing or health care. The contract with Boston Dynamics was terminated in April, four months ahead of schedule. (NPR)

Why does this matter? Issues surrounding US police conduct have been thrust into the spotlight after public outcry following the deaths of George Floyd, Breonna Taylor and several other Black Americans at the hands of police officers. Derek Chauvin was found guilty of the murder of George Floyd, prompting the launch of a US Justice Department investigation into Minneapolis policing.

With confidence in the police at a record low in the US, autonomous units might appear to offer a solution by removing human involvement. The biased datasets used to train policing algorithms, however, show that the human element remains, producing racially skewed predictive responses that could have severe social repercussions if left unchecked.

Well-documented issue – Prominent figures within AI ethics have voiced their concerns about the application of AI by large organisations. Timnit Gebru was fired from Google after suggesting the company was willing to ignore the full scope of AI’s social flaws. Despite the evidence, real-world applications of policing AI are already in use and producing biased results. To address this issue, alternative algorithmic training has been implemented in some cases, using victim reports instead of arrest data. Bias, however, still remained even when choosing this process.

Policing examples – Facial-recognition technology in New Jersey misidentified a Black man, leading to his false imprisonment for 10 days. Police, however, continue using AI and have integrated facial recognition with autonomous robots in California to detect blacklisted passersby. The robots’ effectiveness with mundane policing tasks has also been questionable, with one citizen reporting that when she attempted to report a brawl to a robot, it told her to “step away” before launching into song.

Allocation of funds – The NYPD signed a $94,200 contract for the robot, prompting debate about the use of public money. The unit was originally conceived for work on industrial sites and oil fields, not police work. Deploying such an expensive device to monitor low-income communities while they are struggling the most during the Covid-19 pandemic will likely fuel calls to defund the police and redirect money into communities.


Fred Fullerton

Sustainability Curator

