San Francisco using AI to make charging decisions colourblind


Prosecutors in San Francisco are to use artificial intelligence to try to reduce racial bias when considering whether to charge suspects with a crime.

In a world first, District Attorney George Gascon said he hoped the technology would “take race out of the equation” in the courts.

Mr Gascon’s office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop a system that takes electronic police reports and automatically removes a suspect’s name, race and hair and eye colours.

He said the process would “redact the work without redacting the essence and the quality of the narrative, which was so important to us, so that we could take a look first and make an initial charging decision based on the facts and the facts alone without any attention being paid to a person’s race or age”.

Image: District Attorney George Gascon said he wanted to make the justice system colourblind

The names of witnesses and police officers will also be removed, along with specific neighbourhoods or districts that could indicate the race of those involved.

A decision on whether or not to charge is made on the basis of these redacted police reports. The reports are then fully restored and re-evaluated to see if there is any reason to reconsider the original decision.
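The redact-then-restore workflow described above can be sketched in a few lines. This is a hypothetical illustration only — the Stanford Computational Policy Lab's actual system and its term lists are not public — using simple pattern replacement over the report text:

```python
import re

# Hypothetical term lists for illustration; the real system's lists are not public.
RACE_TERMS = ["white", "black", "hispanic", "asian", "latino"]
DESCRIPTOR_TERMS = ["brown", "blue", "green", "blond", "blonde"]  # hair/eye colours

def redact_report(text, names, neighbourhoods):
    """Replace names, neighbourhoods, and race/appearance terms with neutral tokens."""
    redacted = text
    # Replace each known name with a numbered placeholder so the narrative stays readable.
    for i, name in enumerate(names, 1):
        redacted = re.sub(re.escape(name), f"PERSON_{i}", redacted, flags=re.IGNORECASE)
    # Remove neighbourhoods or districts that could hint at race.
    for hood in neighbourhoods:
        redacted = redacted.replace(hood, "[LOCATION]")
    # Strip explicit race terms and hair/eye colour descriptors.
    for term in RACE_TERMS + DESCRIPTOR_TERMS:
        redacted = re.sub(rf"\b{term}\b", "[REDACTED]", redacted, flags=re.IGNORECASE)
    return redacted

report = ("Officer Smith arrested John Doe, a Black male with brown eyes, "
          "near the Mission District.")
print(redact_report(report, names=["John Doe", "Smith"],
                    neighbourhoods=["Mission District"]))
# → Officer PERSON_2 arrested PERSON_1, a [REDACTED] male with [REDACTED] eyes,
#   near the [LOCATION].
```

In the two-stage review the article describes, a prosecutor would first see only the output of a function like this, record a charging decision, and then re-read the unredacted report to check whether the decision should change.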

Mr Gascon said he wanted to find a way to help eliminate an implicit bias that could be triggered by a suspect’s race, an ethnic-sounding name or a crime-ridden area where they were arrested – in essence, to make justice colourblind.


Image: Any details that might identify race are removed from the initial incident report while charges are considered

The programme will begin in July, with progress reviewed weekly. Stanford has agreed to make the technology freely available, so if it proves successful in San Francisco it could be rolled out across the country.

The scheme follows a 2017 study by the San Francisco district attorney which found “substantial racial and ethnic disparities in criminal justice outcomes”.

African Americans represented only 6% of the county’s population but accounted for 41% of arrests between 2008 and 2014.

However, it found “little evidence of overt bias against any one race or ethnic group” among the prosecutors who process criminal offences.


In the UK, a 2017 review carried out by MP David Lammy found there was evidence of “bias” and “overt discrimination” within parts of the justice system.

It showed that BAME men and women made up just 14% of the general population of England and Wales, while behind bars they accounted for 25% of prisoners.

In one area of offending, drugs, it was revealed that BAME people were 240% more likely to be sent to prison than white offenders.

