1. George Gascon walks to podium with Alex Chohlas-Wood of Stanford Computational Policy Lab
2. SOUNDBITE (English) George Gascon, San Francisco District Attorney:
"So we wanted to create something that is above the human touch if you will, something that can actually filter the work to ensure that race is not going to play a role in our decision making process. And in order to do that we knew that we had to create artificial intelligence. We had to create a machine learning around this process to make sure that we could take race out of our decision-making process."
3. San Francisco District Attorney logo
4. SOUNDBITE (English) George Gascon, San Francisco District Attorney:
"Alex and his group went out to create actually a system that could take the work coming from the police department - it will redact the work without redacting the essence and the quality of the narrative, which was so important to us, so that we could take a look first and make an initial charging decision based on the facts and the facts alone without any attention being paid to a person's race or age."
8. SOUNDBITE (English) George Gascon, San Francisco District Attorney:
"Frankly we hope that this will be creating a sea change of practices around the country. We believe that the ability to do this now will separate those that can't redact from those that won't. One of the beauties of this is that Stanford has agreed that this will be put out in the public arena at no cost to anyone."
In a first-of-its-kind experiment, San Francisco prosecutors are turning to artificial intelligence to reduce racial bias in the courts, adopting a system that strips certain identifying details from police reports and leaves only key facts to govern charging decisions.
District Attorney George Gascon announced Wednesday that his office will begin using the technology in July to "take race out of the equation" when deciding whether to accuse suspects of a crime.
Criminal-justice experts say they have never heard of any project like it, and they applauded the idea as a creative, bold effort to make charging practices more colorblind.
Gascon's office worked with data scientists and engineers at the Stanford Computational Policy Lab to develop a system that takes electronic police reports and automatically removes a suspect's name and race, along with hair and eye color.
The names of witnesses and police officers will also be removed, along with specific neighborhoods or districts that could indicate the race of those involved.
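In spirit, the redaction described above amounts to replacing identifying details with neutral placeholders while leaving the narrative intact. The Python sketch below is purely illustrative: the names, term lists, and placeholder labels are invented for this example, and the Stanford lab's actual system (which would need far more robust language processing than simple pattern matching) is not detailed in this report.

```python
import re

# Invented, illustrative term lists -- NOT the Stanford lab's actual data.
# Ordering matters: hair/eye color is redacted before race so that a word
# like "black" in "black hair" is tagged as hair color, not race.
REDACTION_PATTERNS = {
    "[NAME]": re.compile(r"\b(?:John Doe|Jane Roe)\b"),
    "[HAIR COLOR]": re.compile(r"\b(?:blond|brown|black|red)(?= hair\b)", re.IGNORECASE),
    "[EYE COLOR]": re.compile(r"\b(?:blue|brown|green)(?= eyes\b)", re.IGNORECASE),
    "[RACE]": re.compile(r"\b(?:white|black|hispanic|asian)\b", re.IGNORECASE),
    "[NEIGHBORHOOD]": re.compile(r"\b(?:Bayview|Tenderloin|Mission District)\b"),
}

def redact(report: str) -> str:
    """Replace identifying details with placeholders, keeping the
    structure and narrative of the police report readable."""
    for placeholder, pattern in REDACTION_PATTERNS.items():
        report = pattern.sub(placeholder, report)
    return report

print(redact("Officer Jane Roe saw a white male with black hair "
             "and blue eyes near the Mission District."))
```

A real deployment would rely on named-entity recognition rather than fixed word lists, since names, neighborhoods, and descriptions in free-text reports cannot be enumerated in advance.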
Gascon said his goal was to develop a model that could be used elsewhere, and the technology will be offered free to other prosecutors across the country.
The technology relies on humans to collect the initial facts, which can still be influenced by racial bias. Prosecutors will make an initial charging decision based on the redacted police report.
Then they will look at the entire report, with details restored, to see if there are any extenuating reasons to reconsider the first decision, Gascon said.
Experts said they look forward to seeing the results and that they expect the system to be a work in progress.
A 2017 study commissioned by the San Francisco district attorney found "substantial racial and ethnic disparities in criminal justice outcomes." African Americans represented only 6% of the county's population but accounted for 41% of arrests between 2008 and 2014.
The study found "little evidence of overt bias against any one race or ethnic group" among prosecutors who process criminal offenses.
But Gascon said he wanted to find a way to help eliminate an implicit bias that could be triggered by a suspect's race, an ethnic-sounding name or a crime-ridden neighborhood where they were arrested.
Once the program begins, it will be reviewed weekly, said Maria McKee, the district attorney's director of analytics and research.
The move comes after San Francisco last month became the first U.S. city to ban the use of facial recognition by police and other city agencies. The decision reflected a growing backlash against AI technology as cities seek to regulate surveillance by municipal agencies.