What an algorithm says can be enough to get us arrested. But machines also get it wrong. The New York Times describes the case of Robert Julian-Borchak Williams, who was arrested in front of his wife and children last January in Detroit after being identified by a facial recognition algorithm. Except he had nothing to do with the person the police were looking for.

This is the first known case of a person wrongly arrested after being identified by facial recognition software. Williams was accused of stealing nearly $4,000 worth of watches from a store in October 2018. He was taken to an interrogation room and held for about 30 hours before being released without charge.

A clear example of racial discrimination in algorithms

DataWorks Plus, the software used by the Michigan State Police, identified Williams after matching the photo on his driver’s license to footage from a surveillance camera recorded on the night of the crime.

But while the facial recognition system treated this as a possible positive match, as NPR describes, it was clear to the police officers that they were not looking at the same person. “The machine got it wrong,” the officers appear to have acknowledged after personally comparing the images with Williams himself.

“When I look at the photo of the guy, I just see a large Black man. I don’t see any resemblance. I don’t think he looks like me at all. I hope they don’t think all Black men look alike,” explains Williams.

However, despite the software’s clear mistake, Williams was not immediately released; he was held overnight on a $1,000 bond, according to the NYTimes.

The DataWorks Plus software did not identify Williams as the direct culprit; it simply provided information on a possible suspect. According to the police report, “it is only an investigative lead and is not probable cause for arrest.” The process involved an additional error: the image was also shown to a store security guard, who likewise misidentified Williams as the suspect.

Various artificial intelligence experts have pointed out that facial recognition systems have a racial bias problem stemming from a lack of diverse training samples, with Williams’s case being the first known to have direct consequences.

In early June, IBM announced that it was ending the development and sale of facial recognition software. IBM’s CEO sent a letter warning that “it is time to start a national dialogue on whether facial recognition technology should be used by national law enforcement agencies.”

In response, the Wayne County District Attorney’s Office has apologized in a statement, acknowledging that the apology “does not in any way compensate for the hours Mr. Williams spent in jail.” Along the same lines, Williams will be able to have his case file and fingerprints erased from the database.
