
US police used Clearview AI almost 1 million times, company reveals to BBC

Facial recognition firm Clearview has run nearly a million searches for US police, its founder has told the BBC.

by THE GULF TALK

Clearview AI, a facial recognition company that has been repeatedly fined for privacy breaches in Europe and Australia, has reportedly collected more than 30 billion images from platforms such as Facebook without users’ permission. The company’s technology lets law enforcement agencies upload a photo of a face and find matches in a database of billions of images; the resulting links show where the matching images appear online. Clearview AI is considered one of the most powerful and accurate facial recognition companies in the world.
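
The article does not describe Clearview’s internals, but face-search systems of this general kind typically work by converting each face into a fixed-length numerical “embedding” and then ranking stored images by similarity to the probe photo. The Python sketch below illustrates only that shape: the `embed_face` stand-in, the 128-dimension embedding, the example URLs, and the 0.92 similarity threshold are all illustrative assumptions, not details of Clearview’s proprietary system.

```python
import numpy as np

EMBEDDING_DIM = 128      # a common face-embedding size; illustrative only
MATCH_THRESHOLD = 0.92   # hypothetical cosine-similarity cutoff

def embed_face(photo: str) -> np.ndarray:
    """Stand-in for a face-encoding model that maps a photo to a
    fixed-length unit vector. Real commercial models are proprietary;
    this fake is merely deterministic per input within one run."""
    rng = np.random.default_rng(abs(hash(photo)) % (2**32))
    vec = rng.standard_normal(EMBEDDING_DIM)
    return vec / np.linalg.norm(vec)

# A scraped "database": one embedding per image, keyed by its source URL.
database = {
    "https://example.com/profiles/alice.jpg": embed_face("alice"),
    "https://example.com/events/crowd_04.jpg": embed_face("crowd"),
}

def search(probe_photo: str, db: dict) -> list:
    """Rank database images by cosine similarity to the probe face.
    Unit-normalised vectors make the dot product equal the cosine."""
    probe = embed_face(probe_photo)
    scored = [(url, float(vec @ probe)) for url, vec in db.items()]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return [(url, s) for url, s in scored if s >= MATCH_THRESHOLD]

# A probe photo that (in this toy example) depicts the same face as alice.jpg:
for url, score in search("alice", database):
    print(f"{score:.3f}  {url}")
```

At the scale of 30 billion images, the linear scan above would be replaced by an approximate nearest-neighbour index, but the pipeline shape is the same: encode images once at scrape time, encode the probe at query time, and return the source links of the closest vectors.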

Critics argue that the use of Clearview puts everyone into a “perpetual police line-up”: whenever the police have a photo of a suspect, they can compare it against the faces of millions of ordinary people, a practice critics consider far too invasive. Miami Police has confirmed to the BBC that it uses the software for every type of crime, yet there are almost no laws governing police use of facial recognition. Critics are calling for forces that use Clearview to say openly when it is used, and for its accuracy to be tested openly in court. They want the algorithm scrutinized by independent experts and are skeptical of the company’s claims.

Civil rights campaigners are concerned about the company’s almost unrestricted access to people’s images without their consent. Kaitlin Jackson, a criminal defense lawyer based in New York, is campaigning against the police’s use of facial recognition. She says there is no way to know that the technology is accurate when it is used on images “in the wild”, such as screen grabs from CCTV. Clearview often points to research showing a near-100% accuracy rate, but those figures are usually based on mugshots. In practice, Clearview’s accuracy depends on the quality of the image fed into it, a point Hoan Ton-That, Clearview AI’s CEO, accepts.

Because of the lack of data and transparency around police use, the true number of mistaken-identity cases involving facial recognition technology is likely far higher than the few that have come to light. Mr. Ton-That accepts that police have made wrongful arrests using facial recognition technology but attributes those to “poor policing.” While Clearview is banned from selling its services to most US companies, there is an exemption for police. Mr. Ton-That says his software is used by hundreds of police forces across the US.

Police in the US do not routinely reveal whether they use the software, and it is banned in several US cities, including Portland, San Francisco, and Seattle. Police use of facial recognition is often sold to the public as being reserved for serious or violent crimes. However, in a rare interview with law enforcement about the effectiveness of Clearview, Miami Police said that it used the software for every type of crime, from murder to shoplifting.

Assistant Chief of Police Armando Aguilar said his team used the system about 450 times a year and that it had helped solve several murders. However, Mr. Aguilar says that Miami Police treats facial recognition like a tip. “We don’t make an arrest because an algorithm tells us to,” he says. “We either put that name in a photographic line-up or we go about solving the case through traditional means.”

Although there are cases where Clearview has demonstrably worked, such as Andrew Conlyn’s case in Florida, some believe it comes at too high a price. Mr. Conlyn had charges against him dropped after Clearview was used to find a crucial witness. However, critics argue that Clearview AI is a private company making face prints of people from their online photos without their consent. They believe this is a huge problem for civil liberties and civil rights, and that the technology absolutely needs to be banned.
