Facial recognition technology has long been a source of debate over privacy in the digital era. As organizations and governments have adopted the tech for purposes like law enforcement and machine learning, pushback has grown. Many cities have barred their police departments from using facial recognition, and now the city of Los Angeles is, in its own way, following suit.
The LAPD has been accused of misusing software from the highly controversial facial recognition company Clearview to hunt down criminals. According to a report from BuzzFeed News, more than 25 LAPD employees had performed almost 475 searches as of "earlier this year," so officers have clearly gotten a decent amount of use out of the tech.
The issue with Clearview’s software is that it uses images and content scraped from social media websites to build a database of faces that its clients can search. That’s where the controversy comes in: given the choice, most people would likely opt out of this sort of scraping.
Being added, without warning, to a database that can and often will be used by law enforcement is an intimidating prospect. Artificial intelligence can make errors, after all, as was recently demonstrated when an AI-powered sports camera operator mistook a linesman’s bald head for a soccer ball.
Citing its First Amendment rights, Clearview has claimed in the past that it’s fully authorized to perform this sort of scraping, and that social media platforms have no legal grounds to stop it.
Legal or not, though, the LAPD appears to have had a change of heart. Going forward, it will place an indefinite "moratorium" on the use of all commercial facial recognition software.
The police department can still use facial recognition technology; it just has to be in-house. As BuzzFeed News reports, a new policy proposal will permit the LAPD to use a Los Angeles County system that relies on suspect booking photos. That’s still not a perfect situation for privacy proponents, we’re sure, but it’s definitely a step in the right direction.