As security cameras proliferated, so did the widespread assumption that almost everywhere we go, we’re being watched or recorded on video. Now, software that recognizes and identifies faces is turning cameras into cops. And perhaps the wisest assumption today is that we’re not just being watched – we may be identified, too.
In some forms, facial recognition technology is a consumer convenience. For example, owners can unlock newer smartphones just by showing their faces to the screens. Facial recognition features on video doorbells and home security cameras can reduce false alarms by distinguishing friendly faces from unknown ones. Travelers can breeze through airport security checkpoints by submitting to facial recognition screening, rather than handing over a driver’s license or passport.
Consent is central to the concerns of the ACLU and other privacy advocates worried about the downside of facial recognition technology. When police try to match photos of unidentified suspects with photos of identified individuals in a database, they do so without the consent of the suspects. But at airport security checkpoints, airline passengers can opt out of facial recognition screening and show an ID card instead. “It is very important for people to know that is voluntary,” said Nathan Freed Wessler, deputy director of the Speech, Privacy, and Technology Project at the American Civil Liberties Union (ACLU). “You can absolutely opt out at the TSA checkpoints with facial recognition and have them look at your ID the way they have always done. That’s what I do.”
The Transportation Security Administration says it disposes of passenger photos that agents take during facial recognition screening. But countless companies with facial recognition features on their security cameras are far less transparent than the TSA and have raised privacy concerns. “We just don’t know how many private companies are deploying this technology on top of their store surveillance cameras, because in most states in the country, there’s no requirement that they publicly disclose it, and no requirement of consent,” Wessler said. “Those kinds of legal requirements can make a big difference.”
Even a humble corner store can expose the public to the downside of technology designed to identify people from their faces alone. In December 2023, the Rite Aid drugstore company agreed to stop using facial recognition technology in store surveillance for five years to settle Federal Trade Commission charges that it had misused the technology. The FTC charged Rite Aid with failing to take reasonable measures to prevent harm to consumers whom store employees, relying on facial recognition technology, misidentified as shoplifters or troublemakers. “Employees, acting on false positives, followed consumers around its stores, searched them, ordered them to leave, called the police to confront or remove consumers, and publicly accused them, sometimes in front of friends or family, of shoplifting or other wrongdoing,” according to an FTC summary of its complaint against Rite Aid.
“Rite Aid is not the only company to do this, but they’re the ones who got in trouble first,” said Wessler of the ACLU. “They had widely deployed facial recognition cameras across their stores, without notifying people, purportedly to try to identify suspected shoplifters and other people they wanted to bar from their premises.”
The bad news is that abuses like Rite Aid’s may be more common than anyone realizes. “It only came to light because of excellent reporting by Reuters journalists,” Wessler said.
Resistance to facial recognition technology also extends to the TSA’s use of second-generation Credential Authentication Technology (CAT-2) scanners for airport security screening. The facial recognition technology helps TSA agents ensure the person standing at the checkpoint is the same person pictured on the driver’s license or passport they present. Studies show that the technology outperforms human agents at confirming that travelers’ faces match the photos on the licenses and passports they present.
Fearful of intrusions on personal privacy, some federal lawmakers want to ban facial recognition technology at airport checkpoints. In November 2023, a group of legislators led by Republican Sen. John Kennedy and Democratic Sen. Jeff Merkley introduced the Traveler Privacy Protection Act, which would require the TSA to stop using facial recognition technology and to get congressional approval before resuming it. The legislation was proposed as an amendment to a reauthorization bill for the Federal Aviation Administration, but in May, the Senate passed the bill without voting on the proposed amendment. “It became abundantly clear that the end goal for TSA is to make facial recognition mandatory for all American air travelers and that the current opt-out system will end,” Merkley said in a May 10 statement. “I will keep working to safeguard Americans’ right to privacy and protect against the dangers of a national surveillance state.”
A surveillance state briefly emerged this year in Lakeland, Florida, before the threat of litigation ended it. In March, a local government agency called the Lakeland Downtown Development Authority (DDA) arranged for the installation of 14 security cameras with facial recognition software at downtown properties, with permission from the property owners, according to the Lakeland Ledger. The local newspaper also reported that the software would alert police if the cameras spotted “people of interest,” including individuals who had threatened DDA staff. But when the ACLU threatened to sue over the facial recognition software, the DDA deactivated it just two weeks after the cameras were installed. In a joint statement on March 29, the ACLU and its Florida arm denounced the Lakeland camera network as “invasive biometric surveillance.”
More than 50 cameras in downtown West Palm Beach will capture video of their surroundings in the next phase of the city’s Mobility Intelligence Project, which is intended to help improve the pedestrian experience and the flow of vehicular traffic. But the anonymized video will not include identifiable images, such as faces and license plates.
“Recording and distributing live video in public spaces gives me (and others) some pause,” Jason Hallstrom, the director of the Institute for Sensing and Embedded Network Systems Engineering (I-Sense) and professor in the College of Engineering and Computer Science at Florida Atlantic University, said in an email exchange. He is also deputy director and chief research officer of the Engineering Research Center for Smart Streetscapes, the consortium of universities behind the West Palm Beach program. “However, the program … does not record live video. It records a background image (e.g., an empty street) with 2D avatars recorded in place of the objects detected by a camera (e.g., people, vehicles). People, vehicles and other privacy-sensitive objects are not blurred or obfuscated – they’re never recorded at all.”
Public resistance to large-scale surveillance and facial recognition has kept such systems from spreading in the United States, said Wessler of the ACLU. “This technology is being used in many countries around the world — in China, Russia, Israel and elsewhere — so it could be turned on. I think the risk of massive public backlash is all that’s keeping police from doing it right now,” he said. “By and large across the United States, that’s been a red line that basically no jurisdiction has crossed at scale. That’s because of the extraordinary privacy concerns with suddenly being able to instantaneously identify everyone walking around, and track where we go.”
The ACLU sees a bigger problem with the everyday application of facial recognition technology to police investigations. Law enforcement agencies work with Manhattan-based Clearview AI, which has a photo database of more than 30 billion faces, and with other companies to identify unknown suspects, often from security-camera images of them. “That’s the service Clearview provides, and there are other companies that also provide this technology to police, although all the other companies require police to bring their own matching database. That’s usually either mug shots or state driver’s license photos,” Wessler said. “We are deeply concerned with that use [of facial recognition technology] because of the demonstrated track record of this technology spitting out false matches, leading to wrongful arrests.”
Broward Sheriff’s Office detectives don’t rely solely on facial recognition technology to arrest subjects in investigations, Carey Codd, BSO senior public information officer, said in an email. “The results of facial recognition searches serve as investigative leads used to assist with identifying a subject in an investigation,” he said. “Arrests are never made solely based on information gathered from these searches.”
Beyond law enforcement, facial recognition technology is increasingly pervasive in everyday civilian life. Consider the corporate customers of Seattle-based Vouched, which sells its identity verification service to about 250 companies in economic sectors ranging from financial services and health care to automotive goods and hospitality. Vouched verifies people’s identities online by asking them to electronically submit an image of their driver’s license along with a photo of themselves. The company then verifies not only that the face in the “selfie” matches the face on the license, but also that the information on the license and its design are legitimate. Facial recognition is “about 10 percent of what we do,” said John Baird, founder, chairman and chief strategy officer of Vouched, which also looks for possible signs of criminal intent.
“Every single day, we help verify people for more than $40 million in loans, on average, entirely online,” Baird said. Before loan money is delivered to online applicants, they undergo the Vouched screening process. “Let’s say your driver’s license says you live in Fort Lauderdale. We can tell, while you’re applying for that loan, if you are in Fort Lauderdale, as you claim. If it looks like you’re actually in Venezuela, we would tell the bank that there’s something sketchy about this individual.” In turn, the lender may request a face-to-face meeting with the loan applicant before lending the money, he said.
The public encounters a similar Vouched screening service, including facial recognition, at check-in points such as hospital admission offices and hotel front desks. Indeed, Baird said facial recognition technology will outlive its reputation as a frightening threat to privacy. “It’s just who we are as humans that whenever something new is created, our initial reaction is, we’re a little scared of it. And it is not unique to facial recognition,” he said. “Over time, of course, we adapt to new technologies, and they make our lives better. And I think facial recognition is part of that evolution, where it actually does much better than things like passwords alone, helps protect us and provides a greater layer of security.”