The future of airport security is getting more high-tech as the TSA continues testing a system that scans your face during the screening process.
The pilot program with facial recognition technology is running at several airports nationwide.
An agency official said the technology is only used at podiums manned by a TSA worker to match a passenger's identification information “with a photo of their physical presence at that station.” The agency believes this will enhance security and cut wait times.
If the technology thinks you’re someone else, a TSA official said a security worker would manually check your ID.
“To me, [it] just invites further law enforcement scrutiny, invites further contact with authorities in ways that are not going to be helpful,” said Vincent Southerland, an assistant professor of clinical law at NYU.
Southerland helps run the Center on Race, Inequality, and the Law at NYU. He believes an incorrect match could be problematic for minority travelers.
“[It would] force the person to kind of have to prove or demonstrate their identity in weird ways that they otherwise would not have, and that is going to of course lead to conflict,” said Southerland. “And conflict often does not end very well for folks who have been traditionally marginalized, oppressed and targeted by law enforcement.”
It’s also happened before with facial recognition technology.
Take the case of Robert Williams, a Black man who the ACLU says was wrongfully arrested by Detroit police in 2020. The organization said facial recognition software incorrectly identified him as a shoplifting suspect. The ACLU says the charges were eventually dropped.
In some cases, this software compares your image to a single other photo; in others, it searches an entire database of photos.
So what happens when travelers don’t look exactly like their ID?
A 2019 federal government study by the National Institute of Standards and Technology (NIST) echoes some of those concerns. It found that Asian and African Americans were up to 100 times more likely to be misidentified than white people, depending on the algorithm and the search.
“The bigger finding from the 2019 study was that the false positive rates where somebody else could use your passport or somebody else could access your phone vary much, much more widely,” said Patrick Grother with NIST.
Grother is one of the computer scientists who worked on this study. He said they evaluated hundreds of algorithms and reviewed the rate of false negative and positive matches.
“One of the mistakes they’ll make is a false negative mistake, where it doesn’t associate you as you, two photos of you, and it doesn’t put them together,” said Grother. “The other mistake is where it takes two photos of me and somebody else and says it’s the same person. So that’s a false positive error.”
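The two error types Grother describes can be sketched in a few lines of code. This is a minimal, purely illustrative example with made-up similarity scores and an arbitrary threshold, not any real TSA or NIST matching algorithm:

```python
# A face matcher compares two photos and emits a similarity score;
# a threshold turns that score into a match/no-match decision.
# Each tuple: (hypothetical similarity score, whether both photos show the same person)
comparisons = [
    (0.91, True),   # same person, high score     -> correct match
    (0.42, True),   # same person, low score      -> false negative
    (0.18, False),  # different people, low score -> correct non-match
    (0.88, False),  # different people, high score-> false positive
]

THRESHOLD = 0.60  # arbitrary cutoff chosen for this sketch

# False negative: the system fails to "put together" two photos of the same person.
false_negatives = sum(1 for score, same in comparisons
                      if same and score < THRESHOLD)
# False positive: the system says two different people are the same person.
false_positives = sum(1 for score, same in comparisons
                      if not same and score >= THRESHOLD)

genuine_pairs = sum(1 for _, same in comparisons if same)
impostor_pairs = sum(1 for _, same in comparisons if not same)

print(f"false negative rate: {false_negatives / genuine_pairs:.2f}")
print(f"false positive rate: {false_positives / impostor_pairs:.2f}")
```

The 2019 NIST study measured how both rates varied by demographic group: a higher false positive rate for one group means someone else's photo is more likely to be accepted as that person, which is the disparity Grother flags as the bigger finding.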
Grother said these results went directly to the developers.
“We continue to track the technology and whether it’s improving with respect to accuracy and with respect to these demographic effects,” said Grother.
TSA declined our request for an on-camera interview about these racial disparities within facial recognition technology, so the Washington News Bureau took those questions directly to the White House.
“The TSA continuously evaluates and improves upon technology and process to better protect the traveling public and ensure that screening is conducted in a manner that respects the dignity of each individual. That is clearly a priority,” said White House Press Secretary Karine Jean-Pierre. “DHS is working with NIST to [assess] the performance of face recognition technology and reduce demographic differentials, particularly as it relates to race and gender.”
A TSA official said the agency is also working within the NIST framework and following a methodology that is based “in scientific rigor to test a standard that accurately identifies passengers across race, gender, and ethnic differences.”
But Southerland still has concerns.
“By rolling these types of tools out when they have these problems that are baked into them, you’re almost normalizing the types of harms that we anticipate seeing,” he said.
Southerland said the price of these problems is high, especially if an incorrect match leads to a criminal accusation.
“You’re going to be separated from your family, separated from your loved ones, lose opportunities for housing, employment, mental health and regular health care,” said Southerland. “All the range of consequences that flow from criminal justice involvement are almost compounded by having separate technologies or tools in the hands of law enforcement.”
The TSA has not finalized when facial recognition will roll out nationwide. The agency says the new system is performing well and has found no major differences across gender, race and skin tone.
The TSA said any incorrect matches will be reported back to the agency for review.
In early February, Sens. Edward J. Markey (D-Mass.), Jeff Merkley (D-Ore.), Cory Booker (D-N.J.), Elizabeth Warren (D-Mass.) and Bernie Sanders (I-Vt.) raised concerns about this TSA system as well. The group sent a letter to TSA Administrator David Pekoske urging the agency to stop its deployment of this facial recognition technology.
The International Biometrics + Identity Association, the leading international trade group representing the identification technology industry, weighed in on those comments. In a written statement, the association spokesperson said “for the top-performing algorithms from IBIA members, including those used by our government, demographic differentials were extremely small.”
TSA also sent the Washington News Bureau the following statement about the technology:
TSA is exploring the use of one-to-one and one-to-few facial identification to automate identity verification at airport checkpoints and modernize the screening experience for passengers. Biometric technology has the potential to enhance security effectiveness, improve operational efficiency, and yield a more streamlined passenger experience at the TSA checkpoint. TSA recognizes that biometric solutions must be highly usable for all passengers and operators, considering the diversity of the traveling public.
These pilots are entirely voluntary. While we are informed the preliminary results are encouraging, TSA continues to monitor these pilots to ensure there is no inherent bias in the technology.
©2023 Cox Media Group