Summary

Total Articles Found: 2

Top Articles:

  • Skip the Surveillance By Opting Out of Face Recognition At Airports
  • When Facial Recognition Is Used to Identify Defendants, They Have a Right to Obtain Information About the Algorithms Used on Them, EFF Tells Court

Skip the Surveillance By Opting Out of Face Recognition At Airports

Published: 2019-04-25 04:38:56

Popularity: 6724

Author: Jason Kelley

Keywords:

  • Commentary
  • Biometrics

Government agencies and airlines have ignored years of warnings from privacy groups and Senators that using face recognition technology on travelers would massively violate their privacy. Now the passengers are in revolt as well, and they're demanding answers. Last week, a lengthy exchange on Twitter between a traveler who was concerned about her privacy and a spokesperson for the airline JetBlue went viral, and many of the questions asked by the traveler and others were the same ones we've posed to Customs and Border Protection (CBP) officials: Where did you get my data? How is it protected? Which airports will use this? Where in the airports will it be used? Most importantly, how do I opt out?

How to Opt Out

These questions should be simple to answer, but we haven't gotten simple answers. When we asked CBP for more information, they told us: "visit our website." We did, and we still have many of the same questions. Representatives for airlines, which partner directly with the government agencies, also seem unable to answer the concerns, as the JetBlue spokesperson made evident. Both the agencies and the airlines seemed to expect no pushback from passengers when they implemented this boarding-pass-replacing panopticon. The convenience would win out, they seemed to assume, not expecting people to mind having their face scanned "the same way you unlock your phone." But now that "your face is your boarding pass" (as JetBlue awkwardly puts it), at least in some airports, the invasive nature of the system is much clearer, and travelers are understandably upset.

It might sound trite, but right now, the key to opting out of face recognition is to be vigilant. There's no single box you can check, and importantly, it may not be possible for non-U.S. persons to opt out of face recognition entirely. For those who can opt out, you'll need to spot the surveillance when it's happening. To start, TSA PreCheck, Clear, and other ways of "skipping the line" often require biometric identification and are often used as test cases for these sorts of programs. Once you're at the airport, be on the lookout for any time a TSA, CBP, or airline employee asks you to look into a device, or for a kiosk or signage announcing face recognition. That means your biometric data is probably about to be scanned.

At the moment, face recognition is most likely to happen at specific airports, including Atlanta, Chicago, Seattle, San Francisco, Las Vegas, Los Angeles, Washington (Dulles and Reagan), Boston, Fort Lauderdale, Houston Hobby, Dallas/Fort Worth, JFK, Miami, San Jose, Orlando, and Detroit; while flying on Delta, JetBlue, Lufthansa, British Airways, and American Airlines; and in particular, on international flights. But that doesn't mean other airlines and airports won't implement it sooner rather than later.

To skip the surveillance, CBP says you "should notify a CBP Officer or an airline or airport representative in order to seek an alternative means of verifying [your] identity and documents." Do the same when you encounter this with an airline. While there should be signage near the face recognition area, it may not be clear.
If you're concerned about creating a slight delay for yourself or other passengers, take note: though CBP has claimed a 98% accuracy rate in its pilot programs, the Office of the Inspector General could not verify those numbers, and even a 2% error rate would cause thousands of people to be misidentified every day. Most face recognition technology has significantly lower accuracy rates than that, so you might actually be speeding things up by skipping the surveillance.
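
To see the scale that 2% figure implies, here is a back-of-the-envelope check in Python. The daily passenger volumes below are illustrative assumptions, not CBP figures; for reference, TSA screening volume is on the order of two million travelers per day.

    # Rough scale check: assumed (hypothetical) passenger volumes, not CBP data.
    CLAIMED_ACCURACY = 0.98  # CBP's claimed match rate from its pilot programs

    for passengers_per_day in (100_000, 500_000, 2_000_000):
        misidentified = passengers_per_day * (1 - CLAIMED_ACCURACY)
        print(f"{passengers_per_day:>9,} passengers/day -> "
              f"~{misidentified:,.0f} misidentified/day")

Even the smallest assumed volume yields about 2,000 misidentifications a day; at TSA-scale volumes, it is roughly 40,000.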
The Long And Winding Biometric Pathway

Part of the reason for the confusion about how to opt out is that there are actually (at least) three different face recognition checkpoints looming: airlines want to use your face as your boarding pass, saying "it's about convenience"; CBP, which is part of the Department of Homeland Security (DHS), wants to use your face to check against DHS and State Department databases when you're entering or exiting the country; and the TSA wants to compare your face against your photo identification throughout the airport.

And if people are upset now, they will be furious to know this is just the beginning of the "biometric pathway" program: CBP and TSA want to use face recognition and other biometric data to track everyone from check-in, through security, into airport lounges, and onto flights (PDF). They're moving fast, too, despite (or perhaps because of) the fact that there are no regulations on this sort of technology: DHS is hoping to use facial recognition on 97 percent of departing air passengers within the next four years, and on 100 percent of all international passengers in the top 20 U.S. airports by 2021.

If the government agencies get their way, new biometric data could be taken from travelers, and used against them, wherever they are in the airport—and much of that collection will be implemented by private companies (even rental car companies are getting in on the action). CBP will store that facial recognition data for two weeks for U.S. citizens and lawful permanent residents, and for 75+ years for non-U.S. persons. In addition, the biometric data collected by at least some of these systems in the future—which can include your fingerprints, the image of your face, and the scan of your iris—will be stored in FBI and DHS databases and will be searched again and again for immigration, law enforcement, and intelligence checks, including checks against latent prints associated with unsolved crimes.

Passengers Will Bear the Burden of Privacy Invasion, Not Airlines or Government Agencies

It's easy for companies and agencies to tout the convenience of this sort of massive data collection and sharing scheme. But as we've seen in notable privacy fiascos over the last few years—from Facebook's Cambridge Analytica scandal, to the breaches of the Office of Personnel Management and Equifax in the U.S., to the constant hacking of India's national biometric database, Aadhaar—it's the customers and passengers who will bear the burden when things go wrong, and they will go wrong. These vast biometric databases will create huge security and privacy risks, with the additional concern that a company leaking your passwords or credit card numbers is nothing compared to it leaking your biometric data. While you can change a password, you can't easily change your face.

Additionally, these systems are notoriously inaccurate and contain out-of-date information. Because immigrants and people of color are disproportionately represented in criminal and immigration databases, and because face recognition systems are less capable of identifying people of color, women, and young people, the weight of these inaccuracies will fall disproportionately on them. It will be the passengers who bear the burden when they are stuck watching flights they paid for take off without them because there was an error with a database or an algorithm, or because the non-biometric options they preferred weren't in place.

It's time for the government agencies and the airlines to pause these programs until they can clearly and adequately provide:

  • Photographs of the signage in situ in the airports in question, as well as any additional information about the opt-out process.
  • An explanation of the locations where CBP will provide meaningful and clear opt-out notice to travelers (for example, at entry points, points of sale, ticket counters, security checkpoints, and boarding gates), as well as the specific language travelers can use to opt out of the biometric data collection program.
  • An up-to-date list of all the airports and airlines that currently participate in the biometric exit program.
  • Information about the algorithm CBP is using to compare photos (provided by NEC), as well as the accuracy information associated with that algorithm.
  • Technological specifications for transferring data from the point of collection to DHS and to vendors and airlines.

Additional questions—like how data is safeguarded—are laid out in our letter to CBP. Congress must also demand answers to these questions. And lawmakers must require agencies and airlines to pause this program until they can not only ensure that the biometric privacy of travelers is protected but, more importantly, justify this huge invasion of privacy. Just last month, three Senators released a joint statement calling on DHS to pause the program until there can be "a rulemaking to establish privacy and security rules of the road," but so far, they've been ignored.

Trading privacy for convenience is a bad bargain, and it can feel like the deal isn't always one we have a choice in. DHS has said that the only way we can ensure our biometric data isn't collected when we travel is to "refrain from traveling." That's ridiculous. The time to regulate and restrict the use of facial recognition technology is now, before it becomes embedded in our everyday lives. We must keep fighting to make sure that in the future, it gets easier, not harder, to defend our privacy—biometric or otherwise.


When Facial Recognition Is Used to Identify Defendants, They Have a Right to Obtain Information About the Algorithms Used on Them, EFF Tells Court

Published: 2019-03-12 16:22:40

Popularity: 37

Author: Karen Gullo

Keywords:

  • Privacy
  • Biometrics

We urged the Florida Supreme Court yesterday to review a closely watched lawsuit to clarify the due process rights of defendants identified by facial recognition algorithms used by law enforcement. Specifically, we told the court that when facial recognition is secretly used on people later charged with a crime, those people have a right to obtain information about how the error-prone technology functions and whether it produced other matches.

EFF, the ACLU, Georgetown Law's Center on Privacy & Technology, and the Innocence Project filed an amicus brief in support of the defendant's petition for review in Willie Allen Lynch v. State of Florida. Prosecutors in the case didn't disclose information about how the algorithm worked, that it produced other matches that were never considered, or why Lynch's photo was targeted as the best match. This information qualifies as "Brady" material—evidence that might exonerate the defendant—and should have been turned over to Lynch.

We have written extensively about how facial recognition systems are prone to error and produce false positives, especially when the algorithms are used on African Americans, like the defendant in this case. Researchers at the FBI, MIT, and ProPublica have reported that facial recognition algorithms misidentify black people, young people, and women at higher rates than white people, the elderly, and men.

Facial recognition is increasingly being used by law enforcement agencies around the country to identify suspects. It's unfathomable that technology that could help put someone in prison is used mostly without question or oversight. In Lynch's case, facial recognition could help send him to prison for eight years.

Undercover police photographed Lynch using an older-model cell phone, at an oblique angle, while he was in motion. The photo, which is blurred in places, was run through a facial recognition algorithm to see whether it matched any images in a database of county booking photos. The program returned a list of four possible matches, the first of which was Lynch's photo from a previous arrest. His photo was the only one sent on to prosecutors, along with his criminal records.

The algorithm used on Lynch is part of the Face Analysis Comparison Examination System (FACES), a program operated by the Pinellas County Sheriff's Office and made available to law enforcement agencies throughout the state. The system can search over 33 million faces from driver's licenses and police photos. It doesn't produce "yes" or "no" responses to matches; it rates candidates as more or less likely matches. Error rates in systems like this can be significant, and the condition of Lynch's photo only exacerbates the possibility of errors.
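
Because the "ranked list, no yes/no answer" design is central to the dispute, here is a minimal sketch of how a generic embedding-based matcher returns its top candidates. Everything in it (the cosine-similarity scoring, the gallery names, the 128-dimensional random embeddings) is a hypothetical illustration; it is not the FACES algorithm, whose internals the sheriff's office says it cannot speak to.

    # Generic illustration of ranked face matching: the matcher returns the
    # k most similar gallery entries with scores, never a yes/no answer.
    # All names, dimensions, and embeddings are hypothetical stand-ins.
    import numpy as np

    def top_k_matches(probe, gallery, k=4):
        """Rank gallery faces by cosine similarity to the probe embedding."""
        def cosine(a, b):
            return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))
        scores = {name: cosine(probe, emb) for name, emb in gallery.items()}
        return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)[:k]

    rng = np.random.default_rng(0)
    gallery = {f"booking_photo_{i}": rng.normal(size=128) for i in range(1_000)}
    probe = rng.normal(size=128)  # stand-in for a blurry surveillance photo

    for name, score in top_k_matches(probe, gallery):
        print(f"{name}: similarity {score:.3f}")  # every result is a "maybe"

The point of the sketch: the top-ranked entry is simply the candidate with the highest similarity score, and all of the candidates may be wrong. That is why the other matches a system produces are exactly the kind of alternate leads a defendant would want to examine.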
FACES is poorly regulated and shrouded in secrecy. The sheriff said that his office doesn't audit the system, and there's no written policy governing its use. The sheriff's office said it hadn't been able to validate the system and "cannot speak to the algorithms and the process by which a match is made."

Lynch didn't learn that he had been identified by a facial recognition algorithm until just days before his final pretrial hearing, although prosecutors had known for months. Before that, prosecutors had never disclosed information about the algorithm to Lynch, including that it produced other possible matches. Neither the crime analyst who operated the system nor the detective who accepted the analyst's conclusion that Lynch's face was a match knew how the algorithm functioned. The analyst said the first-listed photo in the search results is not necessarily the best match—it could be one further down the list. An Assistant State Attorney doubted the system was reliable enough to meet the standards courts use to assess the credibility of scientific testimony and decide whether it should be admitted at trial. Lynch asked for the other matches produced by FACES; the court refused.

If a human witness who identified Lynch in a lineup had said others in the lineup also looked like the criminal, the state would have had to disclose that information, and Lynch could have investigated those alternate leads. The same principle should have required the state to disclose the other people the algorithm produced as matches, and information about how the algorithm functions, EFF and the ACLU told the Florida Supreme Court.

When defendants are facing lengthy prison sentences or even the death penalty, tight controls on the use of facial recognition are crucial. Defendants have a due process right to information about the algorithms used and the search results. The Florida Supreme Court should accept this case for review and provide guidance to law enforcement agencies that use facial recognition to arrest, charge, and deprive people of their liberty.

Related Cases: FBI Facial Recognition Documents

