Summary

Total Articles Found: 2

Top sources:

  • eff.org

Top Keywords:

  • Privacy
  • Biometrics

Top Authors:

  • Karen Gullo
Top Articles:

  • When Facial Recognition Is Used to Identify Defendants, They Have a Right to Obtain Information About the Algorithms Used on Them, EFF Tells Court
  • Media Alert: EFF Argues Against Forced Unlocking of Phone in Indiana Supreme Court

Media Alert: EFF Argues Against Forced Unlocking of Phone in Indiana Supreme Court

Published: 2019-04-16 18:17:15

Popularity: 32

Author: Karen Gullo


Justices to Consider Fifth Amendment Right Against Self-Incrimination

Wabash, IN - At 10 a.m. on Thursday, April 18, the Electronic Frontier Foundation (EFF) will argue to the Indiana Supreme Court that police cannot force a criminal suspect to turn over a passcode or otherwise decrypt her cell phone. The case is Katelin Seo v. State of Indiana.

The Fifth Amendment of the Constitution states that people cannot be forced to incriminate themselves, and it’s well settled that this privilege against self-incrimination covers compelled “testimonial” communications, including physical acts. However, courts have split over how to apply the Fifth Amendment to compelled decryption of encrypted devices.

Along with the ACLU, EFF responded to an open invitation from the Indiana Supreme Court to file an amicus brief in this important case. In Thursday’s hearing, EFF Senior Staff Attorney Andrew Crocker will explain that the forced unlocking of a device requires someone to disclose “the contents of his own mind.” That is analogous to written or oral testimony, and is therefore protected under the U.S. Constitution.

Thursday’s hearing is in Indiana’s Wabash County to give the public an opportunity to observe the work of the court. Over 750 students are scheduled to attend the argument. It will also be live-streamed.

WHAT: Hearing in Katelin Seo v. State of Indiana

WHO: EFF Senior Staff Attorney Andrew Crocker

WHEN: April 18, 10 a.m.

WHERE: Ford Theater, Honeywell Center, 275 W. Market Street, Wabash, Indiana 46992

For more information on attending the argument in Wabash:
https://www.in.gov/judiciary/supreme/2572.htm

For more on this case:
https://www.eff.org/deeplinks/2019/02/highest-court-indiana-set-decide-if-you-can-be-forced-unlock-your-phone

Contact: Andrew Crocker, Senior Staff Attorney, andrew@eff.org


When Facial Recognition Is Used to Identify Defendants, They Have a Right to Obtain Information About the Algorithms Used on Them, EFF Tells Court

Published: 2019-03-12 16:22:40

Popularity: 37

Author: Karen Gullo

Keywords:

  • Privacy
  • Biometrics
We urged the Florida Supreme Court yesterday to review a closely watched lawsuit to clarify the due process rights of defendants identified by facial recognition algorithms used by law enforcement. Specifically, we told the court that when facial recognition is secretly used on people later charged with a crime, those people have a right to obtain information about how the error-prone technology functions and whether it produced other matches.

EFF, ACLU, Georgetown Law’s Center on Privacy & Technology, and the Innocence Project filed an amicus brief in support of the defendant’s petition for review in Willie Allen Lynch v. State of Florida. Prosecutors in the case didn’t disclose information about how the algorithm worked, that it produced other matches that were never considered, or why Lynch’s photo was targeted as the best match. This information qualifies as “Brady” material (evidence that might exonerate the defendant) and should have been turned over to Lynch.

We have written extensively about how facial recognition systems are prone to error and produce false positives, especially when the algorithms are used on African Americans, like the defendant in this case. Researchers at the FBI, MIT, and ProPublica have reported that facial recognition algorithms misidentify black people, young people, and women at higher rates than white people, the elderly, and men.

Facial recognition is increasingly being used by law enforcement agencies around the country to identify suspects. It’s unfathomable that technology that could help to put someone in prison is used mostly without question or oversight. In Lynch’s case, facial recognition could help to send him to prison for eight years.

Undercover police photographed Lynch using an older-model cell phone at an oblique angle while he was in motion. The photo, which is blurred in places, was run through a facial recognition algorithm to see whether it matched any images in a database of county booking photos. The program returned a list of four possible matches, the first of which was Lynch’s from a previous arrest. His photo was the only one sent on to prosecutors, along with his criminal records.

The algorithm used on Lynch is part of the Face Analysis Comparison Examination Systems (FACES), a program operated by the Pinellas County Sheriff’s Office and made available to law enforcement agencies throughout the state. The system can search over 33 million faces from drivers’ licenses and police photos. It doesn’t produce “yes” or “no” responses; it rates candidates as more or less likely matches. Error rates in systems like this can be significant, and the condition of Lynch’s photo only exacerbates the possibility of errors.

FACES is poorly regulated and shrouded in secrecy. The sheriff said that his office doesn’t audit the system, and there’s no written policy governing its use. The sheriff’s office said it hadn’t been able to validate the system and “cannot speak to the algorithms and the process by which a match is made.”

Lynch did not learn that he had been identified by a facial recognition algorithm until just days before his final pretrial hearing, although prosecutors had known for months. Prior to that, prosecutors had never disclosed information about the algorithm to Lynch, including that it produced other possible matches. Neither the crime analyst who operated the system nor the detective who accepted the analyst’s conclusion that Lynch’s face was a match knew how the algorithm functioned. The analyst said the first-listed photo in the search results is not necessarily the best match; it could be one further down the list. An Assistant State Attorney doubted the system was reliable enough to meet the standards courts use to assess the credibility of scientific testimony and whether it should be admitted at trial.

Lynch asked for the other matches produced by FACES; the court refused. If a human witness who identified Lynch in a line-up had said that others in the line-up also looked like the criminal, the state would have had to disclose that information, and Lynch could have investigated those alternate leads. The same principle should have required the state to disclose the other people the algorithm produced as matches, along with information about how the algorithm functions, EFF and ACLU told the Florida Supreme Court.

When defendants are facing lengthy prison sentences or even the death penalty, tight controls on the use of facial recognition are crucial. Defendants have a due process right to information about the algorithms used and the search results. The Florida Supreme Court should accept this case for review and provide guidance to law enforcement agencies that use facial recognition to arrest, charge, and deprive people of their liberty.

Related Cases: FBI Facial Recognition Documents
