Internet Policy Observatory affiliate Christian Möller discusses the May 13th open letter to Google signed by eighty scholars. The letter asks Google for more transparency on how it processes ‘right to be forgotten’ (RTBF) requests.
A year earlier, in May 2014, the European Court of Justice (ECJ) ruled that Google and other search engines function as ‘controllers’ of personal information as laid out in the European Data Protection Directive (DPD). The responsibilities of those controllers include an obligation to keep data that identifies individuals for no longer than is necessary for the purposes for which the data were collected or further processed.
Search engines, according to the ECJ ruling, must remove links to outdated or irrelevant personal information from search results upon request. The Court found that individuals have a right to control their private data and that they have the right to request that information be ‘forgotten’ when the results show links to information that is no longer accurate or relevant. It also established that Google’s search engine results are fully subject to European data privacy law.
In an initial reaction after the ruling, Google called the judgment ‘disappointing,’ saying that it ‘went too far’. Following the ECJ ruling, however, the company complied with the Court’s decision and set up a form to submit removal requests. On the day of the form’s release, Google reportedly received more than 12,000 removal requests. To date, Google says that it has received more than 250,000 requests to delist links and has evaluated more than 930,000 URLs for removal. According to a Google transparency report, 41 percent of those URLs have been removed and 59 percent have not. That amounts to roughly 380,000 URLs removed from Google search results because of RTBF requests.
At the same time, search engines are still struggling with the question of how to react to the Court ruling. Google has established an Advisory Council to help handle requests from Europeans claiming their ‘right to be forgotten’ and has published a transparency report and answers to frequently asked questions on the topic.
The Advisory Council published a report in January 2015 after having held seven meetings in Europe during 2014. Recordings of the meetings are available online. In its report, the Council recommends that the search engine be “as transparent as it is possible within the legal limits and the protection of the data subjects’ privacy, e.g. through anonymized and aggregated statistics and references to adopted policies. […] Search engines should also be transparent with the public about the process and criteria used to evaluate delisting requests.”
Some experts also suggest that Google is responsible for providing a detailed explanation of its decisions, and that it is important that Google make its guidelines on the kinds of requests likely to be honored publicly available, so individuals can weigh the benefits of submitting a RTBF request.
Google also contributed to the Article 29 Working Party, which has an advisory status in the implementation of the European DPD, saying that the company was aware of the “tough debates” that lie ahead and that it was important to have those debates openly and respectfully.
In light of these activities, signatories of the May 13th open letter criticize Google for making available only anecdotal evidence of the removal process, and ask what sort of information typically gets delisted and what sort typically does not, in what proportions and in what countries.
“The vast majority of these decisions face no public scrutiny, though they shape public discourse. What’s more, the values at work in this process will inform information policy around the world. A fact-free debate about the RTBF is in no one’s interest,” the letter reads. It continues:
“Google is not the only search engine, but no other private entity or Data Protection Authority has processed anywhere near the same number of requests. Google has by far the best data on the kinds of requests being made, the most developed guidelines for handling them, and the most say in balancing informational privacy with access in search. We address this letter to Google, but the request goes out to all search engines subject to the ruling.”
The undersigned have a range of views about the merits of the ruling (some think it rightfully vindicates individual data protection and privacy interests; others think it unduly burdens freedom of expression and information retrieval). They write:
“We all believe that implementation of the ruling should be much more transparent for at least two reasons: (1) the public should be able to find out how digital platforms exercise their tremendous power over readily accessible information; and (2) implementation of the ruling will affect the future of the RTBF in Europe and elsewhere, and will more generally inform global efforts to accommodate privacy rights with other interests in data flows.”
The letter ends with thirteen points the signatories ask of Google, including the categories of RTBF requests that are presumptively accepted and presumptively not accepted, a breakdown of overall requests by category, reasons for denial of delisting, and the source (e.g., professional media, social media, official public records) of material for delisted URLs, by percentage and nation of origin.
The open letter initiative by Ellen Goodman of Rutgers University School of Law and Julia Powles of the University of Cambridge Faculty of Law has been supported by The Guardian.
Read the complete letter at https://medium.com/@ellgood/open-letter-to-google-from-80-internet-scholars-release-rtbf-compliance-data-cbfc6d59f1bd and follow the discussion on Twitter under #RTBF and #RTBFdata.
A purported list of all sites delisted from Google’s European search results is available at Hidden from Google.
Disclosure: The author, IPO Affiliate and ASC Fall 2014 Visiting Scholar Christian Moeller, is one of the signatories of the letter.
Christian Möller, M.A., (@infsocblog) is a lecturer at the University of Applied Sciences in Kiel, Germany. His main areas of work span internet governance, international media regulation and human rights, as well as social media and journalism in the digital age. He also regularly serves as a consultant (theinformationsociety.org) and conference speaker for various national and international corporations and institutions, e.g. the Organization for Security and Co-operation in Europe (OSCE), the Council of Europe, the OSCE Project Co-ordinator in Ukraine, and the Chamber of Commerce in Kiel, Germany.