Remembering the Right to Be Forgotten
On May 13, 2014, the Court of Justice of the European Union (ECJ), the European Union's (EU) highest court, confirmed that existing EU law already includes, albeit in limited form, something that privacy advocates and the European Commission have long sought: a digital right to be forgotten. Under the recent ECJ ruling, EU citizens may now contact Google and effectively demand the removal of certain links from Google's search results that they believe violate their privacy. The judgment expressly recognizes, for the first time, that search engine operators are subject to EU data privacy laws, even if their data processing is done on servers outside Europe. In addition, the judgment acknowledges an existing right under EU law for individuals to request the erasure of personal information that does not comply with data privacy requirements. Under this reasoning, the ECJ ruled that, in certain circumstances, search engine operators are required to remove links to web pages containing information about an individual, even if the information is lawfully published.
The far-reaching judgment creates significant issues not only for search engine operators, but also for international businesses that maintain data on individuals and have EU operations—whether or not their personal data processing occurs in the EU. More fundamentally, the ruling and a broader right to be forgotten raise important questions regarding how the Internet and personal data should be regulated. The ECJ decision is fundamentally opposed to the traditional U.S. approach to privacy, which places a far greater premium on freedom of speech and the “right to know.” As such, the ruling threatens to further divide the two largest trading partners in the world over data protection at a time when significant uncertainty remains over the future of cross-border data flows. As search engine operators take steps to comply with the ruling, the decision also could push the Internet further toward fragmentation, potentially stifling Internet innovation and affecting the broader information economy.
The Right to be Forgotten: A Short History
In a speech at the January 2012 Digital-Life-Design (DLD) innovation conference in Munich, Viviane Reding, Vice President of the European Commission, outlined the Commission's proposal to overhaul the EU's 1995 Data Protection Directive and, in doing so, create a sweeping new privacy right: the “right to be forgotten.” The intellectual roots of the right to be forgotten can be traced back to French law, which recognizes le droit à l'oubli, or “the right to oblivion.” Under this right, a convicted criminal who has served his or her time and has been rehabilitated may object to the publication of the facts of his or her conviction and incarceration. The right to be forgotten represents a modern and broader version of this concept that addresses an urgent problem in the digital age: how does one escape his or her past on the Internet, where personal information potentially lives forever in the cloud?
Although the EU has taken the lead on an Internet right to be forgotten, countries around the world, including even the United States, are beginning to tackle the issue of who controls personal content once it is posted to the Internet and the degree to which individuals can control their online reputations. European and American approaches to this issue, however, are diametrically opposed. Where the EU historically prioritizes individual privacy, even over the disclosure of factually accurate information, the United States places a far greater premium on First Amendment protections of freedom of speech.
To illustrate, in Europe, the online posting of information is considered to be processing of “data” that is owned by an individual data subject. The EU's Data Protection Directive imposes a host of requirements on such processing. Under a regulatory framework where an entity needs a purpose to gather personal information and may use it only for the duration of that purpose, it is not hard to imagine a requirement that particular information must be deleted under certain circumstances, including, for example, when the data is no longer necessary for the original purpose.
In contrast, in the United States, any information posted online is considered speech, including compiled information from a search engine. Any effort to delete such information other than by the original poster implicates the speech of search engines, and the First Amendment strongly protects such speech from any limitation. In addition, the Communications Decency Act and its safe harbor immunize Internet service providers from liability for content posted by third parties.
Not surprisingly then, the EU's 2012 announcement of a proposed right to be forgotten provoked a significant backlash in the United States. This is not to say, however, that there is no support in the United States for a person's right to access his or her personal information that is held by an online entity. Shortly after the European Commission proposed a draft Data Protection Regulation that included a right to be forgotten, the White House issued its “Consumer Privacy Bill of Rights,” which aimed to give consumers increased access to and control over their online personal information. In addition, legislation was introduced in the House of Representatives that would provide for deletion of personal information from applications on mobile devices. Recently, California adopted a new law that gives minors the right to erase posts they have made to online sites such as Facebook and Twitter.
U.S. citizens also are requesting deletion of online content, and all signs point to this increasing. Google's Annual Transparency Report reveals that requests from the United States for removal of content for privacy-related reasons actually outnumber those of the average EU country. There also have been a number of lawsuits brought against Google by individuals seeking removal or alteration of information in search results. Still, for the reasons set forth above, there is little recourse under U.S. law for an individual seeking to challenge results posted by a search engine. This is no longer true for EU citizens after the ECJ's landmark ruling.
Google v. AEPD: The Facts
In 2010, a Spanish national, Mario Costeja González, lodged a complaint against Google Inc., Google Spain, and La Vanguardia Ediciones SL, the publisher of a daily newspaper, with the Agencia Española de Protección de Datos (AEPD), the Spanish Data Protection Agency. The complaint concerned information about Mr. Costeja González, including his name, that had been published in the Spanish newspaper. The information related to a real estate auction from 1998 held to pay off some of his outstanding debts. In his complaint, Mr. Costeja González requested that the newspaper be ordered to remove or alter the pages in question (so that the personal data relating to him no longer appeared) or to use certain tools made available by search engines to protect the data. He also requested that Google be required to remove the link to the pages from its index so that it would not be returned as a search result. Mr. Costeja González did not dispute the accuracy of the information published—he merely argued that the information concerned an issue that was resolved and now irrelevant.
The AEPD rejected the complaint against La Vanguardia, taking the view that the information in question had been lawfully published. It upheld, however, the complaint against Google Spain and Google Inc. and ordered that the links in question be removed from their search results. The AEPD reasoned that as a “data processor” under the 1995 Data Protection Directive (Directive), Google had a greater obligation to protect an individual's right to privacy. Google appealed this decision before the Spanish High Court, which referred a series of questions to the ECJ for a preliminary ruling, including: (1) whether the activities of Google Inc. and Google Spain brought the search engine within the territorial scope of the Directive; (2) whether the activity of the search engine in collecting, caching, indexing, and retrieving data constituted “processing” under the Directive, for which the search engine would be the data controller; and (3) whether an individual could invoke rights under the Directive to seek erasure of personal data or object to the processing of personal data in order to have it removed.
The ECJ's Ruling: Google Must Forget
The ECJ issued its judgment on May 13, 2014, finding that, under the 1995 Directive: (1) processing personal data “in the context of the activities of an establishment” of a data controller on the territory of an EU member state is subject to EU jurisdiction; (2) the activity of a search engine constitutes processing of personal data, and the operator of a search engine is a “controller” of that data; and (3) individuals have a right to be forgotten. The ECJ's ruling is in stark contrast to the non-binding Opinion of the ECJ's Advocate General, which is meant to advise the ECJ on new points of law. The Advocate General's Opinion sided with Google, finding that search engines are not responsible for personal information appearing on web pages they process and that “the Directive does not provide for a general right to be forgotten in the sense that a data subject is entitled to restrict or terminate dissemination of personal data that he considers to be harmful or contrary to his interests.”
Scope of the Data Protection Directive
The ECJ first considered whether Google's activities fell within the territorial scope of the 1995 Data Protection Directive. For the Directive to apply, Article 4(1)(a) requires that “the processing [of personal data] is carried out in the context of the activities of an establishment of the controller.” Google argued that because the search engine is operated by Google Inc., which is established in the U.S., it does not fall under the territorial scope of the Directive. Google further argued that because Google Spain is a subsidiary company whose operations are limited to selling advertising space, it is not directly involved in the processing of personal data which occurs in connection with the operation of the Google search engine.
The ECJ rejected these arguments and held that Google Spain constituted an “establishment” for the purposes of Article 4(1)(a). The court reasoned that the search engine operations of Google Inc. and advertising operations of Google Spain were inextricably linked because Google Spain's activities “constitute the means of rendering the search engine at issue economically profitable” and because “that [search] engine is, at the same time, the means enabling [Google Spain's advertising] activities to be performed.” Based on the ECJ's ruling, where a search engine operator located outside the EU has subsidiaries in one or more EU member states, those subsidiaries promote and sell advertising space offered by that search engine, and the search engine directs its activities towards the inhabitants of those Member States, then that search engine is “established” in those EU member states for purposes of Article 4(1)(a).
Search Engines as “Data Controllers”
The Court next considered whether Google was a “data controller” under the Directive. Article 2(b) of the Directive defines “processing of personal data” to include “collection, recording, organization, storage, adaptation or alteration, retrieval, consultation, use, disclosure by transmission, dissemination or otherwise making available, alignment or combination, blocking, erasure or destruction” of personal data. Where the Directive covers “personal data processing,” the legal or natural person who determines the purposes and means of such processing is the “data controller.”
Google argued that it did not determine the purposes and means of the processing of personal data, and, as such, it was not a data controller for purposes of the Directive. The ECJ disagreed, emphasizing that a broad definition must be given to “data controller” to ensure appropriate protection for data subjects. According to the ECJ, the actions of search engines in collecting information from the Internet, storing and indexing that information, and displaying that information in search results constitute “processing” of personal data under the Directive. The court further held that the processing carried out by Google was a separate and additional activity to the processing carried out by the original publisher. The ECJ found that the search engine operator is a “data controller” within the meaning of the Directive with respect to such processing.
In making this determination, the ECJ looked not only to what Google actually did in terms of its search engine operations, but also to the impact of the search engine on linking individuals to results. Paragraph 38 of the ECJ's ruling explains as follows:
“[A]s the activity of a search engine is therefore liable to affect significantly, and additionally compared with that of the publishers of websites, the fundamental rights of privacy and to the protection of personal data, the operator of the search engine as the person determining the purposes and means of that activity must ensure, within the framework of its responsibilities, powers and capabilities, that the activity meets the requirements of Directive 95/46 in order that the guarantees laid down by the directive may have full effect and that effective and complete protection of data subjects, in particular of their right to privacy, may actually be achieved.”
The Right to Be Forgotten
Having found that Google acted as a data controller for purposes of the Directive, the ECJ determined that Google had a duty to comply with the Directive and any laws that implement it in Member States. This includes Articles 12(b) and 14 of the Directive, which, according to the ECJ, provide data subjects with a right to be forgotten.
Under Article 12(b) of the Directive, a data subject may obtain the rectification, erasure, or blocking of data that does not comply with provisions of the Directive, including Article 6. Article 6 requires, in relevant part, that data be relevant, accurate, kept up to date, and retained for no longer than is necessary.
Article 14, in turn, provides data subjects the right to object “on compelling legitimate grounds” to the processing of their personal data. According to the ECJ, such requests require a balancing exercise between the Article 7 legitimate interests of Internet users to access data and the data subject's privacy rights. In conducting that balancing test, the ECJ made the sweeping statement that the rights of data subjects override the interests of the public in having access to information, except in certain limited circumstances.
To further support a “right to be forgotten,” the ECJ makes frequent reference to Articles 7 and 8 of the EU Charter of Fundamental Rights, which address the right to respect for private life and the protection of personal data. What is strikingly absent from the ruling, however, is any reference to Article 11 of the Charter, which addresses freedom of expression.
What Happens Now?
The ECJ ruling is final and cannot be appealed. Google already has announced a basic framework to comply with the ruling. It remains to be seen, however, how Google will determine which links violate an individual's privacy and which links should remain available to the public. While the ECJ indicated that some balancing of individual privacy and the rights of Internet users is appropriate, it offered little practical guidance on how or when to strike that balance. The Article 29 Data Protection Working Party, a group of representatives from the data protection authority in each EU member state, will meet in the coming weeks to discuss how the ruling will be enforced and how the right to be forgotten should be implemented.
In the meantime, Google has moved forward with its initial plan to comply with the ECJ's ruling. On May 30, 2014, Google posted a new online form that enables EU citizens to request the removal of links from search results that include that individual's name. Individuals using the online form must provide photo identification as well as an explanation of why the search results are irrelevant, outdated, or otherwise inappropriate. As of June 3, 2014, Google reported that it had received 41,000 requests to remove links—roughly seven a minute.
Google has stated that it will manually review each submission, and its decision on whether or not to remove a link will be based on whether the information was perceived to be out of date or if links to people's past activities were of public interest. Where Google agrees to remove a link, it will do so only within the 28 member states of the EU plus Norway, Iceland, Switzerland, and Liechtenstein. As such, the removal would take effect in the EU member country's domain, but not, for example, in the United States. This could result in different levels of quality for search results depending on where the search is conducted and, potentially, a Balkanization of search results. Where an individual's request to remove a link is denied, that individual would then have the right to take his or her objection to the national data protection authority.
Google also has announced the creation of an advisory panel of privacy experts, regulators, academics, and company executives to analyze and offer recommendations on the right to be forgotten and its operations going forward. Whatever the ultimate form of compliance looks like, the ECJ decision creates significant difficulties for the global presentation of online information by international companies.