Google Spain v AEPD and Mario Costeja González

Submitted: 27 February 2012
Decided: 13 May 2014
Full case name: Google Spain SL, Google Inc. v Agencia Española de Protección de Datos (AEPD), Mario Costeja González
Case number: C-131/12
ECLI: ECLI:EU:C:2014:317
Case type: Reference for a preliminary ruling
Chamber: Grand Chamber
Nationality of parties: Spanish
Procedural history: Reference from the Audiencia Nacional (Spain)
Ruling: An Internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties. An individual may request that hyperlinks be removed from the search engine's index.
Legislation interpreted: Directive 95/46/EC
Keywords:
  • Personal data - Protection of individuals with regard to the processing of such data
  • Directive 95/46/EC - Articles 2, 4, 12 and 14 - Material and territorial scope
  • Internet search engines - Processing of data contained on websites - Searching for, indexing and storage of such data - Responsibility of the operator of the search engine - Establishment on the territory of a Member State - Extent of that operator's obligations and of the data subject's rights
  • Charter of Fundamental Rights of the European Union - Articles 7 and 8

Google Spain SL, Google Inc. v Agencia Española de Protección de Datos, Mario Costeja González (2014) is a decision by the Court of Justice of the European Union (CJEU). It held that an Internet search engine operator is responsible for the processing that it carries out of personal information which appears on web pages published by third parties.[1][2][3][4]

The outcome of the ruling is that an Internet search engine must consider requests from individuals to remove links to freely accessible web pages resulting from a search on their name. Grounds for removal include cases where the search result(s) "appear to be inadequate, irrelevant or no longer relevant or excessive in the light of the time that had elapsed."[5]:92 If the search engine rejects the request, the individual may ask relevant authorities to consider the case. Under certain conditions, the search engine may be ordered to remove the links from search results.

The decision confirms a so-called right to be forgotten, mooted in the proposed General Data Protection Regulation (ultimately adopted in 2016 and applicable from May 2018), although the Court did not explicitly grant such a right, relying instead on the data subject's rights deriving from Article 7 (respect for private and family life) and Article 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union.[6]

Facts

In 1998 the Spanish newspaper La Vanguardia published two announcements in its printed edition regarding the forced sale of properties arising from social security debts. The announcements were published on the order of the Spanish Ministry of Labour and Social Affairs and their purpose was to attract as many bidders as possible. A version of the edition was later made available on the web.[7][8]

One of the properties described in the newspaper announcements belonged to Mario Costeja González, who was named in the announcements. In November 2009, Costeja contacted the newspaper to complain that when his name was entered in the Google search engine it led to the announcements. He asked that the data relating to him be removed, arguing that the forced sale had been concluded years before and was no longer relevant. The newspaper replied that erasing his data was not appropriate since the publication had been on the order of the Spanish Ministry of Labour and Social Affairs.[9][10][11]

Costeja then contacted Google Spain in February 2010, asking that the links to the announcements be removed. Google Spain forwarded the request to Google Inc., whose registered office is in California, United States, taking the view that the latter was the body responsible. Costeja subsequently lodged a complaint with the Spanish Data Protection Agency (Agencia Española de Protección de Datos, AEPD) asking both that the newspaper be required to remove the data and that Google Spain or Google Inc. be required to remove the links to the data. On 30 July 2010, the Director of the AEPD rejected the complaint against the newspaper but upheld the complaint against Google Spain and Google Inc., calling on them to remove the links complained of and make access to the data impossible.[12]

Google Spain and Google Inc. subsequently brought separate actions against the decision before the Audiencia Nacional (National High Court of Spain), arguing that:[12]

  1. Google Inc. did not fall within the scope of EU Directive 95/46/EC (the Data Protection Directive), and its subsidiary Google Spain was not responsible for the search engine;
  2. there was no processing of personal data within the search function;
  3. even if there were processing, neither Google Inc. nor Google Spain could be regarded as a data controller; and
  4. in any event, the data subject (Costeja) did not have the right to the erasure of lawfully published material.

The Audiencia Nacional joined the actions and stayed the proceedings pending a preliminary ruling from the CJEU on a number of questions regarding the interpretation of the Data Protection Directive. These questions fell into three groups. In essence they concerned:[8][9][11][12]

  1. the territorial scope of the Directive
  2. the legal position of an Internet search engine service provider under the Directive, especially in terms of the Directive's material scope and as to whether the search engine could be regarded as a data controller
  3. whether the Directive establishes a so-called right to be forgotten

All of these questions were new to the Court, and they also raised important points of fundamental rights protection. Because new points of law were involved, the Court sought the Opinion of an Advocate General.[12]

Procedure

Written proceedings were followed by an oral hearing on 26 February 2013, at which, in addition to the parties, the governments of Austria, Greece, Italy, Spain and Poland and the European Commission presented their views. Advocate General Niilo Jääskinen delivered his Opinion on 25 June 2013, and judgment was given on 13 May 2014.

Advocate General's Opinion

The purpose of an Advocate General's Opinion is to advise the Court on new points of law. It is not binding on the Court. In this case the Advocate General was Niilo Jääskinen from Finland.[12][13]

Advocate General Jääskinen made frequent reference in his Opinion to the fact that the Data Protection Directive predates the Google era (it dates from 1995 and was eventually superseded by the General Data Protection Regulation). On the first set of questions, the Advocate General found that Google's business model brought Google Inc. and Google Spain within the scope of the Directive.[14]:64–68 On the second set of questions, concerning the material scope of the Directive, he held that Google could not be regarded as a data controller: Google's search activities involve the processing of personal data, but Google does not thereby become a data controller for the content of the material when the processing is carried out in a haphazard, indiscriminate and random manner. In the Advocate General's view, the sense of the Directive is that "the controller is aware of the existence of a certain defined category of information amounting to personal data and the controller processes this data with some intention which relates to their processing as personal data".[14]:76–83

In the event that the Court did not agree with his finding that Google is not a data controller, the Advocate General considered the third set of questions relating to a right to be forgotten. He held that the rights of freedom of information and expression took precedence over any such right to erasure, and urged the Court not to allow case-by-case resolution of such conflicts as that would likely lead to the "automatic withdrawal of links to any objected contents or to an unmanageable number of requests handled by the most popular and important Internet search engine service providers."[14]:133

Judgment

The Court of Justice of the European Union ruled that an Internet search engine operator is responsible for the processing that it carries out of personal data which appear on web pages published by third parties, upholding a right of erasure.[2][8][15]

The Court considered the material scope of the Directive first. It rejected Google's submission, supported by the Advocate General, that Google could not be regarded as a data controller within the scope of the Data Protection Directive, adopting a literal interpretation of Article 2(b) of the Directive, which sets out definitions, and relying on Lindqvist.[5]:21–41[16]:25

Regarding the territorial scope of the Directive, the Court observed that Google Spain is a subsidiary of Google Inc. on Spanish territory and, therefore, an 'establishment' within the meaning of the directive.[5]:48–49 The Court rejected Google Inc.'s argument that it was not carrying out its data processing in Spain, holding that the promotion and selling of advertising space by its subsidiary Google Spain was sufficient to constitute processing within the meaning of the Directive.[5]:50–57 To have ruled otherwise would have been to undermine the effectiveness of the Directive and the fundamental rights and freedoms of individuals that the Directive seeks to ensure.[5]:58 The Court thus endorsed the Advocate General's view that Google Inc. and Google Spain should be treated as a single economic unit.[14]:66–67

Concerning the obligations and duties of the operator of a search engine, the Court held that in the present case Article 7(f) of the Directive, relating to the legitimacy of processing, requires a balancing of the opposing rights and interests of the data subject (González) and the data controller (Google), taking into account the data subject's rights deriving from Articles 7 (respect for private and family life) and 8 (protection of personal data) of the Charter of Fundamental Rights of the European Union.[5]:73–74 Article 14(a) of the Directive, relating to the data subject's rights, allows the data subject, at least in the cases covered by Articles 7(e) and 7(f), to object at any time, on compelling legitimate grounds relating to his particular situation, to the processing of data relating to him, save where otherwise provided by national legislation.[5]:76 Article 12(b) of the Directive, relating to the data subject's right of access to the data, allows the data subject to request erasure of the data. Such a request may be made directly to the controller, who must then duly examine its merits. If the request is not granted, the data subject may bring the matter before the supervisory authority or the judicial authority, which carries out the necessary checks and orders the controller to take specific measures accordingly.[5]:77

Regarding the question relating to the so-called right to be forgotten, the Court noted that Google Spain, Google Inc., the Greek, Austrian and Polish Governments and the Commission considered that this question should be answered in the negative.[5]:90 The Court held, however, that the processing of data which is "inadequate, irrelevant or excessive" (i.e. not merely inaccurate) might also be incompatible with the Directive.[5]:92 In such cases, where the data is incompatible with the provisions of Article 6(1)(c) to (e) of the Directive, relating to data quality, the information and links in the list of results must be erased.[5]:94 It is not necessary that the information be prejudicial to the data subject.[5]:96

Significance

The ruling balances the right to privacy and data protection in European law against the legitimate interest of the public in accessing such information; it does not mandate that information be removed instantly upon request, and it distinguishes between public figures and private persons. The Court stressed that Internet search engines profile individuals in modern society in a ubiquitous manner, in a way that could formerly not have been achieved save with the greatest difficulty.[5]:80 The data subject's rights therefore override, "as a rule, not only the economic interest of the operator of the search engine but also the interest of the general public in finding that information upon a search relating to the data subject's name", but that would not be the case if the role played by the data subject in public life were such "that the interference with his fundamental rights is justified by the preponderant interest of the general public in having, on account of inclusion in the list of results, access to the information in question".[5]:97

Google subsequently published an online form which can be used by EU citizens or EFTA nationals to request the removal of links from its search results if the data linked is "inadequate, irrelevant or no longer relevant, or excessive in relation to the purposes for which they were processed".[17][18] On 31 May 2014, the first day of the service, Google received over 12,000 requests from people asking the company to remove certain links about them from its search results.[19]

On 19 June 2015, Google announced it would remove links to nonconsensual pornography ("revenge porn") on request. Commentators noted that this was not the same as implementing a "right to be forgotten", as the company already had policies in place for dealing with sensitive personal data such as social security numbers and credit card numbers.[20] The consumer advocacy group Consumer Watchdog subsequently called on Google to extend the right to be forgotten to U.S. users, filing a complaint with the Federal Trade Commission.[21]

Commentary

  • In a Guardian piece, Julia Powles remarked that the ruling provided an essential platform for public debate as the European Commission considered reform of the Directive through the proposed General Data Protection Regulation.[8][22]
  • Guy Vassall-Adams of Matrix Chambers characterized the judgment as profoundly harmful to the operation of the Internet and a betrayal of Europe's legacy in protecting freedom of expression:[9]

    The court’s reductionist approach is to require that all published information must have a specific public interest justification. This approach is profoundly erroneous and stems in large part from failing to keep in mind the private/public distinction. Most of what is published on the internet has no specific public interest justification and there is no specific public interest which could relate to most pieces of biographical information about an individual. Facebook is an extremely valuable resource for freedom of expression and information sharing, but most of the “personal data” published there be it banal or wacky would not avail itself of any specific public interest defence. The point is that it shouldn’t have to; there is an inherent value in the free circulation of information and ideas which the court has completely overlooked.

    — Guy Vassall-Adams, Matrix Chambers
  • The German law professor Niko Härting summarized worries that the decision undervalues the importance of freedom of information and communication and that such a removal system invites abuse:[23]

    “Privacy by default” will encourage politicians, celebrities and other public figures to put their lawyers on track when they find inconvenient information online. And as the use of a search engine like Google is essential for finding information, the elimination from the results of search engines will provide a convenient and essential tool to suppress information.

  • A piece on TheUndisciplined.com argued that Google's classification as a "data controller" could be appropriate with regard to its collection and use of aggregated and personal data for advertising and other commercial purposes, but that the information presented by an automated service in the form of search-engine results is not as easily characterized as "controlled" or "processed" data:[24]

    [...]the problems which really result from the court’s ruling stem from the designation of Google and other search engines as “data controllers”, as this is setting the threshold for “processing” or “controlling” of information rather low. Without any active “processing” of the information, beyond simply how it interacts with the automatic systems behind the search engine, it is hard to see how companies such as Google should be expected to exercise quality control over such information. It is also very important to keep in mind that Google’s formula for generating search results is a totally different topic from Google Adwords and Google’s processing of actual personal data for advertising reasons. That sort of data processing is only used for targeted ads and the like, and it is not this “data controlling” about which we talk when we discuss search results from Google’s search engine. Popular dissatisfaction with companies like Google who collect and benefit from the use of personal data and preferences might have lead some to confuse Google’s data gathering for advertising practices with the workings of the algorithms used to automatically generate search results.

  • The Harvard Law Review examined criticism that the Court got it wrong, arguing that such critics overlook the Court's reasonable interpretation of the Directive's text and the privacy values it embodies, and that their failure to engage fully with those values, or to acknowledge that the measure enjoys widespread support across the EU, undermines their contribution to the policy debate.[25]

Implementation

On 26 November 2014, the EU's Article 29 Data Protection Working Party issued its guidelines on how the ruling should be implemented.[26][27]

On 5 February 2015, Google published the report of its advisory committee on how the ruling should be implemented. The committee included Luciano Floridi, Professor of Philosophy and Ethics of Information at the University of Oxford; Frank La Rue, who served as UN Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression from August 2008 to August 2014; and Jimmy Wales, a founder of Wikipedia. The committee recommended that only national links, not global links, should be removed, and that publishers should be notified of removals and have the right to challenge them.[28][29]

On 7 February 2015, The Times reported that the chairwoman of the Article 29 Working Party, Isabelle Falque-Pierrotin, had warned Google that it faced legal action if it failed to heed the working party's warnings over the way it was policing the ruling. The working party wanted Google to stop notifying publishers and to remove links globally.[30]

References
  1.
  2.
  3.
  4.
  5. Case C-131/12 Google Spain SL and Google Inc. v Agencia Española de Protección de Datos (AEPD) and Mario Costeja González, ECLI:EU:C:2014:317, Judgment of the Court (Grand Chamber) of 13 May 2014 (cited by paragraph number).
  6.
  7. The announcements in question, published in La Vanguardia (in Catalan), accessed 16 May 2014. Translation: "Property auctions... the two undivided halves of a house at 8 Montseny St., owned by Mario Costeja González and Alicia Vargas Cots..."
  8.
  9.
  10.
  11.
  12.
  13.
  14. Opinion of Advocate General Jääskinen in Case C-131/12, delivered 25 June 2013 (cited by paragraph number).
  15.
  16.
  17. "Google sets up 'right to be forgotten' form after EU ruling", BBC News, 30 May 2014. Retrieved 30 May 2014.
  18.
  19.
  20.
  21.
  22.
  23.
  24.
  25.
  26.
  27.
  28.
  29.
  30.
