How far does “the right to be forgotten” extend?

United Kingdom

This article was produced by Olswang LLP, which joined with CMS on 1 May 2017.

Perhaps the most critical question for Google's lawyers, as they receive a deluge of new take-down/blocking requests, will be when the data processing complained of is unlawful under the EU data protection regime and when Google has the requisite knowledge of that unlawful processing.

The processing of the data by Google search (not necessarily the original publisher) must be contrary to the data protection principles established by the DPD, or more accurately the legislation that implements the DPD in the relevant member state of the European Union. So, for example, a UK citizen would need to demonstrate a breach of the Data Protection Act 1998 by reference to the data protection principles set out in schedule 1 of the Act.

Each EU Member State has implemented its own data protection laws based on the DPD.

There are significant variations, and each Member State has developed its own body of case law and regulatory decisions as to what constitutes unlawful data processing. But all national implementing legislation and case law will now need to be read in light of the guidance provided by the CJEU in the Google Spain decision.

But what exactly is that threshold for the level of sensitivity of the personal data before its processing by a search engine is unlawful? Many observers, particularly those steeped in First Amendment law in the United States, would be forgiven for thinking there is barely any threshold at all. But a closer look at the CJEU's reasoning is required.

The key phrase in the judgment is at paragraph 92, which provides the rather limited guidance (largely parroting the DPD) that the data will be unlawfully processed when it is:

  • "inadequate, irrelevant or excessive in relation to the purposes of the processing";
  • "not kept up to date"; or
  • "kept for longer than is necessary unless they are required to be kept for historical, statistical or scientific purposes".

For a landmark judgment, it is hard to imagine a more opaque set of guiding principles. In the context of search engines, a whole barrage of questions emerges:

1. When is data irrelevant and by what standard is relevance judged? Is this some kind of public interest test or can it be relevant to a small group of people? Presumably, as per the judgment, relevance fades over time but how long does it take for historical records to become irrelevant?

When is the data "no longer" relevant? So far we have one data point: 16 years for a repossession notice is too old. But how can we apply these vague principles to other situations?

2. What does 'excessive' mean? Does it mean that ten Google search results containing the same personal data should be treated differently from a single search result? If data was true at the time of publication (for example, that an individual has a serious illness) does it become 'out of date' when the facts change (for example, the illness is cured)? Does its position in the Google search rankings matter? It is interesting that the CJEU pointed out the impact that a search result could have on an individual's privacy, noting that it can often give the information much more prominence than if it were merely left to the third-party website on which it is published (see below). One may take from this that its position in the search rankings would matter, but the court has not quite said as much.

3. When is data required for statistical purposes? Google does not exist for statistical purposes; it simply indexes data and statistics stored on other websites, so would this exception even apply to Google? And how will such statistics be found if not through search engines?

We could go on and on.

Google's ability to collate data is also key. The CJEU stated at para 80:

"It must be pointed out at the outset that … processing of personal data, such as that at issue in the main proceedings, carried out by the operator of a search engine is liable to affect significantly the fundamental rights to privacy and to the protection of personal data when the search by means of that engine is carried out on the basis of an individual's name, since that processing enables any internet user to obtain through the list of results a structured overview of the information relating to that individual that can be found on the internet - information which potentially concerns a vast number of aspects of his private life and which, without the search engine, could not have been interconnected or could have been only with great difficulty - and thereby to establish a more or less detailed profile of him. Furthermore, the effect of the interference with those rights of the data subject is heightened on account of the important role played by the internet and search engines in modern society, which render the information contained in such a list of results ubiquitous."

Whilst unhelpful to Google, this passage will at least be seized upon by other website operators and platforms (and Google's other services) to argue that the decision applies only to search engines and their unique indexing qualities and coverage across the web. But we will have to wait and see just how narrowly it will be interpreted and whether websites with search functions will also be caught by these broad principles.

A more fundamental question being asked is whether it is right at all that Google and other search engines should have to take the role of judge and jury in determining the answers to these questions.

In this respect, it is important to consider how the threshold for 'unlawful data processing' sits alongside the safe-harbour 'mere conduit', 'caching' and 'hosting' defences provided for by Articles 12, 13, and 14 of the E-Commerce Directive.

In the UK case of Metropolitan Schools v Google, Google was classified as a 'mere conduit' in relation to its search results, even after being notified of unlawful (libellous) content being returned, and so was not liable for it. This decision was not far removed from the protection that would be afforded to Google in the US by virtue of section 230 of the Communications Decency Act 1996.

However, following Google Spain, and regardless of that decision treating Google as an intermediary search engine, Google can now be liable as a data controller - effectively as a primary publisher.

What we can be sure of is that individuals will be looking at their Google take-down requests based on grounds of defamation, malicious falsehood, copyright, passing off, breach of confidence, and breach of privacy to see whether they can be recast as claims for unlawful data processing. And that will be happening in 28 European jurisdictions.