Poor decision: Europe's internet gagging order

The quickest leap to censorship is through the erasure of information from search engines, says Paul Stanley QC
In 1998, a Spanish newspaper published an announcement about an auction of some land to settle social security debts. The ad mentioned Mr Costeja González and, with the rest of the paper’s content, it appeared online. This, along with most of the internet, was indexed by Google.
González recently sought orders from the Spanish authorities requiring Google to ensure that its search results would not return links to the articles in question. The case reached the Court of Justice of the European Union (CJEU), whose decision concerned the interpretation of the Data Protection Directive, 95/46/EC. There were three issues, but one aspect of the ruling in particular is proving controversial.
The first question was whether a search engine operator is a ‘data controller’ engaged in ‘data processing’. Google argued that its activity did not involve processing, as it did not select the data it stored but simply summarised and indexed material that others had put online.
It is not surprising that this argument was rejected. Nothing in the directive makes its operation dependent on the exercise of any sort of independent judgment on the part of the processor. And given that a website can clearly be a source of personal data, the activity of organising that data and making it accessible seems self-evidently to be a species of processing, with Google acting as ‘data controller’.
Google wanted, of course, to suggest that it had no active role in this, that it was the mere servant of those who produced content on the internet (and could, by various technical means, prevent it being indexed if they chose to do so).
It is no surprise that the CJEU rejected this suggestion, given the central, and profitable, role played by Google in relation to internet publication as a whole.
The real surprise is that Advocate General Jääskinen had been willing to adopt Google’s narrow construction.
International organisation
The second issue was whether the directive applied to Google, given that it is an international organisation.
The CJEU had little difficulty in concluding that Google maintained, through a Spanish subsidiary, an ‘establishment’ in Spain. This view, which had been shared by the Advocate General and was supported by a wide array of member states, and by the European Commission, is not surprising either.
This left the third issue, which is of great importance: how far does the directive impose any obligation on a search engine service to implement a ‘right to be forgotten’?
The question is especially serious where, as was the case here, the original material that had been indexed was lawfully published and accurate, and was to remain online.
Nevertheless, the CJEU held that in appropriate circumstances an individual could apply to the competent authorities for an order effecting the erasure of information by a search engine, in the interests of the data subject’s right to control personal information.
The most disappointing thing about this aspect of the judgment is that the court identified the key problem but did not grapple with it.
Publishing information on the internet may well be protected by journalistic freedom of expression. If so, on what basis is the indexing of that information not also protected? The CJEU seemed to think that there was no expressive act involved. But that is wrong: disseminating information in accessible form (including accurate information about what others have accurately reported or recorded) is just the kind of act that freedom of expression protects.
As anyone who has used a large library knows, a book without its index entry is as good as lost. In the enormous library that is the internet, there is no need to burn books. The shortest cut to effective censorship is not enrolment on an index librorum prohibitorum, but erasure from search engines.
There is no question, then, that in terms of its reasoning this is a poor decision.
One might speculate that Google brought the problem on itself, by downplaying its role in an effort to avoid classification as a data controller.
If so, it is a great pity that the pursuit of one bad argument has led to the adoption of another. SJ
Paul Stanley QC practises from Essex Court Chambers