Oxford University Press's
Academic Insights for the Thinking World

The power of the algorithm

Recently Google Inc. was ordered to remove nine search results after the Information Commissioner’s Office (ICO) ruled that they linked to information about a person that was no longer relevant. Almost ten years ago, that individual had committed a minor criminal offence, and he recently asked Google to remove the related search results, in line with the European Court of Justice’s decision in Google Spain. Google removed the links, and the removal itself became a news story. Journalistic content including the individual’s name and details of the original criminal offence then began to feature in search results. It was this second set of search results that the ICO ordered Google to remove, despite Google’s claim that the articles were essential to a recent news story on a matter of significant public importance.

In line with the European Court, the ICO confirmed that links prompted by searching on an individual’s name are ‘data processing’ for the purposes of data protection law, and that a person has a right to request the removal from search results of information about themselves that is no longer relevant. In Google Spain, the European Court gave the right to erase personal information a remarkably broad scope. It covers instances where the personal information was initially made available on the Internet legitimately, or where the information is true, as well as instances where the inclusion of the information in search results causes no prejudice to the person.

One of the most significant contributions of this case was that, to an extent, it introduced a right of informational self-determination: individuals can take an active role in determining the way in which their personal information is presented online. This exceeds the scope of informational privacy, and of the narrowly defined ‘right to be forgotten’ as featured in the proposed Data Protection Regulation.

How far does this new right go? Delisting search results is only one of many functions for which search engines have been called to take responsibility. Their obligation to remove personal information also extends to other services provided in the course of the searching process. One of those services, perhaps the most influential, is the autocomplete function. This is the algorithm that ‘predicts’ the words you would enter in the search bar while you are still typing, and suggests possible word combinations. Although the algorithm obviously makes searches easier and faster, it has become a source of litigation in various jurisdictions in cases where automated suggestions appeared to convey a deceptive or defamatory meaning, or to prompt the search in ‘wrong’ directions.

In Germany, for instance, defamatory autocomplete suggestions must be removed upon request, after a man won a lawsuit against Google over his name being associated with the words ‘scientology’ and ‘fraud’ in Google’s autocomplete function, claiming these were defamatory insinuations. In its decision, issued a year before Google Spain, the Federal Supreme Court found that the autocomplete function creates an expectation in Internet users. Because the algorithm-driven search program incorporates searches previously made by other users, and presents current users with the word combinations that appear most frequently, users expect the suggestions to be informed by real content and hence to carry an informative meaning. This expectation holds even where no search results actually confirm the suggestion.
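The court’s description of how suggestions are generated, past queries by other users, ranked by how frequently they occur, can be sketched in a few lines. The function and query log below are purely illustrative assumptions, not Google’s actual implementation:

```python
from collections import Counter

def autocomplete(query_log, prefix, k=3):
    """Suggest the k most frequent past queries starting with `prefix`.

    A minimal sketch of frequency-based autocompletion: suggestions are
    drawn from what other users previously typed, so a suggestion can
    exist even if no current search result confirms it.
    """
    counts = Counter(q for q in query_log if q.startswith(prefix))
    return [q for q, _ in counts.most_common(k)]

# Hypothetical query log for illustration
log = ["data protection", "data breach", "data protection",
       "database design", "data protection law"]

print(autocomplete(log, "data p"))
# → ['data protection', 'data protection law']
```

The sketch makes the court’s point concrete: the ranking reflects past user behaviour, not the truth or relevance of any underlying content.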

The Google Spain decision expands the scope of requests for the removal of autocomplete suggestions in both a procedural and a substantive way. All major search engine operators have already put in place a ‘notice and take down’ procedure, similar to the one adopted in some jurisdictions for copyright infringement claims, to deal with requests for the removal of search results. Google has recently added, for users browsing from the EU, a separate procedure to request the exclusion of ‘offensive predictions’ made by its autocomplete function.

From a substantive point of view, the Google Spain decision has extended the grounds for requesting removal to instances where autocomplete suggestions do not even convey a defamatory meaning or false factual information. Applications for injunctive relief regarding the misuse of personal information may now succeed. Readers may remember the case of Max Mosley, the former head of Formula One involved in a British tabloid sex scandal in 2008. Mr Mosley’s application for an injunction against News Group Newspapers was refused in 2008 because the relevant material had become so widely accessible that there was no longer a reasonable expectation of privacy in relation to it, and granting the application would have made little practical difference. Google Spain gave Mr Mosley a new opportunity to seek the removal of those photographs from search results (as well as the removal of the results from autocomplete suggestions) on the basis of ‘a viable claim which raises questions of general public interest, which ought to proceed to trial’.

While it is true that censoring search results can be dangerous and is for various reasons unwelcome, there are instances where purely algorithmic operations may violate the rights of individuals. On one side stand arguments in favour of the fundamental freedoms of speech and of uncensored access to information online. On the other side, personality rights on the Internet ought to be preserved, especially where the processing of personal information is disproportionate to the legitimate aim of facilitating the search for information online.

Conflicts of rights in this context shed new light on the seemingly settled issue of search engine liability. Search engines are less and less neutral carriers of information: they may act as publishers, as content providers, and as data controllers. In their role as primary information gatekeepers of the web, they have an enlarged duty to comply with individuals’ requests to participate in the way in which information about themselves is organized and presented.

Featured image: ‘BalticServers data center’ by BalticServers.com (Fleshas). CC BY-SA 3.0 via Wikimedia Commons.
