Europe’s highest court just delivered a ruling that should brighten the day not just of Google, its immediate beneficiary, but of Internet users around the world.
The court said Google does not have to delete results around the world when people in Europe want to hide information about themselves. Instead, Google only needs to remove those listings in the European Union, where the law gives people that right—as long as there’s not a good reason for keeping the information up, such as that person’s fame or the public interest.
As Google had argued, deciding the other way would have triggered a “global race to the bottom,” where any country can demand Google censor results globally because of its local laws.
China, for example, could have stopped people everywhere from seeing information about the Tiananmen Square massacre, even though those people’s own local laws are supposed to protect their right to see information freely.
The EU’s so-called “right to be forgotten” law dates back to 2014, when the Court of Justice of the European Union (CJEU) ruled that a Spaniard named Mario Costeja González could tell Google to remove links to articles about his bankruptcy many years before.
Costeja’s argument was that the information was by that point irrelevant, but it was the first thing people saw if they searched for his name. Based on existing EU data protection law, the court decided that he could have those links delisted. The right to be forgotten has since been more explicitly codified in the EU’s latest privacy law, the General Data Protection Regulation.
But how far should that delisting extend? In 2015, France’s privacy watchdog, CNIL, decided that it should apply everywhere—it told Google that it had to remove some results not just from google.fr but from every version of its service, even the U.S.-focused google.com.
Its reason? Someone in Europe could still visit those versions of Google to find the forbidden results.
Google tried to mollify the regulator by saying it would hide the results from the eyes of visitors that it could tell were located in Europe, but CNIL said this wasn’t enough, as people can use services such as VPNs to obscure their locations. And so the case went up the judicial chain to the CJEU.
On Tuesday, the CJEU followed the advice of its top advisor by ruling in Google’s favor.
Yes, CNIL was right to say that a global delisting would ensure the full protection of Europeans’ privacy rights, the court said, but other countries don’t have the same laws on that subject. In the EU, privacy rights may usually outweigh the right to freedom of information, but the opposite is true in, say, the United States.
The CJEU pointed out that EU privacy law does not clearly say delisting should apply outside the EU as well as within its borders—and in any case there’s no legal mechanism that could be used to try getting other jurisdictions to play ball with such decisions.
However, the court also clarified that a right-to-be-forgotten request granted in one EU country must be applied across all the bloc’s member states, not just in that country.
This ruling is important for Internet users around the world not just because of the precedent it sets, but also because the right to be forgotten is very frequently exercised in the EU.
According to Google’s latest figures, since the right was established the search engine has delisted almost 1.3 million web addresses, while turning down requests to delist almost 1.6 million addresses. People in France, Germany and the U.K. are the most enthusiastic users of this facility.
The share of addresses Google agrees to delist rather than keep up, 45%, demonstrates how the company is forced to act as a sort of online court in Europe, constantly weighing up the competing rights to privacy and free information.
In a separate ruling on Tuesday, the CJEU ruled that search engine operators have to be particularly careful when displaying results that include sensitive information about people, such as their political opinions, sex lives and religious beliefs. It said such links should only be listed, when someone searches for the person’s name, if doing so is “strictly necessary” for protecting freedom of information.
By David Meyer