Practical Obscurity and The Right to Be Forgotten: “Pretty Much” Privacy is Enough
In 1989, the U.S. Supreme Court recognized an individual interest in the “practical obscurity” of certain personal information. The case was DOJ v. Reporters Committee for Freedom of the Press. The Court held that FBI rap sheets (compilations of otherwise public records, maintained on tens of millions of people) could be withheld under FOIA because, although the underlying information was not private, it was not easily available. The decision was decried by free speech advocates because it seemed to expand the privacy exemption to Freedom of Information disclosure. But the principle of the case today seems useful to both free speech and privacy advocates, and particularly useful in the Right to Be Forgotten context.
In May 2014, the European Court of Justice ruled that Europeans have the right to request that Google and other search engines remove links to search results on their names that are irrelevant, no longer relevant, inaccurate, or excessive. In responding to these requests, the search engine must balance the individual’s privacy interests against the public interest in access to the information. Google has so far reviewed more than 750,000 links requested for removal and has removed close to 60% of them. But Google has only tinkered with searches conducted on European domains such as .uk, .fr, and .de; results on the .com domain remain unaltered.
In November 2014 guidance, a European Union working group of data protection regulators (the Article 29 Working Party) recommended that the removal of links extend to all domains. The New York Times editorialized today that such extraterritorial reach of the EU ruling goes too far. Not surprisingly, a Google-appointed advisory group agrees. The U.S. has different free speech values and strikes different balances between speech, on the one hand, and privacy or dignitary rights on the other. That’s why Yahoo blocks sites that sell Nazi memorabilia in France, but not in the U.S. The fear of EU over-reach is that an aggrieved businessman in Milan could control what an interested citizen or journalist sees in New York. More than that, it is the fear that EU over-reach will spread to countries like Pakistan and Russia, and that those countries’ strategies for suppressing information will be exported to the global Internet.
On the other hand, the EU is surely right that blocking links only on European domains is weak protection. The “disappeared” information is only a click away, via a search on the .com domain. The real question is whether a person’s “right to be forgotten” is a right to completely erase information deemed unjustifiably harmful (deemed so by a private company without public oversight, but that’s a different issue), or a right to return to the “practical obscurity” that existed before her reputation was defined by a Google search. Some 95% of Google searches conducted in Europe go through European domains, and a high proportion of the remaining 5% are conducted by travelers and expats. So blocking access from European domains pretty much does the job: it protects the person who did something stupid or was associated with the wrong people from being “that guy” to the casual searcher. “Pretty much” is all that the right to be forgotten should protect. It should protect practical obscurity.
In another post, I’ll explore the implications of practical obscurity for the third-party doctrine and the evolution of privacy law. Practical obscurity offers a promising middle ground between private and public, allowing for a more context-sensitive treatment of personal information.