by Larry Magid
Last year the European Union’s highest court ruled that European citizens have a “right to be forgotten” when it comes to search links to unfavorable Internet posts that are harmful and irrelevant. The case was brought by a Spanish citizen who complained that Google’s links to an old notice about his house being repossessed violated his privacy rights because the issue was no longer relevant. The court ruled that search engines must provide a mechanism for people to request the removal of links to negative stories, even if the stories themselves remain online.
I can understand that Spanish citizen’s angst. It reminds me of an experience I had several years ago, when I used Google to research the professional qualifications of a doctor. The search turned up nothing negative about his medical skills, but it did reveal that he had been charged with sexual harassment by a student at a university where he taught part time. The article said the university was planning to hold a hearing on the charges, but I could find nothing about how the matter had been resolved. I inquired and eventually learned that all the charges were dropped. As far as the web was concerned, though, he might as well have been guilty: there were links to information about the accusation but nothing about the eventual outcome.
But as much as I sympathize with people whose reputations have been unjustly soiled, I also worry about the chilling effect on free speech. In the Spanish case, the repossession really happened, and the search links were to news stories about something that actually took place. The fact that it happened years ago and had since been resolved is certainly a reason not to hold it against the person, but removing links to a news story doesn’t erase the historical record. What this person sought — in my opinion — wasn’t so much the right to be forgotten as the right to be forgiven.
The vast majority of “forgotten” requests do come from ordinary people, according to data accidentally leaked by Google and obtained by news outlets. “Data shows 95 percent of (220,000) Google privacy requests are from citizens out to protect personal and private information — not criminals, politicians and public figures,” wrote the Guardian. Examples included “a woman whose name appeared in prominent news articles after her husband died,” another “seeking removal of her address,” and “an individual who contracted HIV a decade ago.”
Still, I remain skeptical of the ruling. Even though the policy mostly benefits “ordinary people,” the remaining 5 percent includes public figures, politicians and criminals: some 11,000 cases where the public may indeed have a compelling reason to know.
The European ruling does not apply in the United States, but the non-profit advocacy group Consumer Watchdog has called on the Federal Trade Commission to declare “Google’s failure to offer U.S. users the ability to request the removal of search engine links from their name to information that is inadequate, irrelevant, no longer relevant, or excessive” an “unfair and deceptive practice.” In an interview, John Simpson, the director of the organization’s Privacy Project, called it “privacy by obscurity,” arguing that it’s time for the digital age to adopt the longstanding process whereby “over time things that were no longer relevant, people just sort of forgot about.”
To illustrate its point, the organization held a press briefing featuring relatives of Nikki Catsouras, an 18-year-old who was killed in a car accident in 2006. As part of the investigation, the California Highway Patrol took pictures of the accident and of the girl’s mutilated body, which were improperly leaked by CHP employees and posted online. Consumer Watchdog argues that even if the offensive material remains online, Google and other search engines should be required to remove any links to those images. The CHP eventually agreed to a $2.4 million settlement with the family, but the images continue to haunt them, family members said at the briefing.
Google recently announced that it will “honor requests from people to remove nude or sexually explicit images shared without their consent” and Consumer Watchdog wants Google to extend that policy to a much wider swath of material.
The Catsouras case, along with those of individuals anxious to erase links to what they consider irrelevant yet harmful content about themselves, tugs at my heart and reminds me that free speech issues almost always turn on difficult cases. I feel sad for the Catsouras family and fully understand their desire to have links to images of Nikki’s body removed from search engines, just as I do for those affected by irrelevant negative publicity or vile hate speech. But as much as I want to protect people from offensive or harmful images and speech, I also want to protect our precious First Amendment, which is the cornerstone of our democracy. As with many tech-related policy issues, there are no simple answers, but before we rush to suppress any form of speech, we must consider the broader ramifications and precedents and — if we must err — we should err on the side of free speech.