by Larry Magid | This post first appeared in the Mercury News
The messy thing about our First Amendment is that it protects nearly all speech, not just pleasant speech or even accurate speech. There have been no Supreme Court cases about our right to sing in the shower or post cat videos.
What we fight over are issues around hardcore pornography, hate speech and now the right to give really bad advice about vaccines. But, with very few exceptions, these types of speech, however vile or even potentially dangerous, are protected under our First Amendment. As a reminder, that amendment applies to government, not the private sector. Companies and organizations are not bound by it.
There are all sorts of words that I’m legally permitted to utter that will never show up in this column, partially because of my own sense of propriety but also because the news organizations I write for have their own standards and the right to maintain them. And, while social media companies generally have looser standards than mainstream news outlets, they, too, have the right to restrict what can be said on their platforms.
Still, most social media organizations are committed to allowing for wide-ranging discussions during which people can share all sorts of views and ideas. They can draw a line anywhere they want, but they are reluctant to draw it in such a way that it prohibits people from sharing diverse opinions.
Vaccines are a particularly touchy subject. Nearly all medical doctors and health officials agree that vaccines can save lives and prevent outbreaks of communicable diseases. They not only greatly benefit the individual who is vaccinated but also the people around them, because preventing someone from contracting a contagious disease also prevents them from spreading it.
For example, the Centers for Disease Control and Prevention has declared the measles-mumps-rubella (MMR) shot to be “very safe, and it is effective.” And despite plenty of posts to the contrary, the agency said that “scientists in the United States and other countries have carefully studied the MMR shot. None has found a link between autism and the MMR shot.”
But the fact that health officials around the world have weighed in with scientific evidence hasn’t stopped some people from believing and spreading false information. And, if you asked some of these people, I suspect they would argue that their information is true and that the officials, doctors and scientists are wrong.
Dilemma for Facebook
This issue presents a dilemma for Facebook. The company wants its service to be a place where people can express opinions but not a place for spreading false information. Sometimes the distinction between truth and fiction is undeniable, like the existence of gravity. But sometimes it’s a matter of opinion, even though opinions can be based on false or partial information.
For example, some people believe that climate change is a hoax even though the preponderance of evidence points in the other direction. Should climate deniers be banned from expressing their opinions and citing what evidence they have? I don’t think so, even though I personally think they’re dangerously misinformed. And, as far as I know, Facebook, Twitter and most other social networking sites would allow people to express whatever opinion they hold on this subject.
Climate change is a policy issue that will, over time, affect all of us. Choosing not to vaccinate your child because you read somewhere that it might cause autism is an immediate issue that can jeopardize the child and put others at risk. So, if parents are prompted to avoid vaccinating their children based on what they read on social media, the social media companies arguably have some responsibility to correct the record and protect those children.
And this leads to Facebook’s new policy. In a blog post, Monika Bickert, Facebook’s vice president of global policy management, wrote that the company is “working to tackle vaccine misinformation on Facebook by reducing its distribution and providing people with authoritative information on the topic.” Facebook is reducing the ranking of groups and pages that spread misinformation about vaccinations and rejecting ads that include such misinformation.
Bickert said that Facebook won’t show or recommend content that contains misinformation about vaccinations and that the company is “exploring ways to share educational information about vaccines when people come across misinformation on this topic.”
In that last area, it’s a matter of fighting false information with accurate information.
Facebook isn’t alone. On Wednesday, Amazon announced that it’s no longer selling books that promote fake cures for autism such as bathing in bleach.
This strikes me as a reasonable approach. Facebook can’t police everything that is said on its platform, and it can’t singlehandedly stamp out false information. And while some will no doubt call the action censorship, the site is not banning people from spreading hoaxes. It is reducing the likelihood people will see them and refusing to take advertising money to help spread them. I admit that reducing the ranking of pages and groups that spread this misinformation borders on censorship, but I still support the move because of the danger these particular lies pose. Having said that, we are living in what some have called a “post-truth” world, where even high-ranking government officials have been known to tell lies to groups of supporters who believe them to be true and act on them as if they are.
I have made a personal decision to never share or spread information, regardless of where it fits on the political spectrum, unless I have very good reason to believe it’s true. And, while I recommend that approach to others, I do worry about going too far, especially when it comes to political truth.
The words of Daniel Patrick Moynihan are worth repeating: “Everyone is entitled to his own opinion, but not his own facts.” But when people firmly believe that lies are truths, it’s hard to convince them that it’s legitimate to deny them a forum to spread what they consider to be true. And therein lies a dilemma for social media companies.
Disclosure: Larry Magid is CEO of ConnectSafely.org, a nonprofit internet safety organization that receives financial support from Facebook and other tech companies.