Social media companies are caught between a rock and a hard place. On the one hand, they are rightly reluctant to be the arbiters of what should and should not be posted on their platforms (i.e., seen by the public); the job is hard to do, and it is likely that no one will be pleased with how they do it. On the other hand, clearly toxic material is posted daily. And to be clear, the issue is not only the “clearly toxic” material. The problem is that there is a lot of material that is not clearly (meaning agreed by everyone to be) toxic.
Why is this a problem? Well, if a person is alone in a movie theater and falsely yells “fire,” no harm, no foul. Conversely, if the theater is crowded and a member of the audience yells “fire” when none exists, there is a likelihood that someone will get hurt. In 1919 (Schenck v. United States) the Supreme Court put limits on free speech. And while Brandenburg v. Ohio later narrowed Schenck, the rule fundamentally stands: free speech is not an absolute right.
There are several things that complicate the situation:
- Social media companies are private entities and can set their own rules. But they appear never to have wanted to do this. Setting rules is complicated, it is expensive, and it results in a situation where no one is happy, which is not a good recipe for happy, engaged users.
- Section 230 of the Communications Act of 1934, as amended, was designed to let the platforms serve as stages for people to voice their opinions, leaving the social media companies (happily) out of the business of deciding what is and is not toxic.
- Social media platforms can amplify voices efficiently (read: cheaply). This makes them magnets for anyone who wants to get a message out, no matter how fringe.
- There are no effective tools (easy to use, right 95%+ of the time, easy to understand) to help users be critical consumers of the material they see on these platforms.
- Our society does not do a good job of holding people responsible for what they say. Within broad limits, government (through direct action or laws that allow harmed parties to seek redress) does not provide effective mechanisms to hold parties accountable for their speech. And the anonymity afforded to people who post on social media removes inhibitions against bad behavior that might be present if the parties were known.
- There is a lot of sloppiness in the assemblage of facts, and imprecise communication of ideas based on questionable facts, even by persons who: 1) should know better; and 2) are fundamentally trying to communicate their message in a responsible way. This general sloppiness makes it easier for people who are incompetent, malevolent, or both to do their damage.
Any solution to the issues associated with “free speech” and social media platforms is best addressed in the context of these four principles:
- People should be able to speak their piece;
- There should be platforms for them to do so;
- People should be held responsible and accountable for what they say; if they were, they might be more careful about what they say in the first place; and
- The right to free speech has limits.
Implementing these principles (which must account for the complicating factors outlined above) will not be as easy as mandating that the social media platforms fix the problem, but it will produce a robust long-term solution. It will be robust because it will support an open society: promoting diverse ideas, allowing them to exist side by side, and improving them through the discourse that comes from posting divergent ideas in forums with broad audiences.
— you can find this (days earlier) and other posts at www.niden.com.
And, if you like this post: 1) please let me know; and 2) pass on your “find” to others.