Today NBC's Meet the Press devoted its Sunday morning hour entirely to a discussion of social media: what's wrong with the way things are now on Facebook, TikTok, Instagram, Twitter, and such, along with what needs to be done to fix those problems.
It's a complicated subject with no easy answers. Here's what stuck in my mind after watching the program.
Algorithms. Social media sites typically use computer algorithms to determine what gets shown to users. (Twitter is an exception, at least when accessed through a third-party app, as I do. I only see tweets from the people and organizations I follow, and I see all of their tweets, in chronological order.)
Algorithmic curation encourages anger and divisiveness, since negativity generally draws more views than positivity, cute animal videos notwithstanding. Several Meet the Press guests called for more transparency in how the algorithms work, but it's unclear what that transparency would look like in practice.
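To make the distinction concrete, here's a minimal sketch in Python of the difference between a chronological timeline and an engagement-ranked one. The post data and scoring weights are invented for illustration; real platforms use far more elaborate, and undisclosed, signals.

```python
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    timestamp: int        # seconds since epoch
    likes: int
    shares: int
    angry_reactions: int  # hypothetical stand-in for "negative engagement"

def chronological_feed(posts):
    """What I see in my third-party Twitter app: newest first, nothing hidden."""
    return sorted(posts, key=lambda p: p.timestamp, reverse=True)

def engagement_ranked_feed(posts):
    """A toy version of algorithmic ranking: posts that provoke the strongest
    reactions float to the top, regardless of recency. The weights below are
    assumptions made up for this sketch, not any platform's actual formula."""
    def score(p):
        return p.likes + 2 * p.shares + 3 * p.angry_reactions
    return sorted(posts, key=score, reverse=True)

posts = [
    Post("cute_animals", "Kitten learns to climb stairs", 1000,
         likes=50, shares=5, angry_reactions=0),
    Post("pundit", "Outrageous take on today's news", 900,
         likes=30, shares=40, angry_reactions=80),
]

print([p.author for p in chronological_feed(posts)])      # ['cute_animals', 'pundit']
print([p.author for p in engagement_ranked_feed(posts)])  # ['pundit', 'cute_animals']
```

In this toy model the angry post wins the ranked feed even though it's older, which is exactly the dynamic the panel was worried about.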
A Republican congressman, who made good sense, argued that government regulation of the algorithms could cause more problems than it solves. Who decides what counts as divisive or misinformation? Even hate speech, while easy to decry, is difficult to pin down. Stand-up comedians often make jokes about "sacred cows" in our culture. Are those jokes hate speech or free speech?
Maybe I have a libertarian side when it comes to social media, but I'm wary of regulations that tell social media companies what content to prioritize to their users. So what's the alternative?
Market forces. I agree with several people on this special Meet the Press program who said that if social media companies don't operate in a fashion that users like, many of those users will head elsewhere. To some extent this has happened with Twitter after Elon Musk took over and mostly dismantled company efforts aimed at curbing extreme hate speech and harmful disinformation.
Frances Haugen, the Facebook whistleblower who came forward in 2021, drew an analogy between the government requiring seat belts and other safety equipment in cars and the regulation of social media companies. But that analogy is a stretch.
Cars are an essential part of life for most people. Facebook, Instagram, TikTok, and Twitter aren't. Sure, they're addictive, useful, fun, and entertaining. But social media doesn't kill tens of thousands of people a year, and many users enjoy these platforms just the way they are.
If someone finds Facebook, or whatever, distasteful, they can stop using it. When enough people do this, the social media company will realize that the market has spoken and changes need to be made.
Section 230. This is the part of the Communications Decency Act that shields social media providers from liability for content posted by their users. Here's how the Electronic Frontier Foundation describes Section 230.
Section 230 says that "No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider" (47 U.S.C. § 230). In other words, online intermediaries that host or republish speech are protected against a range of laws that might otherwise be used to hold them legally responsible for what others say and do.
The protected intermediaries include not only regular Internet Service Providers (ISPs), but also a range of "interactive computer service providers," including basically any online service that publishes third-party content. Though there are important exceptions for certain criminal and intellectual property-based claims, CDA 230 creates a broad protection that has allowed innovation and free speech online to flourish.
Most of the Meet the Press panel members seemed to think that keeping Section 230 is a good idea. I agree. It would be a nightmare if social media companies were legally responsible for all of the content uploaded by their users. That would likely lead to increased censorship of content, since the companies would want to be sure they couldn't be sued by someone upset with user-generated content.
A better approach, someone said, is what happened with Alex Jones. He was successfully sued by parents of children killed in the Sandy Hook massacre after he repeatedly and loudly claimed the children weren't really dead. And Fox News is being sued by Dominion Voting Systems after hosts on the network claimed that the company's machines produced fraudulent election results.
So in egregious cases the legal system can be used to punish libel. There's no need to make social media companies watchdogs over everything users upload on their platforms.
Section 230 is meaningful to me because I run an extremely small social media platform of my own: three blogs, two of which date back to 2003 and 2004. I've always allowed comments on my blogs. I go back and forth on moderating comments; mostly I haven't, though occasionally I'll delete a comment that doesn't fit my comment policies.
Here's what the Electronic Frontier Foundation says about blogs and Section 230.
CDA 230 also offers its legal shield to bloggers who act as intermediaries by hosting comments on their blogs. Under the law, bloggers are not liable for comments left by readers, the work of guest bloggers, tips sent via email, or information received through RSS feeds. This legal protection can still hold even if a blogger is aware of the objectionable content or makes editorial judgments.
Thus I'm very much against Congress messing around with Section 230, unless the changes are minor and don't alter its free speech provisions.