SCOTUS alert! Section 230, free speech showdown looming

If you like your internet, you may not be able to keep your internet.

Democrats and Republicans have been taking aim at Section 230 through a myriad of legislative packages over the past couple of years, but fortunately none of them are very good shots, if you will. However, the first of several legal challenges on this front is set to hit the US Supreme Court on February 21. All Americans should be on high alert.

If you like your internet, you may not, in fact, be able to keep your internet.

The first case is called Gonzalez v. Google. The plaintiffs are the family of Nohemi Gonzalez, who was killed in Paris in 2015 when ISIS sympathizers carried out a series of terrorist attacks across the city. The family argues that Google should be held financially responsible for this tragedy.

Why, you may ask? They say the perpetrators of the attack were radicalized by online videos made by ISIS and hosted on YouTube. Not only do they say Google is responsible for allowing these videos to exist on its platform, they also assert that its algorithm showed the terrorists more and more content in this genre (which tracks: algorithms are programmed to show you more of what you like and will watch, in order to keep you on a platform longer). Therefore, they insist, Google should be held financially liable for the death of their loved one.
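That engagement-maximizing logic can be sketched in a few lines of Python. To be clear, this is a hypothetical toy, not YouTube's actual system: the function names, the topic tags, and the crude "rank by how much of this topic you've already watched" score are all illustrative assumptions.

```python
from collections import Counter

def recommend(watch_history, catalog, k=3):
    """Rank candidate videos by how often the user has already watched
    videos on the same topic -- a crude stand-in for the engagement-driven
    ranking real platforms use to keep viewers watching."""
    # Count how many videos the user has watched per topic.
    affinity = Counter(video["topic"] for video in watch_history)
    # Score each candidate by the user's affinity for its topic, so
    # heavily watched topics dominate the recommendations.
    ranked = sorted(catalog, key=lambda v: affinity[v["topic"]], reverse=True)
    return ranked[:k]

history = [{"topic": "cooking"}, {"topic": "cooking"}, {"topic": "news"}]
catalog = [
    {"id": 1, "topic": "cooking"},
    {"id": 2, "topic": "sports"},
    {"id": 3, "topic": "news"},
]
print([v["id"] for v in recommend(history, catalog, k=2)])  # [1, 3]
```

The feedback loop is visible even at this scale: whatever you watch most gets surfaced most, which is exactly the behavior the plaintiffs point to.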

It’s a bit of a stretch, we know, and almost certainly there are monetary interests behind the move. Google is one of the most profitable companies in the world, and lots of people would like to get their hands on some of that. But their lawsuit plays right into the hands of anti-Section 230 activists who have been waiting for just such a case.

Section 230 is one of the best pieces of legislation ever written. Many credit it as the "26 words that created the internet," because without it, the web as we know it would look very different. Though often maligned by politicians and blamed for far more problems than it actually creates, the short law merely states what should be obvious: you are responsible for what you say online, and third parties are not.


Thus, because of Section 230’s existence, Google cannot currently be held liable for a murder carried out in France, nor can it be held liable if videos on its platform influenced those actions. As many others have pointed out, there is no “algorithm exception” within Section 230.

And that’s exactly how it should be. Our society increasingly wants to blame everything but the actual perpetrator for violent attacks, whether it be guns, or entertainment, or even websites. Usually this is due to ulterior motives: people already want to target these scapegoats, and by attaching them to violent crimes they attempt to silence dissent.

‘How dare you not care about the children, or the victims, or the young minds at stake,’ they might whisper just beneath the surface of their arguments. ‘If you don’t lay the sacrificial lamb on the altar, you’re a heathen.’

But really, while the incident the case revolves around is horrid, there is no reason that Google should be held liable here. A million things can influence a criminal and their actions, some of them obvious, some much more mundane. While platforms like Google already spend considerable resources on content moderation that removes portrayals of actual violence, they can’t be expected to catch every seedy thought muttered online.

Should the court rule against Google in this case, it would strike a tremendous blow to Section 230 protections that could radically reshape the internet overnight. If companies can be held liable not just for things third parties say on their websites, but also for things people do after absorbing things third parties say on their websites, very little will be allowed to go live.

Companies would have to moderate content far more stringently, create lengthy wait times for posts, and build systems that only allow vetted individuals to create content. No more real-time information, no more independent media, and far, far more censorship.

In a world where a significant portion of the culture literally believes “words are violence” and that Donald Trump speeches lead to hate crimes, this line of argument could quickly land us in a world where virtually no content is allowed to stand online.

We’re sorry for the Gonzalez family’s loss. It’s hard to fathom what it must be like to confront such a tragedy. But their lawsuit is an assault on the American people and basic civil liberties.

Hannah is a consultant at NetChoice, which works on these issues.


Hannah Cox
Hannah Cox is a libertarian-conservative writer and co-founder of BASEDPolitics. She's also the host of the BASEDPolitics podcast and an experienced political activist.