Finally, a Federal Court Hits Back Hard at Big Tech on Section 230
This week saw a dramatic turn in our nation’s desperate efforts to clean up the increasingly poisonous online sewers that we call social media.
First, the backstory. If you publish a newspaper or newsletter and print “illegal” content—encouraging crimes like homicide, offering to sell drugs, promoting child porn, advocating the overthrow of a government—you can go to jail. If you publish things that defame or lie about people or corporations, you can be sued into bankruptcy.
If you own a bookstore or newsstand and distribute books, magazines, and newspapers and offer for sale illegal content—child or snuff porn, stolen copyrighted material, instructions for making illegal drugs or weapons—you can also go to jail. And if you sell materials that openly defame individuals or corporations, you can be sued into bankruptcy.
In the first category, you’d be a publisher. In the second, you’d be a distributor.
But what is social media? Particularly those types of social media that use an algorithm to push user-produced content out to people who haven’t explicitly asked for it?
Twenty-eight years ago, social media sites like CompuServe and AOL were regulated as if they were publications, with occasional secondary oversight as if they were distributors. They had an obligation to make sure that illegal or defamatory content wasn’t published on their sites or, if it was, to remove it within a reasonable time.
The internet back then was a relatively safe and peaceful place. I know, as I ran a large part of one of the largest social media sites that existed at the time.
But then things got weird.
Back in 1996, some geniuses in Congress thought, “Hey, let’s do away with the entire concept of the publisher or distributor having responsibility for what happens in their place.”
Seriously. Selling drugs, trading in guns and ammunition, human trafficking, planning terrorist attacks, overthrowing governments, sparking genocides, promoting open lies and naked defamation. All good. No problem.
No matter what happens on a social media site, Congress said, its owners and managers bear no responsibility whatsoever for what’s published or distributed on and from the site. None. They’re untouchable. They take the profits but never have to worry about being answerable for the damage their site is doing.
Sounds crazy, right?
But that’s exactly what Congress did with Section 230 of the Telecommunications Act of 1996, in what they thought at the time was a good-faith effort to help the brand-new internet grow into what they hoped would become an important and useful social good.
It sure hasn’t worked out that way. And, as noted, things were different in the years before 1996.
Back when the internet started, but before hypertext markup language, or HTML, was invented, there were really only two big “houses” on the internet: CompuServe and AOL. My old friend and business partner Nigel Peacock and I ran “forums” on CompuServe starting around 1979, right up until 1996.
We ran the IBM PC Forum, the Macintosh Forum, and about two dozen other “social media forums” where people interested in ADHD, UFOs, the Kennedy assassination, international trade, spirituality, and a bunch of other topics could log in and discuss.
CompuServe paid us well, because we had to make sure nothing criminal happened in any of the forums we ran. We even had to carry our own liability insurance. And we split the revenue with the 20 or so people who worked with us.
We kept those places open and safe, as did hundreds of other “Sysops,” or Systems Operators, who ran other forums on CompuServe and AOL. After all, these were CompuServe’s and AOL’s “publications” or “bookstores,” and the companies were paying us to make sure nothing illegal happened inside them.
Until 1996, that is. That year, after Section 230 became law, CompuServe decided they no longer needed to pay Sysop moderators to keep their forums clean and crime-free, so they quit paying us. Most of us left.
The result of Section 230 of the Telecommunications Act of 1996 is obvious today. The January 6 attack on our Capitol was largely planned on social media and the internet more broadly, where you can also buy ghost guns, other people’s credit card numbers, drugs, and illegal porn.
Social media sites now run algorithms that choose specific content from users and push it out to other users to keep them “engaged,” maximizing their exposure to the advertisements that have made the sites’ owners billionaires.
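For the technically curious, here is a toy sketch, in Python, of the engagement-first logic described above. It is purely illustrative: the Post fields, the scoring weights, and the sample data are all invented for this example, and none of it is any platform’s actual code.

```python
# Toy illustration of an engagement-ranked feed. Everything here
# (fields, weights, sample posts) is invented for this example.
from dataclasses import dataclass

@dataclass
class Post:
    author: str
    text: str
    predicted_watch_time: float  # a model's guess, in seconds
    predicted_shares: float      # a model's guess at reshares

def engagement_score(post: Post) -> float:
    # The feed optimizes for attention (and thus ad exposure),
    # not for accuracy or safety. The weights are made up.
    return 0.7 * post.predicted_watch_time + 0.3 * post.predicted_shares

def build_feed(candidates: list[Post], n: int = 3) -> list[Post]:
    # Push the highest-scoring user content to people who never
    # asked for it: the unsolicited "distribution" at issue.
    return sorted(candidates, key=engagement_score, reverse=True)[:n]

if __name__ == "__main__":
    posts = [
        Post("user_a", "cute cat video", 12.0, 1.0),
        Post("user_b", "dangerous 'challenge' clip", 45.0, 9.0),
        Post("user_c", "local news recap", 20.0, 2.0),
    ]
    for p in build_feed(posts):
        print(f"{engagement_score(p):5.1f}  {p.author}: {p.text}")
```

Note which post wins: the most provocative clip scores highest on raw engagement. That, in three short functions, is the business model.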
In 1997, in the case Zeran v. America Online, the Fourth Circuit Court of Appeals ruled that Section 230 is written so tightly that even when an online service knowingly allows lawbreaking, it can’t be held accountable.
More recently, in last month’s Moody v. NetChoice decision, the Supreme Court ruled that social media companies are protected by the First Amendment just as newspaper publishers are. The justices essentially affirmed Zeran, writing:
“[A] platform’s algorithm that reflects ‘editorial judgments’ about ‘compiling the third-party speech it wants in the way it wants’ is the platform’s own ‘expressive product’ and is therefore protected by the First Amendment.”
Mark Zuckerberg, who owns one of those “publications,” has become one of the richest men on the planet because he doesn’t have to pay for content moderation. Twitter made a few billionaires, too, before Elon Musk turned it into a right-wing disinformation machine.
Nonetheless, Section 230 lives on. I wrote a book that covers it, The Hidden History of Big Brother: How the Death of Privacy and the Rise of Surveillance Threaten Us and Our Democracy.
So did Josh Hawley, the Republican senator from Missouri who hopes to be the next Trumpy president, and his book’s take is pretty much the same as mine: Section 230 is extremely problematic, at the very least.
Which brings us to this week’s big news. For the first time, a federal appeals court (the Third Circuit, seated in Philadelphia) has ruled that while Section 230 protects social media sites as “publishers,” it doesn’t protect them when they act as “distributors” (like bookstores).
In this case, a 10-year-old girl received a TikTok video—pushed out to her by that company’s algorithm—promoting something called the “blackout challenge,” in which people see how long they can cut off their own air or blood supply before blacking out. Little Nylah Anderson tried the challenge and accidentally asphyxiated herself.
Her mother sued in Pennsylvania for negligence and wrongful death, using state product liability law as the basis for her suit. From there it went to federal court, where Anderson v. TikTok ended up before the Third Circuit.
Two Trump appointees and one Obama appointee ruled unanimously that ByteDance, which owns TikTok, isn’t “publishing” a social media site but—because its algorithm “curates” content and sends its choices, unsolicited, to users on the site—is actually “distributing” content.
In other words, social media sites are bookstores, not newspapers. And online “bookstores,” they ruled, are not protected by Section 230.
The case is far from settled; from here it will likely go to the Supreme Court, and its fate there is hard to predict, given the court’s embrace of the First Amendment argument in previous cases.
And the social media companies, raking in billions in profits, have so far stymied all efforts to do away with Section 230 or regulate their behavior by taking advantage of the Supreme Court’s legalization of political bribery: Silicon Valley is one of the larger players in the lobbying and campaign contributions game.
Nonetheless, there’s now a very real possibility, even absent congressional action, that social media companies may end up back where AOL and CompuServe were before 1996, having to hire thousands of content moderators to protect themselves from both criminal and civil action.
Europe is moving in this direction too, with the arrest in France last week of Pavel Durov, the Russian-born founder of Telegram, a social media channel where human trafficking and other criminal activity were both common and known to its operators.
If Zuckerberg and his peers have to start hiring people to do what Nigel and I did for CompuServe years ago, it may reduce their income from tens of billions to mere billions. They probably won’t even notice the difference.
And society will be the better for it, our political landscape will stabilize, and fewer children will die.
That’s a trade-off that’s well worth making. But why wait for the Supreme Court (which may well not agree with the Third Circuit)? Congress can also repeal or substantially rewrite Section 230.
It’s time to consign Section 230 to the trash heap of history and begin holding these toxic behemoths accountable, regulating them to the same simple standards to which magazines, newspapers, newsletters like this one, and bookstores must adhere.