Parler, Section 230, and the Future of the Internet

Parler, the upstart social network that bills itself as Twitter but with more free speech, is having a moment.

Capitalizing on many conservatives’ growing distrust of Silicon Valley, and aided by a burst of recent high-profile endorsements from right-wing politicians and pundits, the app, which launched in 2018, has doubled its online presence over the last week. As of May 2019, it had about 100,000 users; a year later, that number had grown to about 1 million. Now, its founder says, it’s pushing 2 million, with more coming every day. It’s not Facebook, but it’s not chump change either.

The appeal of Parler (which is pronounced like the pirate truce, from the French “to speak”) is twofold. First, there’s the elevator pitch: Parler describes itself as a network that won’t regulate your speech. If the government couldn’t keep you from saying it, Parler won’t either. “We’re a community town square, an open town square, with no censorship,” founder John Matze told CNBC last week. “If you can say it on the streets of New York, you can say it on Parler.”

Then there’s the subtextual pitch: Here is a place that will be a safer, more congenial space for conservatives to break free of the tyranny of the hostile lefties of Silicon Valley. The site’s de facto rightward bent is apparent the second you actually step onto Parler. Make a Twitter account, and the app might suggest you follow ESPN, NASA, Cardi B, Donald Trump, and the Pope. Make a Parler account, and you’ll be advised to follow Breitbart, the Daily Caller, Epoch Times, Dan Bongino, Eric Trump, Laura Loomer, and Rudy Giuliani.

These two complementary but distinct functions for Parler have been responsible for some of the growing pains the site has experienced after suddenly becoming a semi-prominent part of the national conversation. Hundreds of thousands of new Parler accounts last week also meant an influx of trolls, who immediately began besieging the site’s other users with insults, crude imagery, and other spammy behavior. Forced to choose between reaffirming its commitment to allow all constitutional speech and maintaining a palatable user experience, Parler began inching toward the latter.

In a post on the site on Tuesday, Matze enumerated a host of behaviors that would earn a Parler ban: posting “pictures of your fecal matter,” having an obscene username “like CumDumpster,” pornography, and death threats.

“We have some trolls coming in, and they’re welcome to troll as long as they follow the rules,” Matze told radio host Dana Loesch on Wednesday. “If your account literally consists of 100 comments of the F-bomb, that’s not really contributing, so we’re not going to keep that.”

Naturally, Parler has received its fair share of mockery for this pivot, since having the suits running a social media company decide what does and doesn’t count as “contributing” is ostensibly exactly the sort of thing Parler defines itself against. As TechDirt’s Mike Masnick points out, plenty of other things on that list run afoul of Parler’s stated intention of allowing all constitutional speech as determined by the Supreme Court.

The chief difficulty here is that trying to get people on the internet to actually behave as if they were in a public place has been universally acknowledged to be a Sisyphean task since the web’s earliest days. If you get in an argument on the streets of New York, the odds aren’t high your interlocutor will be wearing a T-shirt that says “CumDumpster” or pull a picture of a pooping farm animal out of his wallet and wave it in your face. But both of these things would still be legal.

Still, all this can be chalked up to the growing pains of a young site. What’s more interesting than the ways in which Parler is shirking its own mission statement to take some posts down is the way its stated free-speech principles inform what’s left up.

The conservative effort to break the grip of big social media companies on the public discourse is, of course, not limited to entrepreneurial efforts like Parler’s. A growing number of GOP lawmakers and pundits are pushing to revise the law to force companies to take a more hands-off approach to policing content.

Sen. Josh Hawley, the freshman lawmaker from Missouri, has rapidly built himself a national profile on the strength of his promises to take the fight to Big Tech. Fox News’s Tucker Carlson, the most-watched cable pundit in America, regularly takes up the same pitchfork in monologues denouncing social media censorship. And President Trump has jumped fully aboard the train this year following a series of episodes in which Twitter fact-checked or otherwise wagged an institutional finger at some of his wilder posts.

Their primary target has been Section 230, a provision passed as part of the 1996 Communications Decency Act that gives companies like social networks broad leeway to impose their own standards for what constitutes acceptable content in posts to their site. Cybersecurity expert Jeff Kosseff, who wrote an entire book about the law, dubbed it “the twenty-six words that created the internet,” and that’s hardly an overstatement: Section 230 established the framework for the modern web by codifying a critical distinction between internet companies who host content and the users who use those companies’ sites to create content of their own.

Before Section 230, companies that created platforms for user content could not moderate that content whatsoever without opening themselves up to substantial legal liability. By performing any content moderation, they could be treated as the publisher of all the content they did not remove, and thus held responsible for unlawful speech—death threats, libel, and so on—that took place on their platform. Section 230 reversed that structure: Platforms cannot be treated as the publisher or speaker of content posted by their users, and making good-faith efforts to moderate that content according to their own internal guidelines does not forfeit that protection.

Most recent GOP efforts to hamstring companies like Google, Facebook, and Twitter have focused on attaching new strings to Section 230’s liability protection. A bill introduced last week by Georgia Sen. Kelly Loeffler, for instance, would permit users to sue social media companies for unfair censorship if those companies’ content moderation policies excluded constitutionally protected speech.

It’s important to note, too, that these anti-230 policy pushes are generally linked to another issue of online content: the availability of online pornography. In many GOP policy circles, the two issues go hand in hand: Bills that would cut into platforms’ ability to make their own content regulation decisions also place stricter commands on them to ensure that illegal acts, like the exposure of children to sexually explicit material, do not prosper on their sites.

The Loeffler bill is instructive here, conditioning 230 immunity on a platform’s willingness to “take reasonable steps to prevent or address” unlawful activity on the site. The Department of Justice has proposed Section 230 reforms along similar lines, calling for platforms to lose 230 immunity with regard to “specific categories of claims that address particularly egregious content,” including “child exploitation and sexual abuse.” Many advocates suspect this sort of thing would force sites to crack down much harder on explicit material in general, making it much more difficult to access via social media sites.

In other words, a site like Parler, with its laissez-faire attitude toward most speech combined with strict rules against obscenity, isn’t just a curio of the current state of movement conservatism. It’s also a glimpse of a potential future in which the likes of Hawley and Loeffler have their way.

Parler’s aims “do mostly line up with what the Department of Justice, Sen. Hawley, and Sen. Loeffler have proposed,” said Jon Schweppe, director of policy at the American Principles Project, a social conservative organization at the tip of the spear on efforts to reform Section 230. If the DOJ proposal became law, then, “I think it’s mostly right to say that you would see Facebook, Google, Twitter, and other platforms adopt terms of service similar to what Parler is currently using.”

You don’t have to spend much time clicking around on Parler to see why that might be a concerning prospect. Porn and poop might be out, but what replaces them is monsoons of bigotry, conspiracy-mongering, and—of course—spam. You can’t turn around on the site without barking your shins against content promoting QAnon or Pizzagate, or Hitler-profile-picture accounts with names like “Slandered Fuhrer” declaring the U.S. to be “a n—er-infested shithole country.” Is that the kind of internet we want?

Photograph by Al Drago/Pool/Getty Images.
