A new defamation trial for conspiracy theorist Alex Jones, which began this week, may offer nuggets of insight into the effectiveness of “deplatforming” — the booting of unwanted accounts from social media sites.
The trial, in Connecticut, is the second of three trials Jones faces for promoting lies on his broadcast show and website, Infowars, that the 2012 Sandy Hook Elementary School shooting was a hoax. The families of the victims, whom Jones called “crisis actors,” have faced harassment, threats and psychological abuse. In August, a Texas jury awarded the family members $45.2 million in damages, although Jones says he plans to appeal the decision.
Jones, a serial conspiracy theorist and fabulist, was removed from nearly all major internet and social media platforms in 2018 after threatening then-special counsel Robert Mueller, who was investigating then-President Donald Trump’s ties to Russia. An initial round of media coverage touted the drop in traffic to Jones’ websites as proof that “deplatforming works.” However, the revelations from Jones’ defamation trials may point to the existence of a rarefied class of extreme internet personalities who are largely insulated from efforts to curb the reach of their content.
In the Connecticut trial, a corporate representative for Jones’ companies testified that Infowars may have generated anywhere from $100 million to $1 billion in revenue in the years since the Sandy Hook massacre. Testifying during the earlier trial in Texas, Jones told the court that Infowars earned about $70 million in revenue in the most recent fiscal year, up from an estimated $53 million in 2018, the year Infowars was widely deplatformed.
The difference between Jones and many of the other right-wing actors who have been deplatformed, says political scientist Rebekah Tromble, who directs George Washington University’s Institute for Data, Democracy and Politics, “is that Infowars had an existing infrastructure outside of social media.”
Infowars makes about 80% of its revenue by selling products, mostly dietary supplements, according to court filings from Jones’ nine largest private companies. He grew his audience on talk radio with the help of an early partnership with a sympathetic distributor and now owns his own network and independent video streaming site.
A growing body of research suggests that deplatforming toxic actors or online communities typically shrinks their audience significantly, with the caveat that the smaller audience that remains migrates to less regulated platforms, where extremism becomes more concentrated, along with the potential for violence.
Measuring the effectiveness of deplatforming is tricky, in part because the word itself can refer to so many different things, says Megan Squire, a computer scientist who analyzes extremist online communities for the Southern Poverty Law Center.
“It’s losing your site infrastructure, it’s losing social media, it’s losing banking. So, like, the big three, I’d say,” says Squire. Each, she says, has had a different impact depending on the specific case.
Squire’s research shows that traffic to Jones’ Infowars online store remained stable for about a year and a half after he was removed from major social media sites. It then declined throughout 2020 until that year’s presidential election and its violent aftermath, when Infowars Store traffic saw a massive spike that reached levels Jones hadn’t seen since two years before his deplatforming.
Jones’ resilience is more the exception than the rule, says Squire. She cites the case of Andrew Anglin, founder of the neo-Nazi website The Daily Stormer. After the violent 2017 Unite the Right rally in Charlottesville, Va., Anglin lost his web domain and has since cycled through 14 others, losing traffic each time. Squire says Anglin is on the run from various lawsuits, including one in which he was ordered to pay $14 million in damages for terrorizing a Jewish woman and her family.
Post-deplatforming survival strategies
Even after social media bans, conspiracy theorists like Jones find workarounds. Squire says it’s common for other users to host the banned personality on their channels or simply repost the banned person’s content. People can rebrand, or they can direct their audience to an alternative platform. After being banned by companies including YouTube and PayPal, white supremacist livestreamer Nick Fuentes eventually built his own video streaming service, where he encouraged his audience to kill legislators in the lead-up to the January 6 riot at the Capitol.
Other internet communities have shown similar resilience. A popular pro-Trump message board known as TheDonald was kicked off Reddit and later shut down by a subsequent owner after the Capitol riot, and yet is now more active than ever, according to Squire. When Trump himself was banned from Twitter, Squire watched as the messaging app Telegram gained tens of thousands of new users. It remains a thriving online space for right-wing celebrities and hate groups.
As for raising money, even if extremists are completely disconnected from financial institutions that process credit cards or donations, they can always turn to cryptocurrency.
“100% of these guys are in crypto,” says Squire, though crypto, she notes, isn’t necessarily easy to live off of. Its value is volatile, and cashing it out is not always simple. Even so, Squire and her colleagues have found anonymous donors using crypto to funnel millions of dollars to Jones and Fuentes.
“We live in a capitalist society. And who says entrepreneurs can’t also be on the conspiracy side of things?” says Robert Goldberg, a history professor at the University of Utah. He points out that conspiracy vendors have always been “very clever” with whatever new technology is available to them.
“The Klan headquarters in Atlanta, Georgia, would sell the hoods and the robes and all this merchandise, this branding, this swag, if you will, to the 5 to 6 million people who joined the Ku Klux Klan in the 1920s,” he says. But outside of the Klan’s heyday, Goldberg says, selling conspiracy material about the Kennedy assassination, UFOs or the 9/11 terrorist attacks has generally been far less profitable, until now.
Power and lies
A bigger question for researcher Shannon McGregor of the University of North Carolina’s Center for Information, Technology and Public Life is what these conspiracy entrepreneurs hope to achieve with their gains.
“Why are these people doing this in the first place? What are they getting out of it? And in many cases in this country in particular, at this moment, it’s about a claim to power,” McGregor says. Fringe communities always exist in democracies, she says, but what should be of concern is their proximity to power.
She pushes back against a “both sides” framing of the issue, pointing out that it is a largely right-wing phenomenon dating back decades. “Since at least the Nixon era, this right-wing, ultraconservative media ecosystem has been aligned with political power, making it far less likely that it will actually go away,” says McGregor.
Deplatforming and punitive defamation suits, she argues, are less a remedy than “harm reduction.” When an individual conspiracy theorist or conspiracy site loses its audience, replacements quickly appear. None of this means, McGregor and other experts agree, that efforts to curb the spread of extremist or anti-democratic narratives should be abandoned altogether.
“I think in general, [social media company] representatives would prefer if the conversation became: ‘Oh, well, deplatforming doesn’t work, does it? … So, you know, that’s not our responsibility anymore,’” Tromble says.
Squire says there’s no question that anything that makes it harder for toxic conspiracy theorists to operate smoothly or spread their message is worth doing. It makes the platforms they leave safer and reinforces the social norm that there are consequences for harassment and hate speech.