Elon Musk’s takeover of Twitter raises the issue of social media content moderation in a particularly urgent way. Even with the UK and European Union regulations that Musk’s Twitter must comply with, no legal requirement will prevent him from running Twitter according to whatever editorial policy he chooses to adopt. It’s his candy store.
How is this possible? Can it really be true that the content moderation policies of such a powerful forum for public discourse should be subject to the whims of its new billionaire owner? Evan Greer, a political activist with Fight for the Future, speaks for many of us when she says, “If we want to protect free speech online, then we can’t live in a world where the richest person on Earth can just buy a platform that millions of people depend on and then change the rules at will.”
But that’s how television, newspapers and radio work in liberal democracies. Media owners determine the political line of the news and commentary they distribute. When NBC, CNN, ABC or the New York Post change owners, as they often have in the past, their new owners dictate operational rules and editorial policy. Social media is media, and the same ownership prerogatives apply. Content moderation is a social media company’s editorial policy, and it is determined by the company’s owner. No liberal democracy will mandate what owners may do with their outlets or what their editorial policy should be.
Of course, some speech is illegal, and increasingly social media companies will be expected to keep their systems free of illegal material. The UK and EU create new liability regimes for illegal speech in their pending legislation, and Musk has promised to comply with these legal requirements.
But most hate speech, misinformation and racist attacks on social media are legal both here in the US and in Europe. Musk will have to comply with new EU and UK laws dealing with harmful but legal speech; this will mean more risk assessments, transparency reports, audits, access to data for researchers, publication of content moderation standards and due process requirements.
These new laws will impose vital public protections through transparency. It would be desirable to adopt significant elements of them here in the US, but they will not dictate Elon Musk’s approach to content moderation on Twitter. They still leave him free to let his platform fill up with harmful material if he chooses.
So what is Musk likely to do with Twitter? He presents himself as a philanthropic steward of a public resource. In an interview on stage at the TED2022 conference, Musk said, “this is not a way to make money. My strong intuitive sense is that having a public platform that is maximally trusted and broadly inclusive is extremely important to the future of civilization. I don’t care about the economics at all.”
He has said he wants to allow all legal speech on the platform, and that has raised concerns that he will weaken content moderation in the name of free speech. But Wall Street Journal opinion columnist Holman W. Jenkins Jr. sums up the current reality: “Twitter has crossed the river of no return in ‘moderating’ the content that appears on its service – it cannot allow unfettered free expression.”
Just because someone should moderate content on Twitter, however, doesn’t mean Twitter itself has to do it. Musk may outsource the work to Twitter users or third parties.
Influential neo-reactionary blogger Curtis Yarvin has urged Musk to adopt a user-curated approach to content moderation. The new Twitter under Musk, he says, must censor “all content prohibited by law in all jurisdictions that prohibit it.” For content moderation and algorithmic recommendation of legal speech, Yarvin urges Musk to seek to identify hate speech and other content users may not want to see, and then give users the tools to block it if they wish. The goal should be to adjust content moderation and algorithmic recommendation to give users what they want, to make their experiences “as rich and enjoyable as possible.”
This idea still leaves Twitter responsible for identifying harmful material that users may not want to see. But there may be a way to hand off that responsibility as well.
Musk says he wants to make Twitter’s algorithms “open source to increase trust.”
Twitter’s recommendation algorithm “should be on GitHub,” he noted. This could mean more than just letting users examine the algorithm to see how it works: users could also modify Twitter’s open-source algorithm in any way they choose.
This raises an interesting possibility for the future of Twitter. Musk may be considering adopting the approach to content moderation recommended by political scientist Francis Fukuyama. This “middleware” approach would install an “editorial layer” between a social media company and its users. It would outsource “content curation” to other organizations, which would take the platform’s full feed, filter it according to their own criteria and then make that curated feed available to their subscribers.
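To make the middleware idea concrete, here is a minimal, purely illustrative sketch in Python of how such an editorial layer might work. Everything in it is a hypothetical assumption of mine, not a description of anything Twitter or any curation provider has actually built: the platform hands over its raw feed, and an outside curator applies its own criteria before the feed reaches its subscribers.

```python
from dataclasses import dataclass
from typing import Callable, List


@dataclass
class Post:
    author: str
    text: str


# A "curation policy" is simply a rule the middleware provider defines:
# given a post, decide whether the subscriber should see it.
CurationPolicy = Callable[[Post], bool]


def curate_feed(raw_feed: List[Post], policy: CurationPolicy) -> List[Post]:
    """Apply a third party's editorial criteria to the platform's full feed."""
    return [post for post in raw_feed if policy(post)]


# Example: a hypothetical "family-friendly" curator that screens out
# posts containing terms on its own blocklist.
BLOCKLIST = {"scam-link", "racial-slur"}

def family_friendly(post: Post) -> bool:
    return not any(term in post.text.lower() for term in BLOCKLIST)


raw_feed = [Post("alice", "A perfectly ordinary post"),
            Post("bob", "A post containing a racial-slur")]
print(curate_feed(raw_feed, family_friendly))  # only Alice's post survives
```

Different curators could apply different criteria to the same raw feed; that is the point of the model. The practical objections, discussed below, concern cost, scale and privacy rather than the basic mechanics.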
Musk’s talk of allowing all legal speech would then apply to Twitter’s core feed. Content moderation beyond this would be outsourced to users and third-party content curation services.
There’s no way to know at this point if Musk intends to move toward this user-centric approach to content curation. The issue is so fraught that outsourced content moderation may be worth the experiment, but my feeling is that it seems far-fetched. It is far from clear that it is technically feasible, and there is no discernible way to generate revenue to pay for the moderation costs involved. Any middleware provider of content curation services would have to replicate a large infrastructure of software and human moderators, which seems economically prohibitive.
Additionally, as Stanford University legal scholar Daphne Keller has noted, privacy issues must be addressed. Would the middleware provider have access to all material posted by a user’s friends and followers? If so, that intrudes on the privacy of those other users, who may want nothing to do with that middleware provider. If not, how can the middleware provider effectively filter the news feed?
More importantly, this idea is not a way to foster genuine exchange between citizens on matters of public importance. It’s more of a recipe for us to retreat into our corners, creating filter bubbles of like-minded people and excluding the rest of society.
Setting ourselves apart so that we don’t have to listen to people who differ from us is not a remedy for the information externalities that make hate speech and misinformation so dangerous even for people who are not exposed to them. People cannot remain indifferent to what other people in society believe, because what other people believe affects them. If enough people refuse vaccines and other public health measures, we are all at risk from the next pandemic. If enough people become racist or intolerant towards the LGBTQ community, significant parts of our society are not safe in their own communities. And how will we agree on what to teach our children if there is no common public forum where we can exchange ideas?
The big advantage of Musk taking over Twitter is that it will focus attention on new ways to improve content moderation. The frustration, for many, is that aside from offering advice and demanding transparency, there isn’t much the public or policymakers can do to influence Musk’s decision about what to do with his new candy store. He owns the platform and as is the case in the business world in general, he is free to make whatever decisions he wants.