Members of Twitter’s Trust and Safety Council — a group of 100 organizations that work on issues including harassment, content moderation and suicide prevention on the platform — say they are unsure about their future and whether Elon Musk, who took over Twitter last week, even knows they exist.
“Now, I feel like we’re in a different universe,” Danielle Citron, vice president at the Cyber Civil Rights Initiative, told Motherboard. Citron said that despite one of the council’s regular meetings being on the calendar, her organization has not heard from Twitter, and that Twitter staff appear to be “ghosting” them on updates.
Bloomberg reported on Monday that most people working in Twitter’s Trust and Safety organization are blocked from accessing the internal tools used to moderate content and are “currently unable to modify or penalize accounts that violate rules around misleading information, offensive posts and hate speech,” citing anonymous sources familiar with the matter. Musk’s first act as the new owner of Twitter was to dismiss its senior executives, including CEO Parag Agrawal, CFO Ned Segal, policy executive Vijaya Gadde and general counsel Sean Edgett. Gadde worked closely with the council, according to Citron.
Musk has said he wants to form his own content moderation council, with “very diverse viewpoints.” Musk tweeted last week that “no significant content decisions or account resets will occur before that council meets.”
On Wednesday, he tweeted that he had talked with people at the Anti-Defamation League, Color of Change and the NAACP, among others, about “how Twitter will continue to fight hate and harassment and enforce its election integrity policies.”
After Musk took over Twitter, the platform saw an increase in hate speech, according to Twitter’s head of Safety and Integrity, Yoel Roth.
It is unclear where all this leaves the existing Trust and Safety Council. Twitter did not respond to a request for comment about the council’s status.
“Unfortunately I’m not sure that Elon Musk even knew about the existence of the [Trust and Safety] Council until now,” Alex Holmes, deputy CEO at the Diana Award Anti-Bullying Campaign and a council member, told Motherboard. “The Twitter Trust and Safety Council is a dedicated and passionate global group made up of unpaid representatives from NGOs, along with safety, hate speech and free speech experts, who are there to be critical friends. We have often given our advice on future products and tools, updates, and safety issues. We are not a board or a watchdog, and we are not involved in any moderation decisions; instead we support a safe and healthy platform that is inclusive.”
Twitter formed the Trust and Safety Council in 2016 — as “a new and foundational part of our strategy to ensure that people feel safe expressing themselves on Twitter,” according to its announcement — with more than 40 organizations and experts from 13 regions making up its inaugural members. The council held its first annual summit the following year at Twitter’s headquarters in San Francisco, where then-CEO Jack Dorsey attended and heard presentations from members. There are currently 100 organizations, representing five different focus areas — content governance, suicide prevention, child sexual exploitation, online safety and bullying, and digital and human rights — listed on the council’s website.
“I felt very connected, like I could always go to Vijaya,” Citron said. “It felt really responsive.”
Emma Llansó, director of the Center for Democracy and Technology’s Free Expression Project and a council member, told Motherboard that her organization has not heard from Twitter since late September.
“From my experience, Council members are all committed to helping Twitter be more responsive to abuse and more transparent and fair in how they enforce their policies,” Llansó said. “There’s still a long way to go, but Twitter staff have made an ongoing effort to improve the experiences of its most vulnerable users. It’s hard to say exactly what Musk’s plans are for Twitter’s trust and safety work, but it’s troubling that he’s talking about taking the company in a different direction.”
Before taking over the company, Musk often complained about what he saw as the platform’s lack of “free speech,” but he has only defined his vision of free speech as “that which matches the law,” as he wrote in a tweet in April.
“I don’t think he has anything to do with free speech. I think he’s about ‘free speech I like,’” Citron said.
Twitter has always had major flaws in how it handles privacy, security, and user trust issues. It has been widely criticized for being reluctant to address hate speech and trolling while introducing features that no one asked for. The council itself accused Twitter of not listening or being responsive enough in 2019, in a letter to Dorsey obtained by Wired. But even with these existing issues, disbanding a group that has done years of safety work, at a critical moment in the platform’s history, would be a mistake, council members say.
“It would be a shame to see the work and passion of this global group disintegrate, and I hope there is a way to continue working with Twitter under new leadership,” Holmes said.
“If Twitter dissolves the Council, I worry that it could signal a pullback by Twitter in seeking outside expertise, and a decision to de-prioritize core trust and safety work,” Llansó said. “Twitter should have a process for engaging outside experts and perspectives to better inform its work.”