Facebook has a midterm strategy. Trump won’t be part of it.

“We’re going to stick to that timeline,” Nick Clegg, president of global affairs at Meta, Facebook’s parent company, said in an interview. Facebook blocked Trump after posts the company said violated its policy against inciting violence during the deadly riots at the Capitol on Jan. 6, 2021. The company later set a date of Jan. 7, 2023, for a decision on whether to reinstate him.

If Trump announces he’s running for president in 2024, it could increase outside pressure on Facebook to make a call sooner. Many Republicans have already argued that the company is unfairly silencing Trump on a platform used by millions of Americans, even as potential Trump opponents face no similar restrictions. Meanwhile, some of Trump’s critics have called for a permanent ban.

The debate over how to handle a potential Trump candidacy also points to new political battles ahead for social media platforms, both in November and in 2024, as they try to avoid a repeat of the misinformation that plagued the 2020 vote and helped fuel the violence that followed.

Clegg’s remarks came as Facebook released its plan for handling political advertising and misinformation in the midterms, an approach largely in line with its handling of the 2020 election.

It was the latest in a series of announcements from social media giants about their preparations for the fall election. Less than three months before the midterms, Twitter last week announced that it is starting to label false information about voting — a step it last took before the 2020 election — and Google updated its algorithms to prioritize search results from authoritative sources.

Meta and other social media companies have come under intense scrutiny for their role in spreading misinformation and disinformation leading up to the 2016 presidential election, when Russian-linked accounts bought $150,000 worth of ads on Facebook alone to influence the election results. In response, Facebook, Twitter and Google’s YouTube instituted new election-related disinformation policies — revised in the 2018 and 2020 election cycles — to fact-check and flag falsehoods related to voting and election results.

These new policies were put to the test during the attack on the US Capitol on January 6, 2021. After numerous inflammatory posts that day, all three platforms blocked Trump for violating their policies against incitement to violence. Twitter permanently banned Trump, and YouTube said it would indefinitely block his account.

Without access to his typical megaphones, Trump launched his own social media network, Truth Social, although he failed to amass the following he once had on Facebook.

Facebook’s approach to the midterms will be familiar to anyone who used the site in 2020. As it did that year, the company will block new political, election and issue-based ads during the final week of the midterm campaign. But unlike in 2020, Clegg said the company will not allow any modification of ads or how they are targeted in the final week.

The company plans to lift the restriction the day after the election. That differs from the 2020 election, when Facebook did not accept new political, election or issue ads until March 4, 2021 (except for those in the Georgia Senate runoff) to prevent confusion and abuse after the presidential election and the Jan. 6 attack on the Capitol. Clegg said Facebook is not planning to extend the ad ban this time, but “if circumstances change, then we have to change our position as well, and obviously we reserve the ability and the right to do that.”

Also, as in 2020, the company will remove misinformation about voting — including posts about incorrect voting dates, times and locations, as well as falsehoods about who can vote and calls for violence related to voting, registration or the election result. It is working with 10 fact-checking partners in the US to address viral misinformation, including five that cover Spanish-language content. This marks an increase from just three Spanish-language groups in 2020 and appears to be an acknowledgment that fake content is also spreading in non-English languages on the platform.

“I think there has rightly been a lot of scrutiny about how we handle viral information in Spanish as well as in English,” Clegg said.

Clegg, a former deputy prime minister of the United Kingdom, described Facebook’s preparation for the midterm elections as a world apart from 2016, when Facebook and other social media companies came under fire for allowing Kremlin-linked trolls to abuse their platforms. He said the company’s “state of vigilance” is “much, much higher than it was the last time there were midterms, in 2018. But I think it’s appropriate given the circumstances as they’ve changed since then.”

“Is it perfect? Is it infallible?” Clegg asked. “Politics change all the time, the way people campaign changes all the time. … My crystal ball is no clearer than yours about how things will play out. But in terms of policies, commitment, resources, headcount, ingenuity, I think we’re just… I’d go as far as to say we’re fundamentally a different company than we were in 2016.”

Trump has maintained his lie that the 2020 election was stolen, a claim that has been an animating force in Republican primaries across the country.

Asked whether he considered the former president as much of a risk to public safety as the company judged him to be at the time the ban was imposed, Clegg said: “Look, I work for an engineering company. We are an engineering company. We are not going to start giving a running commentary on the politics of the United States.”

Regarding Trump’s ban, he said the company “will look at the situation as best we can understand it” but that “forcing Silicon Valley companies to provide constant commentary on political developments in the meantime is not really going to help shed light on that decision when we have to make it.”
