Twitter's New Rules Aim To Prevent Confusion Around The 2020 Vote

Twitter says it will crack down on attempts to undermine faith in the November election or incite unrest.
Olivier Douliery / AFP via Getty Images

Twitter is putting new restrictions on election-related content, including labeling or removing posts that claim victory before results are official or attempt to disrupt the peaceful transfer of power.

"We will not permit our service to be abused around civic processes, most importantly elections," the company said in a blog post Thursday.

The moves come as pressure is ramping up on social media platforms to halt the spread of misinformation and prevent interference in November's presidential election, including from American politicians such as President Trump.

Twitter said it would label or remove "false or misleading information intended to undermine public confidence in an election or other civic process."

That includes attempts to cause confusion about laws and regulations, unverified claims about election rigging and ballot tampering, and interfering with results by "inciting unlawful conduct to prevent a peaceful transfer of power or orderly succession." The new rules will take effect on Sept. 17.

"The goal is to further protect against content that could suppress the vote and help stop the spread of harmful misinformation that could compromise the integrity of an election or other civic process," the company wrote.

It said any attempt to abuse Twitter around the election, "both foreign and domestic," will be met with "strict enforcement of our rules, which are applied equally and judiciously for everyone."

A Twitter spokesman said the decision whether to label a post or remove it entirely will depend on how specific the false claim is and whether it is likely to cause real-world harm. Examples of such harm include encouraging people not to vote, to vote twice or on the wrong day, or not to participate in the census.

Labeled posts will indicate that the claims are disputed and direct users to sources of verified information. Twitter also reduces their reach on its platform — meaning people are less likely to see labeled posts in their feeds.

Twitter's announcement comes a week after Facebook said it was tightening its policies, including deleting claims that people will get COVID-19 if they vote and labeling posts attempting to "delegitimize" the election outcome.

Facebook said it will also not accept any new political ads in the week before the election. Twitter banned political advertising entirely last year.

Executives at both Twitter and Facebook said they are worried about the election and, in particular, the potential for confusion, disruption and even violence if the outcome is not immediately clear, as many experts say is likely because of increased mail-in voting during the pandemic.

"There's a lot of opportunities to introduce fear, uncertainty and doubt at strategic moments," Yoel Roth, Twitter's head of site integrity, said last month during a panel discussionon election security.

Both companies have cracked down on foreign interference, including last week removing a campaign linked to Russian state actors who had spread disinformation during the 2016 election.

This time around, the social media companies are also grappling with the spread of domestic misinformation, including from Trump. Twitter and Facebook have labeled and in a few cases removed recent posts in which the president made false claims about the coronavirus and about voting by mail.

Editor's note: Facebook is among NPR's financial supporters.

Copyright 2021 NPR. To see more, visit https://www.npr.org.

Shannon Bond is a business correspondent at NPR, covering technology and how Silicon Valley's biggest companies are transforming how we live, work and communicate.