With just weeks remaining before the 2020 U.S. elections, Twitter is again changing how it flags tweets it determines contain misinformation.
But the company will soon handle those tweets differently, requiring users to click through a pop-up that warns them of misinformation before they can see them. Starting next week, the warning will apply to any tweet containing misinformation about the election or voting that comes from a U.S. political figure or a U.S.-based account with more than 100,000 followers, or that “obtain[s] significant engagement.” Users will also only be able to quote tweet such a tweet, not like or retweet it.
“We expect this will further reduce the visibility of misleading information, and will encourage people to reconsider if they want to amplify these Tweets,” Vijaya Gadde, Twitter’s legal, policy, trust and safety lead, and Kayvon Beykpour, Twitter’s product lead, wrote in a blog post.
Most notably in the U.S., a similar-looking warning was applied in June to a tweet from President Trump encouraging violence against protesters in Minneapolis.
The platform also introduced temporary measures, starting on Oct. 20 and lasting at least until the conclusion of the U.S. elections, to keep misinformation from its feeds. For instance, any time a user tries to retweet something on the platform, they will be prompted to quote tweet instead and add their own commentary. Twitter acknowledged that it could add “some extra friction” so harmful information doesn’t spread as rapidly.
“We hope it will encourage everyone to not only consider why they are amplifying a tweet, but also increase the likelihood that people add their own thoughts, reactions and perspectives to the conversation,” Gadde and Beykpour said. However, there’s a workaround: The quote tweet will still go through even if users don’t write anything in the prompt, effectively looking like a traditional retweet in users’ feeds.
The company will also stop pushing tweets into feeds unless they are retweeted by accounts users follow. Currently, Twitter users often see tweets in their feeds labeled as “liked by” or “followed by” people they follow; going forward, those tweets will no longer surface that way.
Additionally, any trending information in the “For You” tab on Twitter will have a label with additional context from Twitter’s team. “This will help people more quickly gain an informed understanding of the high volume public conversation in the US and also help reduce the potential for misleading information to spread,” Gadde and Beykpour wrote. The company only began adding context to trending topics in September.
Facebook and Google also made temporary changes to their platforms recently, given the expectation that, due to the Covid-19 pandemic, it may take longer to finalize election results in the U.S. this November. Google and Facebook each banned election-related ads across their ad networks after the polls close on Nov. 3; neither specified an end date, though each company said publicly that it expects the stoppage to last about one week.
Twitter, notably, banned political ads from its platform last fall, saying that “political message reach should be earned, not bought.”