Election misinformation is spreading with and without social media’s help


In anticipation of trouble during the final stages of the election, Facebook, Twitter, and YouTube added new labels that warn users about misinformation. In some cases, they also temporarily tightened rules for what users can post.

But when President Trump made a last-minute speech on Thursday, two cable networks aired the briefing in full, letting him spread unsupported claims of voter fraud and election theft.

“I think ultimately it is a little bit of a double standard here,” said Angelo Carusone, president of the left-leaning misinformation watchdog Media Matters. “If [Trump] said that online everyone would be criticizing Facebook and Twitter for carrying it live.” 

ABC, CBS, and NBC cut away during the speech so that their anchors could explain that the President’s claims were unsubstantiated. But CNN and Fox aired the nearly 17-minute statement uninterrupted. CNN displayed explanations at the bottom of the screen clarifying that Trump’s statements were false. Fox did not.

CNN anchor Jake Tapper quickly rebuked the speech, calling it “shockingly disappointing.” And in a segment following Trump’s address, fellow CNN anchor Anderson Cooper compared Trump to “an obese turtle on his back flailing in the sun realizing his time is over.”

Carusone took issue with CNN’s decision to broadcast the President’s full statement along with Cooper’s “insult” to Trump, saying, “If that’s the best commentary you have after that wraps up, then you shouldn’t have aired it.”

Instead, Carusone said, the two channels could have added a short broadcast delay, as networks do during live events such as the Super Bowl, so that if something harmful airs, they can pull the plug.

Erin Simpson, associate director of tech policy at the nonpartisan public policy organization Center for American Progress, said she’s seen American media adjust to handling live misinformation from government officials over the past few years. That became especially apparent when some outlets stopped broadcasting the White House coronavirus briefings after officials began to veer off topic and spread misinformation.

Overall, Simpson felt most television news handled Trump’s speech well, although she was concerned that Fox didn’t challenge the President’s claims. That is especially concerning if Biden wins the election and Trump, during his final weeks in office, is left to speak unchallenged, she added.

“It will be extremely damaging if they let Trump use their platform to spread hate or delegitimize the election,” Simpson said. 

Of course, social media had its own problems controlling misinformation over the past few days. Twitter obscured several of Trump’s tweets with a warning about misinformation that users must click through. It also stopped letting users retweet such posts without adding their own commentary to contextualize them.

But Twitter has been slow to react to other tweets that included misinformation, allowing them to amass thousands of likes and retweets before they were labeled. For example, Trump’s son, Eric, on Wednesday tweeted that his father had won Pennsylvania even though the votes there hadn’t yet been fully counted and no major news outlet had declared a winner. The post was retweeted more than 30,000 times before Twitter finally labeled it.

At the same time, Facebook labeled several posts containing misinformation with a relatively generic message noting that vote counting was ongoing and providing a link to more authoritative information. Facebook neglected to spell out whether a post contained misinformation or misleading claims. The problematic posts continued to surface widely to users until Thursday, when the company started limiting their distribution.

Facebook also let Trump ally Stephen Bannon livestream his online show War Room a day after he suggested on his show that Dr. Anthony Fauci, the nation’s top infectious disease expert, should be beheaded. Facebook removed that video from its service but let Bannon continue to broadcast, in contrast to Twitter, which kicked Bannon off its service.

Simpson said she’s most concerned about YouTube, which has avoided much of the public backlash against social media but arguably features just as much problematic content. For videos related to the election, YouTube adds a label at the bottom saying election results may not be final and directing users to Google for more information. On Wednesday, CNBC reported that the company failed to remove a video from the conservative news service One America News Network that claimed Trump had won the election and that also spread unsupported claims of voter fraud.

The bigger issue is that though the election will conclude, political division and misinformation won’t. Jessica Gonzalez, co-CEO of the advocacy nonprofit Free Press, said that she’s concerned about the amount of misinformation online, especially what comes directly from politicians.  

“We’re going to have to keep educating, fighting, investing in journalism and journalists of color,” she said. “We’ve got a lot of work to do.”
