Facebook, Twitter on 'high alert' for election misinformation
The social media giants assured the American electorate they're ready for bad actors
Social media companies are on "high alert" for disinformation both before and after November's presidential election.
Leaders of the industry told NPR Thursday that monitoring disinformation was a top priority.
Twitter's head of site integrity, Yoel Roth, has seen many of the ways bad actors can abuse his platform, and he told NPR Thursday that the company has gamed out its response.
"We really undertook a process to try and predict what the worst-case scenarios were, based on what we had seen previously in 2016, 2018 and in elections around the world, as well as some of the things that we thought were likely to happen in the United States this time around," said Roth.
With nothing off-limits and the U.S. Postal Service warning about delayed ballot delivery, tech giants like Facebook anticipate that uncertainty could create an information vacuum where conspiracy theories may thrive.
"We are on high alert before the election and after the election," Facebook Chief Operating Officer Sheryl Sandberg told NPR's All Things Considered earlier this week. "We are worried about misinformation. We are worried about people claiming election results [prematurely]."
Twitter and Facebook are no strangers to false posts about the 2020 presidential election, and both platforms can serve as dangerous echo chambers for the easily persuaded.
While 2020 would not be the first year Americans have had to wait to find out who won the general election -- in 2000, it took 36 days for a winner to emerge -- experts warn that this year is different.
"People couldn't be mobilized on social media in this way" in years past, disinformation expert Clint Watts told NPR.
Twenty years ago, the most advanced phone was the Nokia 3310. The original iPhone wasn't released until 2007 -- six years after the September 11 terror attacks changed U.S. national security for good.
Facebook wasn't founded until 2004, and Twitter not until 2006. In the aftermath of claims that they were exploited by Russian agents in 2016, the firms have created new rules against posts that undermine election results, and Facebook announced it will reject ads that claim premature victory.
Google, the owner of YouTube, said it won't allow any political advertising at all after polls close.
Twitter and Facebook also pointed out that they are using both artificial intelligence and human review to stop harmful content from spreading.
However, as NPR notes, stopping messages from going viral runs counter to the way the platforms work, and many skeptics worry more about the enforcement of policies than the policies themselves.
In response, Facebook has created its own Independent Oversight Board, slated to launch later this month.
Earlier in the week, 2020 Democratic presidential nominee Joe Biden called Facebook "the nation's foremost propagator of disinformation about the voting process" because of its decision to label, rather than remove, President Trump's posts attacking mail-in ballots.
Facebook insisted it applies its policies fairly.
"We are taking a lot of things down," Sandberg added. "We obviously have to let candidates speak."