Twitter, Facebook and YouTube Survived Election Day. More Tests Loom.

OAKLAND, Calif. — For months, Twitter, Facebook and YouTube prepared to clamp down on misinformation on Election Day.

On Tuesday, most of their plans went off without a hitch. The social platforms added labels to misleading posts by President Trump and notified their users that there was no immediate outcome to the presidential race. On television, news anchors even cited fact-checks similar to those made by Twitter and Facebook.

Then came Wednesday. With ballots still being counted and no clear result, the flow of misinformation shifted from seeding doubts about the vote to false claims of victory. Over the course of the day, Twitter labeled several tweets by Mr. Trump as misleading about the result of his race, and did the same to tweets from others in his circle, such as Eric Trump and the White House press secretary, Kayleigh McEnany. Facebook and YouTube used their home pages to show people accurate information about the election.

The actions reinforced how even a smooth performance on Election Day did not mean the social media companies could relax in their fight against a relentless flow of toxic content. In fact, the biggest tests for Facebook, Twitter and YouTube are still looming, misinformation researchers said, as false narratives may surge until a final result in the presidential race is certified.

“What we actually saw on Election Day from the companies is that they were extremely responsive and faster than they’ve ever been,” said Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab. But now, he said, misinformation was solely focused on the results and undermining them.

“You have a hyperfocused audience and a moment in time where there is a huge amount of uncertainty, and bad actors can use that opportunistically,” he said.

Twitter said it was continuing to monitor for misinformation. Facebook said, “Our work isn’t done — we’ll stay vigilant and promote reliable information on Facebook as votes continue to be counted.” YouTube said it also was on alert for “election-related content” in the coming days.

The companies had all braced for a chaotic Election Day, working to avoid a repeat of 2016, when their platforms were misused by Russians to spread divisive disinformation. In recent months, the companies had rolled out numerous anti-misinformation measures, including suspending or banning political ads, slowing down the flow of information and highlighting accurate information and context.

On Tuesday, as Americans voted across the country, falsehoods about broken voting machines and biased poll workers popped up repeatedly. But the companies weren’t tested until Mr. Trump — with early results showing how tight the race was — posted on Twitter and Facebook just before 1 a.m. Eastern time to baselessly lash out at the electoral process.

“They are trying to STEAL the Election,” Mr. Trump posted on the sites, without specifying whom he meant.

Twitter moved quickly, hiding Mr. Trump’s inaccurate tweet behind a label that cautioned people that the claim was “disputed” and “might be misleading about an election or other civic process.” Twitter, which had started labeling Mr. Trump’s tweets for the first time in May, also restricted users’ ability to like and share the post.

On Wednesday morning, Twitter added more labels to posts from Mr. Trump. In one, he tweeted that his early leads in Democratic states “started to magically disappear.” In another message, Mr. Trump said unnamed people were working to make his lead in the battleground state of Pennsylvania “disappear.”

Twitter also applied other labels to posts that falsely asserted victory. One was added to a post by Ben Wikler, head of the Democratic Party of Wisconsin, in which he asserted prematurely that Joseph R. Biden Jr. had won the state. The Associated Press and other news outlets later called Wisconsin for Mr. Biden, though Mr. Trump called for a recount.

On Wednesday afternoon, Twitter also affixed context to tweets from Eric Trump, one of Mr. Trump’s sons, and Ms. McEnany when they preemptively claimed that Mr. Trump had won in Pennsylvania, even though the race there had not been called. The company also fact-checked other assertions from Mr. Trump claiming victory in several battleground states such as North Carolina and Georgia, where the race has not been called, and restricted his false statements about voter fraud from being shared.

“As votes are still being counted across the country, our teams continue to take enforcement action on tweets that prematurely declare victory or contain misleading information about the election broadly,” Twitter said.

Facebook took a more cautious approach. Mark Zuckerberg, its chief executive, has said he has no desire to fact-check the president or other political figures because he believes in free speech. Yet to prevent itself from being misused in the election, Facebook said it would couch premature claims of victory with a notification that the election had yet to be called for a candidate, if necessary.

Unlike Twitter, Facebook did not restrict users from sharing or commenting on Mr. Trump’s posts. But it was the first time Facebook had used such labels, part of the company’s plan to add context to posts about the election. A spokesman said the company “planned and prepared for these scenarios and built the essential systems and tools.”

YouTube, which is not used regularly by Mr. Trump, faced fewer high-profile problems than Twitter and Facebook. All YouTube videos about election results included a label that said the election might not be over and linked to a Google page with results from The Associated Press.

But the site did encounter a problem early on Tuesday night when several YouTube channels, one with more than a million subscribers, said they were livestreaming election results. The live streams actually showed a graphic projecting an election outcome with Mr. Biden leading. They were also among the first results that appeared when users searched for election results.

After media reports pointed out the issue, YouTube removed the video streams, citing its policy prohibiting spam, deceptive practices and scams.

On Wednesday, One America News Network, a conservative cable news network with nearly a million subscribers on YouTube, also posted a video commentary to the site claiming that Mr. Trump had already won the election and that Democrats were “tossing Republican ballots, harvesting fake ballots and delaying results” to cause confusion. The video has been viewed more than 280,000 times.

Farshad Shadloo, a YouTube spokesman, said the video did not violate the company’s policy regarding misleading claims about voting. He said the video carried a label that the election results were not final. YouTube added that it had removed ads from the video because it did not allow creators to make money off content that undermined “confidence in elections with demonstrably false information.”

Alex Stamos, director of the Stanford Internet Observatory, said the tech companies still had a fight ahead against election misinformation, but were prepared for it.

“There will always be a long tail of disinformation, but it will become less impactful,” he said. “They are still working, for sure, and will try to maintain this staffing level and focus until the outcome is generally accepted.”

But Fadi Quran, campaign director at Avaaz, a progressive nonprofit that tracks misinformation, said Facebook, Twitter and YouTube needed to do more.

“Platforms need to quickly expand their efforts before the country is plunged into further chaos and confusion,” he said. “It is a democratic emergency.”

Facebook Widens Ban on Political Ads as Alarm Rises Over Election

SAN FRANCISCO — Over the past few weeks, Mark Zuckerberg, Facebook’s chief executive, and his lieutenants have watched the presidential race with an increasing sense of alarm.

Executives have held meetings to discuss President Trump’s evasive comments about whether he would accept a peaceful transfer of power if he lost the election. They watched Mr. Trump tell the Proud Boys, a far-right group that has endorsed violence, to “stand back and stand by.” And they have had conversations with civil rights groups, who have privately told them that the company needs to do more because Election Day could erupt into chaos, Facebook employees said.

That has resulted in new actions. On Wednesday, Facebook said it would take more preventive measures to keep political candidates from using it to manipulate the election’s outcome and its aftermath. The company now plans to prohibit all political and issue-based advertising after the polls close on Nov. 3 for an undetermined length of time. And it said it would place notifications at the top of the News Feed informing people that no winner had been decided until a victor was declared by news outlets.

“This is shaping up to be a very unique election,” Guy Rosen, vice president for integrity at Facebook, said in a call with reporters on Wednesday.

Facebook is doing more to safeguard its platform after introducing measures to reduce election misinformation and interference on its site just last month. At the time, Facebook said it planned to ban new political ads for a contained period — the week before Election Day — and would act swiftly against posts that tried to dissuade people from voting. Mr. Zuckerberg also said Facebook would not make any other changes until there was an official election result.

But the additional moves underscore the sense of emergency about the election, as the level of contentiousness has risen between Mr. Trump and his opponent, Joseph R. Biden Jr. On Tuesday, to help blunt further political turmoil, Facebook also said it would remove any group, page or Instagram account that openly identified with QAnon, the pro-Trump conspiracy movement.

For years, Facebook has been striving to avoid another 2016 election fiasco, when it was used by Russian operatives to spread disinformation and to destabilize the American electorate. Mr. Zuckerberg has since spent billions of dollars to hire new employees for the company’s “integrity” and security divisions, who identify and clamp down on interference. He has said the amount of money spent on securing Facebook exceeded its entire revenue of roughly $5.1 billion during its first year as a public company in 2012.


“We believe that we have done more than any other company over the past four years to help secure the integrity of elections,” Mr. Rosen said.

Yet how successful the efforts have been is questionable. The company continues to find and take down foreign interference campaigns, including three Russian disinformation networks as recently as two weeks ago.

Domestic misinformation has also mushroomed, as Facebook has said it will not police speech from politicians and other leading figures for truthfulness. Mr. Zuckerberg, who supports unfettered speech, has not wavered from that position as Mr. Trump has posted falsehoods and misleading comments on the site.

For next month’s election, Facebook has gamed out almost 80 scenarios — what technology and security workers call “red teaming” exercises — to figure out what could go wrong and to protect against the situations. It also updated its policies to outlaw certain types of statements and threats from elected officials, capped by last month’s sweeping set of changes.

But after weeks of Mr. Trump declining to say he would accept the election’s outcome, while also directing his supporters to “watch” the polls, Facebook decided to ramp up protective measures.

Asked why the company was acting now, Facebook executives said they were “continuing to evaluate and plan for different scenarios” with the election.

The open-ended ban on political advertising is especially significant, after Facebook resisted calls to remove the ads for months. Last month, the company had said it would stop accepting new political ads only in the week before Election Day, so existing political ads would continue circulating. New political ads could have resumed running after Election Day.

But Facebook lags behind other social media companies in banning political ads. Jack Dorsey, Twitter’s chief executive, banned all political ads from the service a year ago because, he said, they could rapidly spread misinformation and had “significant ramifications that today’s democratic infrastructure may not be prepared to handle.” Last month, Google said it, too, would ban all political and issue ads after Election Day.

Mr. Zuckerberg has said that ads give less well-known politicians the ability to promote themselves, and that eliminating those ads could hurt their chances at broadening their support base online.

Facebook also said it would rely on a mix of news outlets, including Reuters and The Associated Press, to determine whether a candidate had secured the presidency. Until those news organizations called the race, Facebook said, it would place notifications in the News Feed to say no candidate had won. That buttresses what the company had said it would do last month, when it announced that it would attach labels to posts redirecting users to Reuters if Mr. Trump or his supporters falsely claimed an early victory.

To tamp down on potential intimidation at ballot boxes, Facebook also plans to remove posts that call for people to engage in poll watching “when those calls use militarized language or suggest that the goal is to intimidate, exert control, or display power over election officials or voters.”

Mr. Trump and others have talked about watching polls in recent weeks. In a debate with Mr. Biden last week, Mr. Trump urged his supporters to “go into the polls and watch very carefully” on Election Day. His son, Donald Trump Jr., said he wanted to see an “army for Trump” swarming the polls, raising concerns about the threat of violence at the ballot box.

Facebook, which has been criticized for unevenly removing posts and inconsistently enforcing its policies against toxic content, said it had already taken down many posts where people were trying to interfere with the vote. Between March and September, it removed more than 120,000 posts from Facebook and Instagram in the United States because the messages violated its voter interference policies.

Some researchers said Facebook was still not going far enough.

“If we are to believe that Facebook will faithfully enforce its own new policies, then they should take down the posts of the powerful users — including the president’s son — who have already called for violent intimidation around voting and on Election Day,” said Shannon McGregor, a senior researcher with the Center for Information, Technology, and Public Life at the University of North Carolina, Chapel Hill.

The company said that it would not shy away from eliminating more posts as the election approaches. On Tuesday, it took down a post from Mr. Trump in which he falsely claimed the flu was more deadly than the coronavirus.

“I want to underscore that we remove this content regardless of who posts it,” said Monica Bickert, head of global policy management at Facebook. “That includes the president.”