
Roiled by Election, Facebook Struggles to Balance Civility and Growth

SAN FRANCISCO — In the tense days after the presidential election, a team of Facebook employees presented the chief executive, Mark Zuckerberg, with an alarming finding: Election-related misinformation was going viral on the site. President Trump was already casting the election as rigged, and stories from right-wing media outlets with false and misleading claims about discarded ballots, miscounted votes and skewed tallies were among the most popular news stories on the platform. In response, the employees proposed an emergency change to the site’s news feed algorithm, which helps determine what more than two billion people see every day. It involved …



How Misinformation ‘Superspreaders’ Seed False Election Theories

On the morning of Nov. 5, Eric Trump, one of the president’s sons, asked his Facebook followers to report cases of voter fraud with the hashtag #StoptheSteal. His post was shared over 5,000 times.


By late afternoon, the conservative media personalities Diamond and Silk had shared the hashtag along with a video claiming voter fraud in Pennsylvania. Their post was shared over 3,800 times.


That night, the conservative activist Brandon Straka asked people to protest in Michigan under the banner #StoptheSteal. His post was shared more than 3,700 times.


Over the next week, the phrase “Stop the Steal” was used to promote dozens of rallies that spread false voter fraud claims about the U.S. presidential election.

New research from Avaaz, a global human rights group, the Elections Integrity Partnership and The New York Times shows how a small group of people — mostly right-wing personalities with outsized influence on social media — helped spread the false voter-fraud narrative that led to those rallies.

That group, like the guests at a large wedding held during the pandemic, acted as “superspreaders” of misinformation around voter fraud, seeding falsehoods that included claims that dead people voted, that voting machines had technical glitches and that mail-in ballots were not counted correctly.

“Because of how Facebook’s algorithm functions, these superspreaders are capable of priming a discourse,” said Fadi Quran, a director at Avaaz. “There is often this assumption that misinformation or rumors just catch on. These superspreaders show that there is an intentional effort to redefine the public narrative.”

Across Facebook, there were roughly 3.5 million interactions — including likes, comments and shares — on public posts referencing “Stop the Steal” during the week of Nov. 3, according to the research. Of those, the profiles of Eric Trump, Diamond and Silk and Mr. Straka accounted for a disproportionate share — roughly 6 percent, or 200,000, of those interactions.

While the group’s impact was notable, it did not come close to the spread of misinformation promoted by President Trump since then. Of the 20 most-engaged Facebook posts over the last week containing the word “election,” all were from Mr. Trump, according to CrowdTangle, a Facebook-owned analytics tool. All of those claims were found to be false or misleading by independent fact checkers.

The baseless election fraud claims have been used by the president and his supporters to challenge the vote in a number of states. Reports of malfunctioning voting machines, intentionally miscounted mail-in votes and other irregularities affecting the vote were investigated by election officials and journalists, who found no evidence of widespread voter fraud.

The voter fraud claims have continued to gather steam in recent weeks, thanks in large part to prominent accounts. A look at a four-week period starting in mid-October shows that President Trump and the top 25 superspreaders of voter fraud misinformation accounted for 28.6 percent of the interactions people had with that content, according to an analysis by Avaaz.

“What we see these people doing is kind of like setting a fire down with fuel, it is designed to quickly create a blaze,” Mr. Quran said. “These actors have built enough power they ensure this misinformation reaches millions of Americans.”

In order to find the superspreaders, Avaaz compiled a list of 95,546 Facebook posts that included narratives about voter fraud. Those posts were liked, shared or commented on nearly 60 million times by people on Facebook.

Avaaz found that just 33 of the 95,546 posts were responsible for over 13 million of those interactions. Those 33 posts had created a narrative that would go on to shape what millions of people thought about the legitimacy of the U.S. elections.

A spokesman for Facebook said the company had added labels to posts that misrepresented the election process and was directing people to a voting information center.

“We’re taking every opportunity to connect people to reliable information about the election and how votes are being counted,” said Kevin McAlister, a Facebook spokesman. The company has not commented on why accounts that repeatedly share misinformation, such as Mr. Straka’s and Diamond and Silk’s, have not been penalized. Facebook has previously said that President Trump, along with other elected officials, is granted a special status and is not fact-checked.

Many of the superspreader accounts had millions of interactions on their Facebook posts over the last month, and have enjoyed continued growth. The accounts were active on Twitter as well as Facebook, and increasingly spread the same misinformation on new social media sites like Parler, MeWe and Gab.

Dan Bongino, a right-wing commentator with a following of nearly four million people on Facebook, had over 7.7 million interactions on Facebook the week of Nov. 3. Mark Levin, a right-wing radio host, had nearly four million interactions, and Diamond and Silk had 2.5 million. A review of their pages by The Times shows that a majority of their posts have focused on the recent elections and the voter fraud narratives around them.

None of the superspreaders identified in this article responded to requests for comment.

One of the most prominent false claims promoted by the superspreaders was that Dominion voting software deleted votes for Mr. Trump, or somehow changed vote tallies in several swing states. Election officials have found no evidence that the machines malfunctioned, but posts about the machines have been widely shared by Mr. Trump and his supporters.

Over the last week, just seven posts from the top 25 superspreaders of the Dominion voter fraud claim accounted for 13 percent of the total interactions on Facebook about the claim.

Many of those same accounts were also top superspreaders of the Dominion claim, and other voter fraud theories, on Twitter. The accounts of President Trump, his son Eric, Mr. Straka and Mr. Levin were all among the top 20 accounts that spread misinformation about voter fraud on Twitter, according to Ian Kennedy, a researcher at the University of Washington who works with the Elections Integrity Partnership.

Mr. Trump had by far the largest influence on Twitter. A single tweet by the president accusing Dominion voting systems of deleting 2.7 million votes cast for him was shared over 185,000 times, and liked over 600,000 times.

Like the other false claims about voter fraud, Mr. Trump’s tweet carried a label from Twitter warning that the information he was sharing was not accurate.

Twitter, like Facebook, has said that those labels help prevent false claims from being shared and direct people toward more authoritative sources of information.

Earlier this week, BuzzFeed News reported that Facebook employees questioned whether the labels were effective. Within the company, employees have sought out their own data on how well national newspapers performed during the elections, according to one Facebook employee.

They found that both The New York Times and The Washington Post were among the top 25 pages with interactions on the #StoptheSteal hashtag — mainly from readers sharing articles and using the hashtag in those posts.

Combined, the two publications had approximately 44,000 interactions on Facebook under that hashtag. By comparison, Mr. Straka, the conservative activist who shared the call to action on voter fraud, got three times that number of interactions sharing material under the same hashtag on his own Facebook account.

Jacob Silver contributed reporting.


‘Stop the Steal’ Facebook Group Is Taken Down

OAKLAND, Calif. — The first post in the new Facebook group that was started on Wednesday was innocuous enough. “Welcome” to Stop the Steal, it said.

But an hour later, the group uploaded a minute-long video to its Facebook page with a pointed message. The grainy footage showed a crowd outside a polling station in Detroit, shouting and chanting “stop the count.” Below the video, which was quickly shared nearly 2,000 times, members of the group commented “Biden is stealing the vote” and “this is unfair.”

The viral video helped turn the Stop the Steal Facebook group into one of the fastest-growing groups in Facebook’s history. By Thursday morning, less than 22 hours after it was started, it had amassed more than 320,000 users — at one point gaining 100 new members every 10 seconds. As its momentum grew, it caught the attention of Facebook executives, who shut down the group hours later for trying to incite violence.

Even so, the Stop the Steal Facebook group had done its work. In its brief life span, it became a hub for people to falsely claim that the ballot count for the presidential election was being manipulated against President Trump. New photographs, videos and testimonials asserting voter fraud were posted to the group every few minutes. From there, they traveled onto Twitter, YouTube and right-wing sites that cited the unsubstantiated and inaccurate posts as evidence of an illegitimate voting process.


Stop the Steal’s rapid rise and amplifying effects also showed how Facebook groups are a powerful tool for seeding and accelerating online movements, including those filled with misinformation. Facebook groups, which can be public and joined by anyone with a Facebook account, have long been the nerve centers for fringe movements such as QAnon and anti-vaccination activists. And while Stop the Steal has been deleted, other Facebook groups promoting falsehoods about voter fraud have popped up.

“Facebook groups are powerful infrastructure for organizing,” said Renee DiResta, a disinformation researcher at the Stanford Internet Observatory. She added that the Stop the Steal Facebook group helped people coalesce around a baseless belief that the election was being unlawfully taken from Mr. Trump.

Tom Reynolds, a Facebook spokesman, said the social network removed the Stop the Steal group as part of the “exceptional measures” it was taking on the election. “The group was organized around the delegitimization of the election process, and we saw worrying calls for violence from some members of the group,” he said.

Stop the Steal was born on Facebook on Wednesday at 3 p.m. Eastern time as the outcome of the presidential election remained uncertain. About 12 hours earlier, as the vote counts showed a tight race between Mr. Trump and Joseph R. Biden Jr., Mr. Trump had posted without evidence on Facebook and Twitter that “They are trying to STEAL the Election.” Mr. Trump has since repeated that assertion openly in remarks from the White House and on social media.

The idea of a stolen election quickly spread among Mr. Trump’s supporters, including to a Facebook user named Kylie Jane Kremer. Ms. Kremer, 30, a former Tea Party activist, runs a conservative nonprofit called Women for America First. She created the Stop the Steal Facebook group.

In an interview on Thursday from a protest in Atlanta, Ms. Kremer said she had started the Facebook group after speaking with conservative activists and seeing social media posts about voter fraud. She said she wanted to help organize people across the United States on the issue and centralize discussions over protests and rallies.

“I knew other people saw this the same as I did, that there were people out there trying to steal the election from the rightful person,” Ms. Kremer said, referring to Mr. Trump. “I wanted us to be able to organize to take action.”

Once the Facebook group was live, she said, it took off. Hundreds of members joined within the first hour. Then people began sharing videos — including the one showing people chanting “stop the count” in Detroit — and photographs, which were quickly shared to other Facebook pages and groups.

“It was like lightning in a bottle,” Ms. Kremer said. “The group grew so fast we were struggling to keep up with the people trying to post.”

Many of the posts shared anecdotal stories claiming voter fraud or intimidation against Mr. Trump’s supporters. One post asserted that poll workers counting the ballots were wearing masks with the Biden campaign’s logo, while another said that Mr. Trump’s supporters were purposefully given faulty ballots that could not be read by machines.

Others posted about violence. One member of the Facebook group wrote on Wednesday, “This is going to take more than talk to fix.” Underneath that post, another member responded with emojis of explosions.

On Thursday morning, the Stop the Steal Facebook group’s growth skyrocketed further, according to data from CrowdTangle, a Facebook-owned social media analytics tool.

That was when right-wing figures such as Jack Posobiec, a pro-Trump activist, and Amy Kremer, Ms. Kremer’s mother and a founder of a group called Women for Trump, began posting about the Facebook group on Twitter. Ali Alexander, a political operative who previously went by the name Ali Akbar, also tweeted dozens of times about the Stop the Steal movement to his 140,000 Twitter followers.

Their messages, which were shared thousands of times, were a rallying cry for people to join the Stop the Steal Facebook group and take action in local protests against voter fraud.

“In just it’s first couple hours, more than 100,000 people joined the Women for America First, Stop the Steal Facebook Group,” wrote Mr. Posobiec. In comments below his post, many people cheered the Facebook group’s popularity.

The tweets helped send more people to Stop the Steal. Interactions with the Facebook group soared to 36 posts a minute on Thursday morning, up from roughly one post a minute, according to CrowdTangle data.

Mr. Posobiec, Mr. Alexander and Amy Kremer did not immediately respond to requests for comment.

At Facebook, executives were notified of the group by Facebook moderators as they began flagging posts for potential calls for violence and protests to disrupt the vote. The company also received calls from journalists about the group and its explosive growth. By midmorning, executives were discussing whether they should remove Stop the Steal, said one employee involved in the discussions who was not authorized to speak publicly.

Facebook took down the group on Thursday at 2 p.m. Eastern.

Ms. Kremer said that she was angry that Facebook had removed her group and that she was in discussions with the company to reinstate it. She accused Facebook, along with other social media companies, of censoring the Stop the Steal movement.

“Facebook had other options,” she said. “They were flagging our posts and we could have worked with them. But this is what they do, they censor.”

Still, Ms. Kremer said that before the group was taken down, its members had successfully organized events in dozens of cities. She has set up another website about voter fraud and was now directing people to it, she said.

On Facebook, dozens of new Stop the Steal groups have been created since the company removed Ms. Kremer’s group. One had nearly 10,000 members. Another had just over 2,000.


Facebook, Alarmed by Discord Over Vote Count, Is Said to Be Taking Action

SAN FRANCISCO — Facebook is planning to enact new measures to make it more difficult for election misinformation to spread virally across its platform, two people with knowledge of the matter said Thursday, as the outcome of the presidential race remained uncertain.

Facebook plans to add more “friction” — such as an additional click or two — before people can share posts and other content, said the people, who requested anonymity because they were not authorized to speak publicly. The company will also demote content on the News Feed if it contains election-related misinformation, making it less visible, and limit the distribution of election-related Facebook Live streams, the people said.

The measures, which could be rolled out as soon as Thursday, are a response to heightened strife and social discord on Facebook after the election on Tuesday, these people said. They said there had been more activity by users and Facebook groups to coordinate potentially violent actions over issues such as voter fraud. President Trump has falsely claimed on social media and in remarks from the White House over the past few days that the election was being “stolen” from him, even while a final result remained unclear.

The changes would be some of the most significant steps taken by Facebook, which has in the past tried to make sharing information as easy as possible so that it can increase engagement on its site. The moves would most likely be temporary, said the people with knowledge of them, and were designed to cool down angry Americans who are clashing on the network.

“As vote counting continues, we are seeing more reports of inaccurate claims about the election,” Facebook said in a statement. As a result, it said, it is “taking additional temporary steps.”

Facebook has been more proactive about clamping down on misinformation in recent months, even as its chief executive, Mark Zuckerberg, has said he does not want to be the arbiter of truth. The company prepared for months for the election. It ran through dozens of possibilities of what might happen on Nov. 3 and afterward in case political candidates or others tried to use the platform to delegitimize the results. The new measures were part of this planning, the people said.

This week, Facebook also suspended political advertising for an indefinite period, and introduced notifications at the top of the News Feed that said no winner had been called in the election.

Other social media companies have also made changes to slow down the way information flows on their networks and to highlight accurate information on their sites. Twitter, which Mr. Trump uses as a megaphone, had labeled 38 percent of his 29 tweets and retweets since early Tuesday with warnings that said he made misleading claims about the electoral process, according to a tally by The New York Times. Last month, Twitter also made it more arduous for people to retweet posts or share links to articles that users had not yet read.

TikTok said it was broadening its fact-checking partnerships for election disinformation, and was updating its policy to better represent what types of content are not allowed on the app. YouTube has used its home page to show people accurate information about the election.

Republicans and Democrats have long criticized Facebook and Mr. Zuckerberg for their stance on misinformation. Mr. Trump and other Republicans have accused Facebook of suppressing and censoring conservative speech, while Democrats have railed against the tech companies for not doing enough to clean up the glut of toxic online misinformation.

On Thursday, as part of a heightened campaign against election-related disinformation and calls to violence, the company also took down a new Facebook group, Stop the Steal, which had more than 320,000 members.

Facebook said that the group had been “organized around the delegitimization of the election process,” and that a number of the group’s members had originated calls for real-world violence.

Some of Facebook’s new measures have precedents. In June, the company added more context about the coronavirus and highlighted accurate information about Covid-19 from health authorities, to prevent falsehoods about it from spreading. WhatsApp and Messenger, two messaging apps owned by Facebook, have capped the number of times a message can be forwarded and have limited reshares of private messages to a maximum of five people.


On Election Day, Facebook and Twitter Did Better by Making Their Products Worse

That gust of wind you felt coming from Silicon Valley on Wednesday morning was the social media industry’s tentative sigh of relief.

For the last four years, executives at Facebook, Twitter, YouTube and other social media companies have been obsessed with a single, overarching goal: to avoid being blamed for wrecking the 2020 U.S. election, as they were in 2016, when Russian trolls and disinformation peddlers ran roughshod over their defenses.

So they wrote new rules. They built new products and hired new people. They conducted elaborate tabletop drills to plan for every possible election outcome. And on Election Day, they charged huge, around-the-clock teams with batting down hoaxes and false claims.

So far, it appears those efforts have averted the worst. Despite the frantic (and utterly predictable) attempts from President Trump and his allies to undermine the legitimacy of the vote in the states where he is losing, there have been no major foreign interference campaigns unearthed this week, and Election Day itself was relatively quiet. Fake accounts and potentially dangerous groups have been taken down quickly, and Facebook and Twitter have been unusually proactive about slapping labels and warnings in front of premature claims of victory. (YouTube was a different story, as evidenced by the company’s slow, tepid response to a video that falsely claimed that Mr. Trump had won the election.)

The week is young, of course, and there’s still plenty of time for problems. Election-related disinformation is already trending up — some of it targeted at Latinos — and will only increase as votes are challenged in the courts, and conspiracy theorists capitalize on all the uncertainty to undermine confidence in the eventual results.

But the platforms’ worst fears haven’t yet materialized. That’s a good thing, and a credit to the employees of those companies who have been busy enforcing their rules.

At the same time, it’s worth examining how Twitter, Facebook and YouTube are averting election-related trouble, because it sheds light on the very real problems they still face.

For months, nearly every step these companies have taken to safeguard the election has involved slowing down, shutting off or otherwise hampering core parts of their products — in effect, defending democracy by making their apps worse.

They added friction to processes, like political ad-buying, that had previously been smooth and seamless. They brought in human experts to root out extremist groups and manually intervened to slow the spread of sketchy stories. They overrode their own algorithms to insert information from trusted experts into users’ feeds. And as results came in, they relied on the calls made by news organizations like The Associated Press, rather than trusting that their systems would naturally bring the truth to the surface.

Image

Nowhere was this shift more apparent than at Facebook, which for years envisioned itself as a kind of post-human communication platform. Mark Zuckerberg, the company’s chief executive, often spoke about his philosophy of “frictionless” design — making things as easy as possible for users. Other executives I talked to seemed to believe that ultimately, Facebook would become a kind of self-policing machine, with artificial intelligence doing most of the dirty work and humans intervening as little as possible.

But in the lead-up to the 2020 election, Facebook went in the opposite direction. It put in place a new, cumbersome approval process for political advertisers, and blocked new political ads in the period after Election Day. It throttled false claims, and put in place a “virality circuit-breaker” to give fact-checkers time to evaluate suspicious stories. And it temporarily shut off its recommendation algorithm for certain types of private groups, to lessen the possibility of violent unrest. (On Thursday, The New York Times reported that the company was taking other temporary measures to tamp down election-related misinformation, including adding more friction to the process of sharing posts.)

All of these changes may, in fact, make Facebook safer. But they also involve dialing back the very features that have powered the platform’s growth for years. It’s a telling act of self-awareness, as if Ferrari had realized that it could only stop its cars from crashing by replacing the engines with go-kart motors.

Image

YouTube didn’t act nearly as aggressively this week, but it has also changed its platform in revealing ways. Last year, it tweaked its vaunted recommendation algorithm to slow the spread of so-called borderline content. And it started promoting “authoritative sources” during breaking news events, to prevent cranks and conspiracy theorists from filling up the search results.

All of this raises the critical question of what, exactly, will happen once the election is over and the spotlight has swiveled away from Silicon Valley. Will the warning labels and circuit-breakers be retired? Will the troublesome algorithms get turned back on? Do we just revert to social media as normal?

Camille François, the chief innovation officer of Graphika, a firm that investigates disinformation on social media, said it was too early to say whether these companies’ precautions had worked as intended. But she conceded that this level of hypervigilance might not last.

“There were a lot of emergency processes put in place at the platforms,” she said. “The sustainability and the scalability of those processes is a fair question to ask.”

Eli Pariser, the author of “The Filter Bubble,” said that the platforms’ work to prevent election interference this year raised bigger questions about how they will respond to other threats.

“These platforms are used for really important conversations every day,” Mr. Pariser said. “If you do this for U.S. elections, why not other countries’ elections? Why not climate change? Why not acts of violence?”

These are the right questions to ask. The social media companies may have gotten through election night without a disaster. But as with the election itself, the real fights are still ahead.



Twitter, Facebook and YouTube Survived Election Day. More Tests Loom.

OAKLAND, Calif. — For months, Twitter, Facebook and YouTube prepared to clamp down on misinformation on Election Day.

On Tuesday, most of their plans went off without a hitch. The social platforms added labels to misleading posts by President Trump and notified their users that there was no immediate outcome to the presidential race. On television, news anchors even cited fact-checks similar to those made by Twitter and Facebook.

Then came Wednesday. With ballots still being counted and the absence of a clear result, the flow of misinformation shifted away from seeding doubts about the vote to false claims of victory. Twitter rapidly labeled several tweets by Mr. Trump over the course of the day as being misleading about the result of his race, and also did the same to tweets from others in his circle, such as Eric Trump and the White House press secretary, Kayleigh McEnany. And Facebook and YouTube used their home pages to show people accurate information about the election.

The actions reinforced that even a smooth performance on Election Day did not mean the social media companies could let up in their fight against a relentless flow of toxic content. In fact, the biggest tests for Facebook, Twitter and YouTube are still looming, misinformation researchers said, as false narratives may surge until a final result in the presidential race is certified.

“What we actually saw on Election Day from the companies is that they were extremely responsive and faster than they’ve ever been,” said Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab. But now, he said, misinformation was solely focused on the results and undermining them.

“You have a hyperfocused audience and a moment in time where there is a huge amount of uncertainty, and bad actors can use that opportunistically,” he said.

Twitter said it was continuing to monitor for misinformation. Facebook said, “Our work isn’t done — we’ll stay vigilant and promote reliable information on Facebook as votes continue to be counted.” YouTube said it also was on alert for “election-related content” in the coming days.

The companies had all braced for a chaotic Election Day, working to avoid a repeat of 2016, when their platforms were misused by Russians to spread divisive disinformation. In recent months, the companies had rolled out numerous anti-misinformation measures, including suspending or banning political ads, slowing down the flow of information and highlighting accurate information and context.


On Tuesday, as Americans voted across the country, falsehoods about broken voting machines and biased poll workers popped up repeatedly. But the companies weren’t tested until Mr. Trump — with early results showing how tight the race was — posted on Twitter and Facebook just before 1 a.m. Eastern time to baselessly lash out at the electoral process.

“They are trying to STEAL the Election,” Mr. Trump posted on the sites, without being specific about who he meant.

Twitter moved quickly, hiding Mr. Trump’s inaccurate tweet behind a label that cautioned people that the claim was “disputed” and “might be misleading about an election or other civic process.” Twitter, which had started labeling Mr. Trump’s tweets for the first time in May, also restricted users’ ability to like and share the post.

On Wednesday morning, Twitter added more labels to posts from Mr. Trump. In one, he tweeted that his early leads in Democratic states “started to magically disappear.” In another message, Mr. Trump said unnamed people were working to make his lead in the battleground state of Pennsylvania “disappear.”

Twitter also applied other labels to posts that falsely asserted victory. One was added to a post by Ben Wikler, head of the Democratic Party of Wisconsin, in which he asserted prematurely that Joseph R. Biden Jr. had won the state. The Associated Press and other news outlets later called Wisconsin for Mr. Biden, though Mr. Trump called for a recount.

On Wednesday afternoon, Twitter also affixed context to tweets from Eric Trump, one of Mr. Trump’s sons, and Ms. McEnany when they preemptively claimed that Mr. Trump had won in Pennsylvania, even though the race there had not been called. The company also fact-checked other assertions from Mr. Trump claiming victory in several battleground states such as North Carolina and Georgia, where the race has not been called, and restricted his false statements about voter fraud from being shared.

“As votes are still being counted across the country, our teams continue to take enforcement action on tweets that prematurely declare victory or contain misleading information about the election broadly,” Twitter said.

Facebook took a more cautious approach. Mark Zuckerberg, its chief executive, has said he has no desire to fact-check the president or other political figures because he believes in free speech. Yet to prevent itself from being misused in the election, Facebook said it would couch premature claims of victory with a notification that the election had yet to be called for a candidate, if necessary.


Unlike Twitter, Facebook did not restrict users from sharing or commenting on Mr. Trump’s posts. But it was the first time Facebook had used such labels, part of the company’s plan to add context to posts about the election. A spokesman said the company “planned and prepared for these scenarios and built the essential systems and tools.”

YouTube, which is not used regularly by Mr. Trump, faced fewer high-profile problems than Twitter and Facebook. All YouTube videos about election results included a label that said the election might not be over and linked to a Google page with results from The Associated Press.

But the site did encounter a problem early on Tuesday night when several YouTube channels, one with more than a million subscribers, said they were livestreaming election results. What the live streams actually showed was a graphic of a projection of an election outcome with Mr. Biden leading. They were also among the first results that appeared when users searched for election results.

After media reports pointed out the issue, YouTube removed the video streams, citing its policy prohibiting spam, deceptive practices and scams.

On Wednesday, One America News Network, a conservative cable news network with nearly a million subscribers on YouTube, also posted a video commentary to the site claiming that Mr. Trump had already won the election and that Democrats were “tossing Republican ballots, harvesting fake ballots and delaying results” to cause confusion. The video has been viewed more than 280,000 times.

Farshad Shadloo, a YouTube spokesman, said the video did not violate the company’s policy regarding misleading claims about voting. He said the video carried a label that the election results were not final. YouTube added that it had removed ads from the video because it did not allow creators to make money off content that undermined “confidence in elections with demonstrably false information.”

Alex Stamos, director of the Stanford Internet Observatory, said the tech companies still had a fight ahead against election misinformation, but were prepared for it.

“There will always be a long tail of disinformation, but it will become less impactful,” he said. “They are still working, for sure, and will try to maintain this staffing level and focus until the outcome is generally accepted.”

But Fadi Quran, campaign director at Avaaz, a progressive nonprofit that tracks misinformation, said Facebook, Twitter and YouTube needed to do more.

“Platforms need to quickly expand their efforts before the country is plunged into further chaos and confusion,” he said. “It is a democratic emergency.”


It’s the End of an Era for the Media, No Matter Who Wins the Election

There’s a media phenomenon the old-time blogger Mickey Kaus calls “overism”: articles in the week before the election whose premise is that even before the votes are counted, we know the winner — in this case, Joe Biden.

I plead guilty to writing a column with that tacit premise. I spent last week asking leading figures in media to indulge in the accursed practice of speculating about the consequences of an election that isn’t over yet. They all read the same polls as you do and think that President Trump will probably lose.

But many leaders in news and media have been holding their breath for the election — and planning everything from retirements to significant shifts in strategy for the months to come, whoever wins. President Trump, after all, succeeded in making the old media great again, in part through his obsession with it. His riveting show allowed much of the television news business, in particular, to put off reckoning with the technological shifts — toward mobile devices and on-demand consumption — that have changed all of our lives. But now, change is in the air across a news landscape that has revolved around the president.

And given the jittery pre-election timing, I’ll try to keep these items short so you can check Nate Silver’s Twitter feed in between reading them.

Before the 2016 election, Andrew Lack, then the head of NBC News, warned colleagues that MSNBC’s revenue would take a 30 percent hit if — when — Hillary Clinton was elected, two people familiar with the remark told me. (After the debacle in 2016, few in the media wanted to be quoted speculating about what happens after the election.)

Well, TV sure dodged that bullet! CNN’s chief, Jeff Zucker, later told his Los Angeles bureau that Mr. Trump had bought the declining business four more years, a person who was there recalled. (A spokesman for CNN said that Mr. Zucker would not have speculated on future ratings.) And it has been a profitable time for cable news, a record-breaking year for political books and, generally, a bonanza for the legacy media that live rent-free in the president’s head.

That may be ending. MSNBC and other outlets that thrived on resistance to Mr. Trump may see their audiences fade, said Ken Lerer, a veteran investor and adviser to old media and new, who also predicted that The New York Times would “cool off” as you, dear reader, find other things to do.

And the people who continue to pay attention to the news will stay online.

“The pandemic has advanced digital by four or five years and it will not go back to what it was,” Mr. Lerer said.

In corporate media, that means what Cesar Conde, the new chairman of the NBCUniversal News Group, has been calling an “omnichannel” strategy, as brands like MSNBC no longer see themselves primarily as television. For new outlets, it’s an opportunity to press their advantage of being native to this new world.

“Many media organizations have spent the past four years generally failing to adapt to a campaign, a president, a White House and an administration that is extremely online,” said Stacy-Marie Ishmael, the editorial director of the nonprofit Texas Tribune. “We are only, four years in, getting to grips with how to contend with rhetorical techniques, messaging and communications steeped in misinformation and propaganda.”


Others predicted a deeper cultural shift — from Stephen Colbert’s biting satire back to the sillier Jimmy Fallon, from politics back to entertainment, whenever the studios can get production running again. But some veterans of the business of politics doubt that news coverage can really calm down — or that consumers can look away.

“If Biden is elected, conservatives will be energized, not retreating,” said Eric Nelson, the editorial director of Broadside Books, HarperCollins’s conservative imprint. “Trump will keep tweeting, and new scandals from his presidency will keep unfolding for well into 2022. By the time that all chaos and nonsense runs out, Trump could be running again for 2024.”

You aren’t the only one just barely hanging on until Election Day. Most of the top leaders of many name-brand American news institutions will probably be gone soon, too. The executive editor of The Los Angeles Times, Norm Pearlstine, is looking to recruit a successor by the end of the year, he told me. Martin Baron, the executive editor of The Washington Post, just bought a house out of town and two Posties said they expected him to depart next year. He hasn’t given notice, The Post’s spokeswoman, Kristine Coratti Kelly, said. And the executive editor of The New York Times, Dean Baquet, is on track to retire by the time he turns 66 in 2022, two Times executives told me, dampening speculation that he might stay longer.

Over in big TV, Mr. Zucker, of CNN, has signaled that he’s frustrated with WarnerMedia, and broadcast television is overflowing with speculation about how long the network news chiefs will stay on, though no executives have suggested imminent departures. “Everyone is assuming there’s going to be turnover everywhere, and everyone is absolutely terrified about who is going to come in,” one television industry insider said.

This isn’t just the usual revolving door. Newsroom leaders face strong pulls in conflicting directions. Outlets all along the spectrum, from the staid BBC to the radical Intercept, have been moving to reassert final editorial control over their journalists. But newsroom employees — like a generation of workers across many industries — are arriving with heightened demands to be given more of a say in running their companies than in years past. New leaders may find opportunities to resolve some of the heated newsroom battles of the last year, or they may walk into firestorms.

Mr. Pearlstine, the only one talking openly of his departure, told me that the new “metrics for success might be different as well — issues such as inclusiveness, such as being anti-racist, such as really commanding some new platform, be it podcasts or video or newsletters, in addition to having journalistic credentials.”

And, he said, the old top-down newsroom management is a thing of the past. “Consent of the governed is something you have to take pretty seriously,” he said.

Wesley Lowery, a CBS News correspondent who has been a voice for more diverse and politically engaged journalism, said he had already seen signs of change.

“These big institutions very rarely come out and announce some big sweeping change — they say, ‘We’re not changing,’ and they change,” he said. “Even people who made a big deal about how the rebels were wrong are now conceding to the things we all wanted.”


Fox News, the right-wing cable channel, has been riding high as the quasi-official White House network, though it has always been at its strongest when it’s attacking Democrats — who seem poised to take power.

But the approaching election has executives around Lachlan Murdoch, Fox’s chief executive, preparing to battle on several fronts: with left-wing critics, with what senior executives fear could be regulatory retribution from Democrats and perhaps most of all from James Murdoch, Lachlan’s more liberal brother and critic, according to a person familiar with the company’s plans.

And Lachlan Murdoch ends the election cycle as he began it: with no real control of the network’s high-profile talent and an unusually low profile for a figure of his nominal political power. One data point: a surprised patron of the Midtown power lunch spot Estiatorio Milos in late October reported overhearing Mr. Murdoch politely spelling his name to a hostess who didn’t recognize him.

The battles over speech and censorship, the sociologist Zeynep Tufekci tweeted recently, are becoming “attention wars.” As recently as last week, senators were dragging in tech executives to complain about individual tweets, but the arguments are about to turn more consequential. The platforms are increasingly being pushed to disclose how content travels and why — not just what they leave up and what they take down.

“We’re in this brave new world of content moderation that’s outside the take-down/leave-up false binary,” said Evelyn Douek, an expert on the subject and a lecturer at Harvard Law School.

Substack, the paid-newsletter platform, can also be seen as a kind of Twitter Premium — a place you can pay for more content from your favorite journalists. And that synergy has caught the attention of some at Twitter itself, where the notion of acquiring the newsletter company has been discussed internally, a person familiar with the conversations said. (Executives at both companies declined to comment on the speculation.)

But it’s not clear whether Substack will continue to be the venue of choice for all of its stars. The journalist Glenn Greenwald wrote that he’d been exploring “the feasibility of securing financing for a new outlet” that would challenge what he sees as the “groupthink” of the left in the Trump era. And roiling anger in Silicon Valley over tough media coverage of companies and investments means there are deep pools of money for a new assault on big media.

“There’s going to be a surge of money after the election, especially from tech bros who think they can fix everything,” said one of the Substack writers who has drawn interest from tech investors.


Nothing good will come of reading political news, much less Twitter, between now and the election. Election week is usually a good time to hide out at the movies, but with theaters closed, you’ll have to find escape elsewhere. Two favorites: The Times’s brilliant Election Distractor on the web; and for your Kindle, Malka Older’s Centenal Cycle, a bit of high-concept political sci-fi that will prepare you for many of the coming tech and political battles.

On election night, however, come to Twitter for the jokes and stay for what is really one of the highlights of American democracy, such as it is: the reassuringly sophisticated, nerdy and nonpartisan vote-counting conversation that you can listen in on among the likes of Mr. Silver, Nate Cohn, Ariel Edwards-Levy and Brandon Finnigan.



Dan Bongino Has No Idea Why Facebook Loves Him

Just a few years ago, Dan Bongino was a B-list pundit working on the fringes of conservative media.

A former police officer and Secret Service agent, Mr. Bongino ran for Congress three times as a Republican. He lost all three races, then turned to punditry, where he had a bit more success. He appeared regularly with Alex Jones on Infowars, then got his own show on NRA TV, the National Rifle Association’s now-defunct online media arm. After the 2016 election, he became one of Fox News’s most prolific contributors — a pro-Trump attack dog who could be called on to defend the president and humiliate his enemies.

“My entire life right now is about owning the libs,” he said in 2018.

Today, Mr. Bongino is owning more libs than almost anyone in America. He has become one of the most popular right-wing commentators in the country, with millions of social media followers, a top-20 podcast, a line of best-selling books and a Facebook page that generates more monthly engagement than the pages of The New York Times, The Washington Post and CNN combined.

I first noticed Mr. Bongino’s profile rising a few months ago, when I started compiling a list of the top-performing Facebook posts every day. He appeared on the lists more often than not, and frequently trounced much better-known commentators like Sean Hannity and Ben Shapiro. (This month, for example, Mr. Bongino has gotten nearly twice as many Facebook interactions as Mr. Shapiro, despite having a much smaller following.)

Mr. Bongino, 45, has become a lightning rod on the left, both because of his growing audience and because he has been criticized for posting exaggerated and misleading information. He was one of the most aggressive promoters of “Spygate,” a dubious conspiracy theory about an illegal Democratic plot to spy on Mr. Trump’s 2016 campaign. He falsely claimed that masks are “largely ineffective” at preventing the spread of Covid-19, and has promoted unproven claims about voter fraud while stoking fears about a Democrat-led coup. (Mr. Bongino has claimed that he was merely repeating left-wing claims about post-election violence.)

Plenty of people have fact-checked Mr. Bongino. But nobody has figured out what, exactly, has lifted him above the legions of other pro-Trump influencers battling for attention online.

I called Mr. Bongino the other day, hoping to learn something about how he became Facebook’s biggest right-wing star. But he said he had no idea, either.

“I don’t know what it is,” he said. “The strategy didn’t change at all. I think people just like the message.”

A charming thing about social media — or a terrifying one, depending on your perspective — is that it often creates stars who have no idea how they got there. An Olympic gymnast or a world-class violinist follows a well-worn path, but every day, YouTubers, TikTok stars and Facebook pundits wake up to millions of new followers just because their personas happen to fit into the grooves of a platform’s algorithm.

Granted, Mr. Bongino’s shtick is not exactly new. His brand of right-wing pugilism is similar to what talk-radio hosts like Rush Limbaugh and Mark Levin have been doing for decades. He is good at turning daily culture-war skirmishes into hyperpartisan outrage-bait, with a cast of recurring left-wing villains and right-wing heroes who inevitably show up to dunk on them. (Typical headline: “CNN’s Fredo SCHOOLED On His Brother’s Coronavirus Policies.”) And he is skilled at a certain type of industrial-scale content production that is valuable on today’s internet, flooding social media with a torrent of original posts, remixed memes and videos and found footage.

“We’ll take some interesting clip of maybe the president or Kayleigh McEnany, and we’ll intermingle it with clips of my show, and it seems to work well for us,” he said. “Wherever my content is posted, we just get an incredible response.”

Along with his Facebook page, Mr. Bongino and a small team of writers keep up the Bongino Report, a news aggregator started last year to cater to conservatives who felt that the Drudge Report had become too liberal. He puts out podcast episodes and videos in which he rants against the “deep state,” decries the “Russia hoax,” and promotes spurious claims about Hunter Biden’s laptop — all fairly standard Fox News narratives, repackaged for a Facebook audience.

Mr. Bongino’s popularity began to spike during this spring’s Covid-19 lockdowns, as election season began to heat up and QAnon, the pro-Trump conspiracy movement, grew in popularity. (Mr. Bongino is not a QAnon promoter, but his content is popular with the movement’s supporters.)

Unlike Mr. Shapiro, whose website, The Daily Wire, was caught using a network of affiliated Facebook pages to generate traffic, Mr. Bongino swears he has “absolutely, categorically, 100 percent never” used any underhanded tactics to boost his Facebook presence.

“We don’t use bots,” he said. “We don’t even advertise much on Facebook.”

He credits his popularity, instead, to Facebook’s older and more conservative user base — and to the writers who work for him, who “have almost made a cottage industry” of understanding the platform’s algorithms, he said.

Like Mr. Trump, Mr. Bongino is a frequent critic of Facebook and other Silicon Valley tech companies, which he believes are censoring conservatives. His own posts have been flagged several times by Facebook’s third-party fact-checkers, and he said it was only a matter of time before the social network cracked down on him more aggressively.

“I’m anticipating being banned from Facebook,” he said. “They’ll ban me, or use some excuse to throttle my page. It’s going to have nothing to do with facts. It’s going to be ideological.”

It’s hard to square Mr. Bongino’s concerns about right-wing censorship with the incredible performance of his page. Still, he is making backup plans. He has invested in Parler and Rumble, two start-ups building “free speech alternatives” to Twitter and YouTube, respectively, and has begun posting his content there as well as on the larger networks.

Mr. Bongino, who was recently found to have lymphoma, allowed that Facebook had been a “pretty good business partner,” despite his disagreements with the company’s fact-checkers. And he maintained that he had no secret sauce — no growth-hacking strategy, no shortcuts, no networks of unlabeled pages funneling clicks to his posts. Mostly, he seems to be succeeding by catering to a large and hyper-engaged audience of Facebook conservatives, while being slightly more cautious than other right-wing pundits not to run afoul of Facebook’s rules. He said he didn’t even take advantage of Facebook’s analytics tools, which allow creators to get a fine-tuned sense of what their audience wants to see.

“If I told you I spent 10 minutes on analytics over the past year, I’d be lying,” he said. “I have no idea who’s watching, I just know it’s a whole lot of whos.”



Big Tech Continues Its Surge Ahead of the Rest of the Economy

While the rest of the U.S. economy languished earlier this year, the tech industry’s biggest companies seemed immune to the downturn, surging as the country worked, learned and shopped from home.

On Thursday, with the economy showing signs of improvement, Amazon, Apple, Alphabet and Facebook reported profits that highlighted how a recovery may provide another catalyst to help them generate a level of wealth that hasn’t been seen in a single industry in generations.

With an entrenched audience of users and the financial resources to press their leads in areas like cloud computing, e-commerce and digital advertising, the companies demonstrated again that economic malaise, upstart competitors and feisty antitrust regulators have had little impact on their bottom line.

Combined, the four companies reported a quarterly net profit of $38 billion.

Amazon reported record sales, and an almost 200 percent rise in profits, as the pandemic accelerated the transition to online shopping. Despite a boycott of its advertising over the summer, Facebook had another blockbuster quarter. Alphabet’s record quarterly net profit was up 59 percent, as marketers plowed money into advertisements for Google search and YouTube. And Apple’s sales rose even though the pandemic forced it to push back the iPhone 12’s release to October, in the current quarter.

On Tuesday, Microsoft, Amazon’s closest competitor in cloud computing, also reported its most profitable quarter, growing 30 percent from a year earlier.

“The scene that’s playing out fundamentally is that these tech stalwarts are gaining more market share by the day,” said Dan Ives, managing director of equity research at Wedbush Securities. “It’s ‘A Tale of Two Cities’ for this group of tech companies and everyone else.”

The results were strong despite increasing antitrust scrutiny from regulators. Last week, the Justice Department filed a lawsuit accusing Google of cementing the dominance of its search engine through anticompetitive agreements with device makers and mobile carriers. Facebook faces a possible antitrust case from the Federal Trade Commission.

The companies’ advantages are becoming more pronounced in an economy starting to dig out from the coronavirus pandemic. On Thursday, the Commerce Department said U.S. economic output grew 7.4 percent last quarter, the fastest pace on record, but remained below where it was in the last pre-pandemic quarter.

That slow return to health is also providing momentum to companies that suffered early in the pandemic, like Twitter, which reported on Thursday that revenue rose 14 percent in the third quarter as advertisers started to return. Twitter’s stock dropped about 14 percent in after-hours trading on Thursday, a reaction that analysts attributed to slow user growth.

Big Tech’s third-quarter boom could look modest when compared with the final quarter of the year. For Apple, it’s when consumers buy newly released iPhones. And the year-end shopping peak means lots of customers turning to Amazon for gifts, while advertisers rely on Google and Facebook for digital ads during the holidays.

The pandemic-fueled surge in online shopping pushed Amazon to a record for both sales and profits in the latest quarter.

Sales were $96.1 billion, up 37 percent from a year earlier, and profits rose to $6.3 billion.

The quarter did not include the usual boost from Prime Day, Amazon’s yearly deal bonanza, which was delayed to October. And the profit increased during a building boom, with Amazon expanding its fulfillment infrastructure by 50 percent this year. The company added almost 250,000 employees in the quarter, surpassing a million workers for the first time.

The lucrative Amazon Web Services division grew 29 percent as companies continued their shift to cloud computing.

Amazon said sales could reach $121 billion in the fourth quarter because of the confluence of Prime Day, the holiday shopping season and the turn to online spending.

The delay in the iPhone 12’s release meant Apple would face a tough comparison with the same quarter last year, which included sales of the iPhone 11. As a result, iPhone sales dropped more than 20 percent in the quarter.


Yet Apple’s overall sales still rose 1 percent to $64.7 billion, showing the increasing strength of other parts of the company’s business.

Apple’s services segment, which includes revenues from the App Store and offerings like Apple Music, increased 16 percent to $14.5 billion. Sales rose 46 percent for iPads, 29 percent for Mac computers and 21 percent for wearables.

Profits fell 7 percent to $12.7 billion, partly because the company spent more on research and development.

“There are lots going on here, and everything is going incredibly well,” Luca Maestri, Apple’s finance chief, said in an interview.

Facebook’s revenue for the third quarter rose 22 percent from a year earlier, to $21.2 billion, while profits jumped 29 percent to $7.84 billion. The results surpassed analysts’ estimates of $19.8 billion in revenue and profits of $5.53 billion, according to data provided by FactSet.

Facebook had strong results despite a wide-ranging boycott by advertisers this summer over issues of hate and toxic speech on the site. Though the grass-roots campaign, Stop Hate for Profit, rallied many of the top advertisers on Facebook to reduce their spending, the overall effects were brief.

The company continued gaining users as well. More than 1.82 billion people used the Facebook app every day, up 12 percent from a year earlier, it said. More than 2.54 billion people now use one or more of Facebook’s family of apps — Instagram, WhatsApp, Messenger or Facebook — daily, up 15 percent from a year earlier.

After its first-ever decline in quarterly revenue in the second quarter, Alphabet rebounded with its highest-ever profit. The strength came from across Google, with search advertising revenue growing 6 percent and YouTube ad spending rising 32 percent. Google’s cloud computing business grew 45 percent.

When advertisers slowed spending with Google this year as Covid-19 started to spread, Alphabet’s business took a significant hit. But as the economy has improved and businesses found their footing, advertisers have returned.

Alphabet posted a net profit of $11.25 billion in the third quarter as revenue rose 14 percent to $46.1 billion. Ruth Porat, Alphabet’s chief financial officer, said the improved profitability reflected efforts to cut costs during the economic downturn, including a hiring slowdown.


F.T.C. Decision on Pursuing Facebook Antitrust Case Is Said to Be Near

WASHINGTON — The Federal Trade Commission is moving closer to a decision about filing an antitrust lawsuit against Facebook for its market power in social networking, according to two people with knowledge of the agency’s talks.

The five members of the F.T.C. met on Thursday to discuss its investigation into Facebook and whether the company had bought smaller rivals to maintain a monopoly, the people said. They said three documents about Facebook had been prepared by the agency and circulated among its leaders: One addresses the company’s potential antitrust violations, another analyzes its economics, and a third assesses the risks of litigation.

No decision on a case has been made, the people said. The commissioners must vote before any case is pursued.

Facebook and the F.T.C. declined to comment. The Washington Post reported earlier that the commission met about the Facebook investigation on Thursday.

Lawmakers and policymakers in Washington have ramped up antitrust actions against the largest technology companies, often in a bipartisan effort. On Tuesday, the Justice Department sued Google, accusing it of illegally maintaining its monopoly power in search and search advertising — the first such government action against a tech company in two decades. Two weeks ago, the House Judiciary Committee recommended taking action to break up the big tech platforms, including Facebook, Amazon, Apple and Google.

The actions reflect growing frustration toward the companies, which total around $5 trillion in value and have transformed commerce, speech, media and advertising globally. That power has drawn the scrutiny of conservatives like President Trump and liberals like Senator Elizabeth Warren of Massachusetts.

The U.S. investigations began last year when the Justice Department started examining Google and other tech companies. Joseph Simons, the chairman of the F.T.C. and a Trump appointee, also opened an investigation into Facebook in June 2019. Around the same time, four dozen state attorneys general began a parallel investigation into the social network.

Facebook has tangled with the F.T.C. before, but mainly over privacy issues. The company reached a privacy settlement in 2011 with the agency. In 2018, the F.T.C. opened an investigation into Facebook for violating that settlement, prompted by a report from The New York Times and The Observer of London on how the company allowed Cambridge Analytica, a British consulting firm that worked for the Trump campaign, to harvest the personal information of its users. As a result, Facebook last year agreed to a record $5 billion settlement with the F.T.C. on data privacy violations.

The antitrust investigation by the F.T.C. has been far-reaching. The agency has collected thousands of internal documents from Facebook’s leaders. It has also interviewed people from the company’s rivals, such as Snap, which owns the Snapchat app, about Facebook’s dominant position in social networking and its business practices.

In August, Mark Zuckerberg, Facebook’s chief executive, answered questions under oath as part of the inquiry.

The company has denied violations of antitrust laws. It points to competition in online social networks, including the fast rise of the Chinese-owned viral video app TikTok, as proof that it does not have a lock on the market.

But with nearly three billion users across its apps and a market value of $792 billion, Facebook is unrivaled in size among social networking apps. Part of its dominance has been due to acquisitions of smaller rivals. Facebook bought the photo-sharing app Instagram for $1 billion in 2012. It bought WhatsApp, the messaging app, for $19 billion in 2014. Both mergers were approved by the F.T.C.

The commission’s investigation has largely focused on Facebook’s mergers with companies like Instagram and WhatsApp, people with knowledge of the inquiry said. The deals remove competition from the market and have bolstered Facebook’s reach and clout, its critics have said.

In a July antitrust hearing with House lawmakers, Mr. Zuckerberg was confronted with emails showing that a Facebook executive had referred to Instagram during the acquisition process as a “competitive threat.” Mr. Zuckerberg said Instagram’s success was due to Facebook.

But his answer did not appear to satisfy House lawmakers.

In an antitrust report this month, staff of the House Judiciary Committee said Facebook’s power in social networking was so immense that the company “has tipped the market toward monopoly such that Facebook competes more vigorously among its own products — Facebook, Instagram, WhatsApp and Messenger — than with actual competitors.”