Twitter, Facebook and YouTube Survived Election Day. More Tests Loom.

OAKLAND, Calif. — For months, Twitter, Facebook and YouTube prepared to clamp down on misinformation on Election Day.

On Tuesday, most of their plans went off without a hitch. The social platforms added labels to misleading posts by President Trump and notified their users that there was no immediate outcome to the presidential race. On television, news anchors even cited fact-checks similar to those made by Twitter and Facebook.

Then came Wednesday. With ballots still being counted and no clear result in sight, the flow of misinformation shifted from seeding doubts about the vote to false claims of victory. Over the course of the day, Twitter rapidly labeled several tweets by Mr. Trump as misleading about the result of his race, and did the same to tweets from others in his circle, such as Eric Trump and the White House press secretary, Kayleigh McEnany. Facebook and YouTube used their home pages to show people accurate information about the election.

The actions reinforced how even a smooth performance on Election Day did not mean that the social media companies could relax in their fight against a relentless flow of toxic content. In fact, the biggest tests for Facebook, Twitter and YouTube are still looming, misinformation researchers said, as false narratives may surge until a final result in the presidential race is certified.

“What we actually saw on Election Day from the companies is that they were extremely responsive and faster than they’ve ever been,” said Graham Brookie, the director of the Atlantic Council’s Digital Forensic Research Lab. But now, he said, misinformation was focused solely on the results and on undermining them.

“You have a hyperfocused audience and a moment in time where there is a huge amount of uncertainty, and bad actors can use that opportunistically,” he said.

Twitter said it was continuing to monitor for misinformation. Facebook said, “Our work isn’t done — we’ll stay vigilant and promote reliable information on Facebook as votes continue to be counted.” YouTube said it also was on alert for “election-related content” in the coming days.

The companies had all braced for a chaotic Election Day, working to avoid a repeat of 2016, when their platforms were misused by Russians to spread divisive disinformation. In recent months, the companies had rolled out numerous anti-misinformation measures, including suspending or banning political ads, slowing down the flow of information and highlighting accurate information and context.

On Tuesday, as Americans voted across the country, falsehoods about broken voting machines and biased poll workers popped up repeatedly. But the companies weren’t tested until Mr. Trump — with early results showing how tight the race was — posted on Twitter and Facebook just before 1 a.m. Eastern time to baselessly lash out at the electoral process.

“They are trying to STEAL the Election,” Mr. Trump posted on the sites, without specifying whom he meant.

Twitter moved quickly, hiding Mr. Trump’s inaccurate tweet behind a label that cautioned people that the claim was “disputed” and “might be misleading about an election or other civic process.” Twitter, which had started labeling Mr. Trump’s tweets for the first time in May, also restricted users’ ability to like and share the post.

On Wednesday morning, Twitter added more labels to posts from Mr. Trump. In one, he tweeted that his early leads in Democratic states “started to magically disappear.” In another message, Mr. Trump said unnamed people were working to make his lead in the battleground state of Pennsylvania “disappear.”

Twitter also applied other labels to posts that falsely asserted victory. One was added to a post by Ben Wikler, head of the Democratic Party of Wisconsin, in which he asserted prematurely that Joseph R. Biden Jr. had won the state. The Associated Press and other news outlets later called Wisconsin for Mr. Biden, though Mr. Trump called for a recount.

On Wednesday afternoon, Twitter also affixed context to tweets from Eric Trump, one of Mr. Trump’s sons, and Ms. McEnany when they preemptively claimed that Mr. Trump had won in Pennsylvania, even though the race there had not been called. The company also fact-checked other assertions from Mr. Trump claiming victory in several battleground states such as North Carolina and Georgia, where the race has not been called, and restricted his false statements about voter fraud from being shared.

“As votes are still being counted across the country, our teams continue to take enforcement action on tweets that prematurely declare victory or contain misleading information about the election broadly,” Twitter said.

Facebook took a more cautious approach. Mark Zuckerberg, its chief executive, has said he has no desire to fact-check the president or other political figures because he believes in free speech. Yet to prevent its platform from being misused in the election, Facebook said it would, if necessary, label premature claims of victory with a notification that the election had yet to be called for a candidate.

Unlike Twitter, Facebook did not restrict users from sharing or commenting on Mr. Trump’s posts. But it was the first time Facebook had used such labels, part of the company’s plan to add context to posts about the election. A spokesman said the company “planned and prepared for these scenarios and built the essential systems and tools.”

YouTube, which is not used regularly by Mr. Trump, faced fewer high-profile problems than Twitter and Facebook. All YouTube videos about election results included a label that said the election might not be over and linked to a Google page with results from The Associated Press.

But the site encountered a problem early on Tuesday night when several YouTube channels, one with more than a million subscribers, claimed to be livestreaming election results. The streams actually showed only a graphic projecting an election outcome with Mr. Biden leading. They were also among the first results that appeared when users searched for election results.

After media reports pointed out the issue, YouTube removed the video streams, citing its policy prohibiting spam, deceptive practices and scams.

On Wednesday, One America News Network, a conservative cable news network with nearly a million subscribers on YouTube, also posted a video commentary to the site claiming that Mr. Trump had already won the election and that Democrats were “tossing Republican ballots, harvesting fake ballots and delaying results” to cause confusion. The video has been viewed more than 280,000 times.

Farshad Shadloo, a YouTube spokesman, said the video did not violate the company’s policy regarding misleading claims about voting. He said the video carried a label that the election results were not final. YouTube added that it had removed ads from the video because it did not allow creators to make money off content that undermined “confidence in elections with demonstrably false information.”

Alex Stamos, director of the Stanford Internet Observatory, said the tech companies still had a fight ahead against election misinformation, but were prepared for it.

“There will always be a long tail of disinformation, but it will become less impactful,” he said. “They are still working, for sure, and will try to maintain this staffing level and focus until the outcome is generally accepted.”

But Fadi Quran, campaign director at Avaaz, a progressive nonprofit that tracks misinformation, said Facebook, Twitter and YouTube needed to do more.

“Platforms need to quickly expand their efforts before the country is plunged into further chaos and confusion,” he said. “It is a democratic emergency.”

Google Antitrust Fight Thrusts Low-Key C.E.O. Into the Line of Fire

OAKLAND, Calif. — When Sundar Pichai succeeded Larry Page as the head of Google’s parent company in December, he was handed a bag of problems: Shareholders had sued the company, Alphabet, over big financial packages handed to executives accused of misconduct. An admired office culture was fraying. Most of all, antitrust regulators were circling.

On Tuesday, the Justice Department accused Google of being “a monopoly gatekeeper of the internet,” one that uses anticompetitive tactics to protect and strengthen its dominant hold over web search and search advertising.

Google, which has generated vast profits through a recession, a pandemic and earlier investigations by government regulators on five continents, now faces the first truly existential crisis in its 22-year history.

The company’s founders, Mr. Page and Sergey Brin, have left the defense to the soft-spoken Mr. Pichai, who has worked his way up the ranks over 16 years with a reputation for being a conscientious caretaker rather than an impassioned entrepreneur.

Mr. Pichai, a former product manager, may seem an unlikely candidate to lead his company’s fight with the federal government. But if the tech industry’s bumptious history with antitrust enforcement is any lesson, a caretaker who has reluctantly stepped into the spotlight might be preferable to a charismatic leader born to it.

Mr. Pichai, 48, is expected to make the case — as he has for some time — that the company is not a monopoly even though it has a 92 percent global market share of internet searches. Google is good for the country, so goes the corporate message, and has been a humble economic engine — not a predatory job killer.

“He has to come off as an individual who is trying to do the right thing not only for his company but broader society,” said Paul Vaaler, a business and law professor at the University of Minnesota. “If he comes off as evasive, petulant and a smart aleck, this is going to be a killer in front of the court and the court of public opinion.”

Google declined to make Mr. Pichai available for an interview. In an email to employees on Tuesday, he urged Google employees to stay focused on their work so that users will continue to use its products not because they have to but because they want to.

“Scrutiny is nothing new for Google, and we look forward to presenting our case,” Mr. Pichai wrote. “I’ve had Googlers ask me how they can help, and my answer is simple: Keep doing what you’re doing.”

Few executives have faced a challenge like this, and the most iconic figures in the technology industry have wilted under the glare of antitrust scrutiny.

Credit: Stephen Crowley/The New York Times

Bill Gates, who was chief executive of Microsoft in the last big technology antitrust case brought by the Justice Department two decades ago, came across as combative and evasive in depositions, reinforcing the view that the company was a win-at-all-costs bully. Mr. Gates said last year that the lawsuit had been such a “distraction” that he “screwed up” the transition to mobile phone software and ceded the market to Google.

Mr. Page dealt with impending antitrust scrutiny with detachment, spending his time on futuristic technology projects instead of huddling with lawyers. Even as the European Union handed down three fines against Google for anticompetitive practices, Mr. Page barely addressed the matter publicly.

On a conference call with reporters on Tuesday, officials at the Justice Department declined to say whether they had spoken to Mr. Page during the investigation.

In its complaint, the Justice Department, along with 11 states, said Google had foreclosed competition in the search market by striking deals with handset manufacturers, including Apple, and mobile carriers to block rivals from competing effectively.

“For the sake of American consumers, advertisers and all companies now reliant on the internet economy, the time has come to stop Google’s anticompetitive conduct and restore competition,” the complaint said.

Google said that the case was “deeply flawed” and that the Justice Department was relying on “dubious antitrust arguments.”

Google is also the target of an antitrust inquiry by state attorneys general looking into its advertising technology and web search. And Europe continues to investigate the company over its data collection even after the three fines since 2017, totaling nearly $10 billion.

At Mr. Pichai’s side are senior executives who are also inclined to strike an accommodating tone. He has surrounded himself with other serious, buttoned-up career Google managers who bring a lot of boring to the table.

The point person for handling the case is Kent Walker, Google’s chief legal officer and head of global affairs. Though Mr. Walker, who worked at the Justice Department as an assistant U.S. attorney and joined Google in 2006, oversees many of the company’s messiest issues, he rarely makes headlines — a testament, current and former colleagues said, to his lawyerly pragmatism.

Credit: Jim Wilson/The New York Times

Google has appointed Halimah DeLaine Prado as its new general counsel. A 14-year veteran of the company’s legal department, Ms. Prado was most recently a vice president overseeing the global team that advised Google on products including advertising, cloud computing, search, YouTube and hardware. While Ms. Prado doesn’t have a background in antitrust, she has been at Google since 2006 and is, by now, well versed in competition law.

The company is expected to rely heavily on its high-priced law firms to help manage the battle, including Wilson Sonsini Goodrich & Rosati, a top Silicon Valley firm, and Williams & Connolly, which has defended Google in other competition law cases.

Wilson Sonsini has represented Google from the company’s inception and helped it defend itself in a Federal Trade Commission investigation into its search business. In 2013, the agency chose not to bring charges.

Regardless of the legal argument for prosecuting Google as a monopoly, the case may shape the public perception of the company long after it has been resolved.

Until now, Google’s public posture has been a shrug. Mr. Pichai has said that the antitrust scrutiny is nothing new and that, if anything, the company welcomes the look into its business practices. Google has argued that it competes in rapidly changing markets, and that its dominance can evaporate quickly with the emergence of new rivals.

“Google operates in highly competitive and dynamic global markets, in which prices are free or falling and products are constantly improving,” Mr. Pichai said in his opening remarks to a House antitrust panel in July. “Google’s continued success is not guaranteed.”

Credit: Pool photo by Graeme Jennings

Mr. Pichai is familiar with the machinations of antitrust proceedings. In 2009, when he was a vice president of product management, he lobbied the European competition authorities to take action on Microsoft’s Internet Explorer web browser.

“We are confident that more competition in this space will mean greater innovation on the web and a better user experience for people everywhere,” Mr. Pichai wrote in a blog post at the time — sentiments that search rivals now voice about Google.

But shortly after he became Google’s chief executive in 2015, Mr. Pichai displayed his tendency toward pragmatism when he buried the hatchet with Microsoft. The two companies agreed to stop complaining to regulators about each other.

Early in his tenure running Google, Mr. Pichai was reluctant to press its case in Washington — a job that one of his predecessors, Eric Schmidt, had reveled in. Mr. Schmidt, a big donor in Democratic politics, was a frequent visitor to the White House during the Obama presidency and served on the President’s Council of Advisors on Science and Technology.

In 2018, Google declined to send Mr. Pichai to testify at a Senate Intelligence Committee hearing on Russian interference in the 2016 presidential election. Annoyed senators left an empty seat for the company’s representative next to executives from Facebook and Twitter. (Mr. Page was also invited to testify, but there was never any expectation from people within the company that he would.)

Since then, Mr. Pichai has made frequent trips to Washington, testified at other congressional hearings and held meetings with President Trump.

Credit: Tom Brenner for The New York Times

Microsoft’s long battle with the government has also influenced how Google plans to wage its antitrust fight. Many Google executives believe Microsoft was too combative with the Justice Department, bringing the company to a standstill.

For most of the last decade, even as Google has dealt with antitrust investigations in the United States and Europe, the company has continued to expand into new businesses and acquire companies, such as the fitness tracker maker Fitbit last year.

Now the bill for that growth may have come due. And like it or not, it has been left to Mr. Pichai. Mr. Page, who is a year younger than Mr. Pichai and who Forbes says is worth $65 billion, is pursuing other interests.

Mr. Pichai “hasn’t had to deal with anything of this magnitude,” said Michael Cusumano, a professor and deputy dean at the Massachusetts Institute of Technology’s Sloan School of Management. “He has to face the government. He has no choice.”

YouTube Cracks Down on QAnon Conspiracy Theory

YouTube on Thursday became the latest social media giant to take steps to stop QAnon, the sprawling pro-Trump conspiracy theory community whose online fantasies about a cabal of satanic pedophiles running the world have spilled over into offline violence.

The company announced in a blog post that it was updating its hate speech and harassment policies to prohibit “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” The new policy will prohibit content promoting QAnon, as well as related conspiracy theories such as Pizzagate, which falsely claims that top Democrats and Hollywood elites are running an underground sex-trafficking ring from the basement of a Washington pizza restaurant.

Other social networks have also taken steps to curb the spread of QAnon, which has been linked to incidents of violence and vandalism. Last week, Facebook hardened its rules related to QAnon content and compared it to a “militarized social movement” that was becoming increasingly violent. This week, several smaller platforms, including Pinterest, Etsy and Triller, also announced new restrictions on QAnon content.

Under YouTube’s new policy, which goes into effect today, “content that threatens or harasses someone by suggesting they are complicit” in a harmful theory like QAnon or Pizzagate will be banned. News coverage of these theories and videos that discuss the theories without targeting individuals or groups may still be allowed.

The QAnon movement began in 2017, when an anonymous poster under the handle “Q Clearance Patriot,” or “Q,” began posting cryptic messages on 4chan, the notoriously toxic message board, claiming to possess classified information about a secret battle between President Trump and a global cabal of pedophiles. QAnon believers — known as “bakers” — began discussing and decoding the messages in real time on platforms including Reddit and Twitter, connecting the dots on a modern rebranding of centuries-old anti-Semitic tropes that falsely accused prominent Democrats, including Hillary Clinton and the liberal financier George Soros, of pulling the strings of a global sex-trafficking conspiracy.

Few platforms played a bigger role in moving QAnon from the fringes to the mainstream than YouTube. In the movement’s early days, QAnon followers produced YouTube documentaries that offered an introductory crash course in the movement’s core beliefs. The videos were posted on Facebook and other platforms, and were often used to draw new recruits. Some were viewed millions of times.

QAnon followers also started YouTube talk shows to discuss new developments related to the theory. Some of these channels amassed large audiences and made their owners prominent voices within the movement.

“YouTube has a huge role in the Q mythology,” said Mike Rothschild, a conspiracy theory debunker who is writing a book about QAnon. “There are major figures in the Q world who make videos on a daily basis, getting hundreds of thousands of views and packaging their theories in slick clips that are a world away from the straight-to-camera rambles so prominent in conspiracy theory video making.”

YouTube has tried for years to curb the spread of misinformation and conspiracy theories on its platform, and tweak the recommendations algorithm that was sending millions of viewers to what it considered low-quality content. In 2019, the company began to demote what it called “borderline content” — videos that tested its rules, but didn’t quite break them outright — and reduce the visibility of those videos in search results and recommendations.

The company says that these changes have decreased by more than 70 percent the number of views borderline content gets from recommendations, although that figure cannot be independently verified. YouTube also says that among a set of pro-QAnon channels, the number of views coming from recommendations dropped by more than 80 percent following the 2019 policy change.

Social media platforms have been under scrutiny for their policy decisions in recent weeks, as Democrats accuse them of doing too little to stop the spread of right-wing misinformation, and Republicans, including President Trump, paint them as censorious menaces to free speech.

YouTube, which is owned by Google, has thus far stayed mostly out of the political fray despite the platform’s enormous popularity — users watch more than a billion hours of YouTube videos every day — and the surfeit of misinformation and conspiracy theories on the service. Its chief executive, Susan Wojcicki, has not been personally attacked by Mr. Trump or had to testify to Congress, unlike Jack Dorsey of Twitter and Mark Zuckerberg of Facebook.

Vanita Gupta, the chief executive of the Leadership Conference on Civil and Human Rights, a coalition of civil rights groups, praised YouTube’s move to crack down on QAnon content.

“We commend YouTube for banning this harmful and hateful content that targets people with conspiracy theories used to justify violence offline, particularly through efforts like QAnon,” Ms. Gupta said. “This online content can result in real-world violence, and fosters hate that harms entire communities.”

Mr. Rothschild, the QAnon researcher, predicted that QAnon believers who were kicked off YouTube would find ways to distribute their videos through smaller platforms. He also cautioned that the movement’s followers were known for trying to evade platform bans, and that YouTube would have to remain vigilant to keep them from restarting their channels and trying again.

“YouTube banning Q videos and suspending Q promoters is a good step,” he said, “but it won’t be the end of Q. Nothing has been so far.”

How to Deal With a Crisis of Misinformation

There’s a disease that has been spreading for years now. Like any resilient virus, it evolves to find new ways to attack us. It’s not in our bodies, but on the web.

It has different names: misinformation, disinformation or distortions. Whatever the label, it can be harmful, especially now that it is being produced through the lens of several emotionally charged events: the coronavirus pandemic, a presidential election and protests against law enforcement.

The swarm of bad information circulating on the web has been intense enough to overwhelm Alan Duke, the editor of Lead Stories, a fact-checking website. For years, he said, false news mostly consisted of phony web articles that revolved around silly themes, like myths about putting onions in your socks to cure a cold. But misinformation has now crept into much darker, sinister corners and taken on forms like the internet meme, which is often a screenshot overlaid with sensational text or manipulated with doctored images.

He cited one harmful example: memes attacking Breonna Taylor, the Black medical worker in Louisville, Ky., who was killed by the police when they entered her home in March. Misinformation spreaders generated memes suggesting that Ms. Taylor shot at police officers first, which was not true.

“The meme is probably the most dangerous,” Mr. Duke said. “In seven or 20 words, somebody can say something that’s not true, and people will believe it and share it. It takes two minutes to create.”

It’s impossible to quantify how much bad information is out there now because the spread of it online has been relentless. Katy Byron, who leads a media literacy program at the Poynter Institute, a journalism nonprofit, and who works with a group of teenagers who regularly track false information, said it was on the rise. Before the pandemic, the group would present a few examples of misinformation every few days. Now each student is reporting multiple examples a day.

“With the pandemic, people are increasingly online doomscrolling and looking for information,” Ms. Byron said. “It’s getting harder and harder to find it and feel confident you’re consuming facts.”

The misinformation, she said, is also creeping into videos. With modern editing tools, it has become too easy for people with little technical know-how and minimal equipment to produce videos that appear to have high production value. Often, real video clips are stripped of context and spliced together to tell a different story.

The rise of false news is bad news for all of us. Misinformation can be a detriment to our well-being in a time when people are desperately seeking information such as health guidelines to share with their loved ones about the coronavirus. It can also stoke anger and cause us to commit violence. Also important: It could mislead us about voting in a pandemic that has turned our world upside down.

How do we adapt to avoid being manipulated and spreading false information to the people we care about? Past methods of spotting untruthful news, like checking articles for typos and phony web addresses that resemble those of trusted publications, are now less relevant. We have to employ more sophisticated methods of consuming information, like doing our own fact-checking and choosing reliable news sources.

Here’s what we can do.

Get used to this keyboard shortcut: Ctrl+T (or Command+T on a Mac). That creates a new browser tab in Chrome and Firefox. You’re going to be using it a lot. The reason: It enables you to ask questions and hopefully get some answers with a quick web search.

It’s all part of an exercise that Ms. Byron calls lateral reading. While reading an article, Step 1 is to open a browser tab. Step 2 is to ask yourself these questions:

  • Who is behind the information?

  • What is the evidence?

  • What do other sources say?

From there, with that new browser tab open, you could start answering those questions. You could do a web search on the author of the content when possible. You could do another search to see what other publications are saying about the same topic. If the claim isn’t being repeated elsewhere, it may be false.

You could also open another browser tab to look at the evidence. With a meme, for example, you could do a reverse image search on the photo used in the meme. On Google.com, click Images and upload the photo, or paste the photo’s web address into the search bar. The results will show where else the image has appeared on the web, helping you verify whether the version you saw has been manipulated.
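How do reverse image search services recognize a photo even after it has been cropped or recolored? One common family of techniques is perceptual hashing, which reduces an image to a short fingerprint that changes little under small edits. The sketch below is purely illustrative, not how any particular service works: it computes a toy “average hash” over an image represented as a nested list of grayscale values, so no imaging library is needed (a real system would first decode and shrink the actual photo, and would use more robust fingerprints).

```python
# Toy average hash: each bit records whether a pixel is brighter than
# the image's mean brightness. Near-duplicate images produce nearly
# identical bit strings; unrelated images do not.

def average_hash(pixels):
    """Return a bit string: '1' where a pixel exceeds the mean brightness."""
    flat = [p for row in pixels for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p > mean else "0" for p in flat)

def hamming_distance(h1, h2):
    """Count differing bits; a small distance suggests the same image."""
    return sum(a != b for a, b in zip(h1, h2))

# A tiny 4x4 grayscale "image" (values 0-255), purely for illustration.
original = [
    [200, 210, 30, 25],
    [190, 205, 20, 35],
    [60, 70, 220, 230],
    [55, 65, 215, 225],
]
# A slightly brightened copy of the same picture.
tweaked = [[min(p + 10, 255) for p in row] for row in original]
# A completely different (inverted) picture.
other = [[255 - p for p in row] for row in original]

print(hamming_distance(average_hash(original), average_hash(tweaked)))  # small
print(hamming_distance(average_hash(original), average_hash(other)))    # large
```

Because the brightened copy shifts every pixel and the mean by the same amount, its hash is unchanged, while the inverted image flips every bit — which is why fingerprints like this survive minor edits but still separate unrelated images.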

With videos, it’s trickier. A browser plug-in called InVID can be installed on Firefox and Chrome. When watching a video, you can click on the tool, click on the Keyframes button and paste in a video link (a YouTube clip, for example) and click Submit. From there, the tool will pull up important frames of the video, and you can reverse image search on those frames to see if they are legitimate or fake.

Some of the tech steps above may not be for the faint of heart. But most important is the broader lesson: Take a moment to think.

“The No. 1 rule is to slow down, pause and ask yourself, ‘Am I sure enough about this that I should share it?’” said Peter Adams, a senior vice president of the News Literacy Project, a media education nonprofit. “If everybody did that, we’d see a dramatic reduction of misinformation online.”

While social media sites like Facebook and Twitter help us stay connected with the people we care about, there’s a downside: Even the people we trust may be unknowingly spreading false information, so we can be caught off guard. And with everything mashed together into a single social media feed, it gets tougher to distinguish good information from bad information, and fact from opinion.

What we can do is another exercise in mindfulness: Be deliberate about where you get your information, Mr. Adams said. Instead of relying solely on the information showing up in your social media feeds, choose a set of publications that you trust, like a newspaper, a magazine or a broadcast news program, and turn to those regularly.

Mainstream media is far from perfect, but it’s subjected to a standards process that is usually not seen in user-generated content, including memes.

“A lot of people fall into the trap of thinking no source of information is perfect,” Mr. Adams said. “That’s when people really start to feel lost and overwhelmed and open themselves up to sources they really should stay away from.”

The most frightening part about misinformation is when it transcends digital media and finds its way into the real world.

Mr. Duke of Lead Stories said he and his wife had recently witnessed protesters holding signs with the message “#SavetheChildren.” The signs alluded to a false rumor spread by supporters of the QAnon conspiracy about a child-trafficking network led by top Democrats and Hollywood elites. The pro-Trump conspiracy movement had effectively hijacked the child-trafficking issue, mixing facts with its own fictions to suit its narrative.

Conspiracy theories have led to the arrests of some QAnon believers in cases of serious crimes, including a murder in New York and a conspiracy to kidnap a child.

“QAnon has gone from misinformation online to being out on the street corner,” he said. “That’s why I think it’s dangerous.”

Riled Up: Misinformation Stokes Calls for Violence on Election Day

In a video posted to Facebook on Sept. 14, Dan Bongino, a popular right-wing commentator and radio host, declared that Democrats were planning a coup against President Trump on Election Day.

For just over 11 minutes, Mr. Bongino described how bipartisan election experts who had met in June to plan for what might happen after people voted were actually holding exercises for such a coup. To support his baseless claim, he twisted the group’s words to fit his meaning.

“I want to warn you that this stuff is intense,” Mr. Bongino said, speaking into the camera to his 3.6 million Facebook followers. “Really intense, and you need to be ready to digest it all.”

His video, which has been viewed 2.9 million times, provoked strong reactions. One commenter wrote that people should be prepared for when Democrats “cross the line” so they could “show them what true freedom is.” Another posted a meme of a Rottweiler about to pounce, with the caption, “Veterans be like … Say when Americans.”

The coup falsehood was just one piece of misinformation that has gone viral in right-wing circles ahead of Election Day on Nov. 3. In another unsubstantiated rumor that is circulating on Facebook and Twitter, a secret network of elites was planning to destroy the ballots of those who voted for President Trump. And in yet another fabrication, supporters of Mr. Trump said that an elite cabal planned to block them from entering polling locations on Election Day.

All of the rumors appeared to be having the same effect: riling up Mr. Trump’s restive base, just as the president has publicly stoked the idea of election chaos. In comment after comment about the falsehoods, respondents said the only way to stop violence from the left was to respond in kind with force.

“Liberals and their propaganda,” one commenter wrote. “Bring that nonsense to country folks who literally sit in wait for days to pull a trigger.”

The misinformation, which has been amplified by right-wing media such as the Fox News host Mark Levin and outlets like Breitbart and The Daily Wire, adds contentiousness to an already powder-keg campaign season. Mr. Trump has repeatedly declined to say whether he would accept a peaceful transfer of power if he lost to his Democratic challenger, Joseph R. Biden Jr., and has urged his supporters “to go into the polls and watch very carefully.”

The falsehoods on social media are building support for the idea of disrupting the election. Election officials have said they fear voter harassment and intimidation on Election Day.

“This is extremely concerning,” said Megan Squire, a computer science professor at Elon University in Elon, N.C., who tracks extremists online. Combined with Mr. Trump’s comments, the false rumors are “giving violent vigilantes an excuse” that acting out in real life would be “in defense of democracy,” she said.

Tim Murtaugh, a Trump campaign spokesman, said Mr. Trump would “accept the results of an election that is free, fair and without fraud” and added that the question of violence was “better put to Democrats.”

In a text message, Mr. Bongino said the idea of a Democratic coup was “not a rumor” and that he was busy “exposing LIBERAL violence.”

Distorted information about the election is also flowing in left-wing circles online, though to a lesser degree, according to a New York Times analysis. Such misinformation includes a viral falsehood that mailboxes were being blocked by unknown actors to effectively discourage people from voting.

Other popular leftist sites, like Liberal Blogger and The Other 98%, have also twisted facts to push a critical narrative about Republicans, according to PolitiFact, a fact-checking website. In one inflammatory claim last week, for instance, the left-wing Facebook page Occupy Democrats asserted that President Trump had directly inspired a plot by a right-wing group to kidnap Gov. Gretchen Whitmer of Michigan.

Social media companies appear increasingly alarmed by how their platforms may be manipulated to stoke election chaos. Facebook and Twitter took steps last week to clamp down on false information before and after the vote. Facebook banned groups and posts related to the pro-Trump conspiracy movement QAnon and said it would suspend political advertising postelection. Twitter said it was changing some basic features to slow the way information flows on its network.

On Friday, Twitter executives urged people “to recognize our collective responsibility to the electorate to guarantee a safe, fair and legitimate democratic process this November.”

Trey Grayson, a Republican former secretary of state of Kentucky and a member of the Transition Integrity Project, said the idea that the group was preparing a left-wing coup was “crazy.” He said the group had explored many election scenarios, including a victory by Mr. Trump.

Michael Anton, a former national security adviser to President Trump, also published an essay on Sept. 4 in the conservative publication The American Mind, claiming, “Democrats are laying the groundwork for revolution right in front of our eyes.”

His article was the tipping point for the coup claim. It was posted more than 500 times on Facebook and reached 4.9 million people, according to CrowdTangle, a Facebook-owned analytics tool. Right-wing news sites such as The Federalist and DJHJ Media ramped up coverage of the idea, as did Mr. Bongino.

Mr. Anton did not respond to a call for comment.

The lie also began metastasizing. In one version, right-wing commentators claimed, without proof, that Mr. Biden would not concede if he lost the election. They also said his supporters would riot.

“If a defeated Biden does not concede and his party’s rioters take to the streets in a coup attempt against President Trump, will the military be needed to stop them?” tweeted Mr. Levin, the Fox News host, on Sept. 18. His message was shared nearly 16,000 times.

After The Times contacted him, Mr. Levin published a note on Facebook saying his tweet had been a “sarcastic response to the Democrats.”

Bill Russo, a spokesman for the Biden campaign, said in a statement that Mr. Biden would accept how the people voted. “Donald Trump and Mike Pence are the ones who refuse to commit to a peaceful transfer of power,” he said.

On YouTube, dozens of videos pushing the false coup narrative have collectively gathered more than 1.2 million views since Sept. 7, according to a tally by The Times. One video was titled, “RED ALERT: Are the President’s Enemies Preparing a COUP?”

The risk of misinformation translating to real-world action is growing, said Mike Caulfield, a digital literacy expert at Washington State University Vancouver.

“What we’ve seen over the past four years is an increasing capability” from believers to turn these conspiracy narratives “into direct physical actions,” he said.

Ben Decker contributed research.