Posted on

YouTube Cracks Down on QAnon Conspiracy Theory

YouTube on Thursday became the latest social media giant to take steps to stop QAnon, the sprawling pro-Trump conspiracy theory community whose online fantasies about a cabal of satanic pedophiles running the world have spilled over into offline violence.

The company announced in a blog post that it was updating its hate speech and harassment policies to prohibit “content that targets an individual or group with conspiracy theories that have been used to justify real-world violence.” The new policy will prohibit content promoting QAnon, as well as related conspiracy theories such as Pizzagate, which falsely claims that top Democrats and Hollywood elites are running an underground sex-trafficking ring from the basement of a Washington pizza restaurant.

Other social networks have also taken steps to curb the spread of QAnon, which has been linked to incidents of violence and vandalism. Last week, Facebook hardened its rules related to QAnon content and compared it to a “militarized social movement” that was becoming increasingly violent. This week, several smaller platforms, including Pinterest, Etsy and Triller, also announced new restrictions on QAnon content.

Under YouTube’s new policy, which goes into effect today, “content that threatens or harasses someone by suggesting they are complicit” in a harmful theory like QAnon or Pizzagate will be banned. News coverage of these theories and videos that discuss the theories without targeting individuals or groups may still be allowed.

The QAnon movement began in 2017, when an anonymous poster using the handle “Q Clearance Patriot,” or “Q,” began posting cryptic messages on 4chan, the notoriously toxic message board, claiming to possess classified information about a secret battle between President Trump and a global cabal of pedophiles. QAnon believers — known as “bakers” — began discussing and decoding the messages in real time on platforms including Reddit and Twitter. In the process, they connected the dots on a modern rebranding of centuries-old anti-Semitic tropes, falsely accusing prominent Democrats, including Hillary Clinton and the liberal financier George Soros, of pulling the strings on a global sex-trafficking conspiracy.

Few platforms played a bigger role in moving QAnon from the fringes to the mainstream than YouTube. In the movement’s early days, QAnon followers produced YouTube documentaries that offered an introductory crash course in the movement’s core beliefs. The videos were posted on Facebook and other platforms, and were often used to draw new recruits. Some were viewed millions of times.

QAnon followers also started YouTube talk shows to discuss new developments related to the theory. Some of these channels amassed large audiences and made their owners prominent voices within the movement.

“YouTube has a huge role in the Q mythology,” said Mike Rothschild, a conspiracy theory debunker who is writing a book about QAnon. “There are major figures in the Q world who make videos on a daily basis, getting hundreds of thousands of views and packaging their theories in slick clips that are a world away from the straight-to-camera rambles so prominent in conspiracy theory video making.”

YouTube has tried for years to curb the spread of misinformation and conspiracy theories on its platform, and tweak the recommendations algorithm that was sending millions of viewers to what it considered low-quality content. In 2019, the company began to demote what it called “borderline content” — videos that tested its rules, but didn’t quite break them outright — and reduce the visibility of those videos in search results and recommendations.

The company says that these changes have decreased by more than 70 percent the number of views borderline content gets from recommendations, although that figure cannot be independently verified. YouTube also says that among a set of pro-QAnon channels, the number of views coming from recommendations dropped by more than 80 percent following the 2019 policy change.

Social media platforms have been under scrutiny for their policy decisions in recent weeks, as Democrats accuse them of doing too little to stop the spread of right-wing misinformation, and Republicans, including President Trump, paint them as censorious menaces to free speech.

YouTube, which is owned by Google, has thus far stayed mostly out of the political fray despite the platform’s enormous popularity — users watch more than a billion hours of YouTube videos every day — and the surfeit of misinformation and conspiracy theories on the service. Its chief executive, Susan Wojcicki, has not been personally attacked by Mr. Trump or had to testify to Congress, unlike Jack Dorsey of Twitter and Mark Zuckerberg of Facebook.

Vanita Gupta, the chief executive of the Leadership Conference on Civil and Human Rights, a coalition of civil rights groups, praised YouTube’s move to crack down on QAnon content.

“We commend YouTube for banning this harmful and hateful content that targets people with conspiracy theories used to justify violence offline, particularly through efforts like QAnon,” Ms. Gupta said. “This online content can result in real-world violence, and fosters hate that harms entire communities.”

Mr. Rothschild, the QAnon researcher, predicted that QAnon believers who were kicked off YouTube would find ways to distribute their videos through smaller platforms. He also cautioned that the movement’s followers were known for trying to evade platform bans, and that YouTube would have to remain vigilant to keep them from restarting their channels and trying again.

“YouTube banning Q videos and suspending Q promoters is a good step,” he said, “but it won’t be the end of Q. Nothing has been so far.”

How to Deal With a Crisis of Misinformation

There’s a disease that has been spreading for years now. Like any resilient virus, it evolves to find new ways to attack us. It’s not in our bodies, but on the web.

It has different names: misinformation, disinformation or distortions. Whatever the label, it can be harmful, especially now that so much of it revolves around several emotionally charged events: the coronavirus pandemic, a presidential election and protests against law enforcement.

The swarm of bad information circulating on the web has been intense enough to overwhelm Alan Duke, the editor of Lead Stories, a fact-checking website. For years, he said, false news mostly consisted of phony web articles that revolved around silly themes, like myths about putting onions in your socks to cure a cold. But misinformation has now crept into much darker, sinister corners and taken on forms like the internet meme, which is often a screenshot overlaid with sensational text or manipulated with doctored images.

He cited one harmful example: memes attacking Breonna Taylor, the Black medical worker in Louisville, Ky., who was killed by the police when they entered her home in March. Misinformation spreaders generated memes falsely suggesting that Ms. Taylor had shot at police officers first.

“The meme is probably the most dangerous,” Mr. Duke said. “In seven or 20 words, somebody can say something that’s not true, and people will believe it and share it. It takes two minutes to create.”

It’s impossible to quantify how much bad information is out there now because the spread of it online has been relentless. Katy Byron, who leads a media literacy program at the Poynter Institute, a journalism nonprofit, and who works with a group of teenagers who regularly track false information, said it was on the rise. Before the pandemic, the group would present a few examples of misinformation every few days. Now each student is reporting multiple examples a day.

“With the pandemic, people are increasingly online doomscrolling and looking for information,” Ms. Byron said. “It’s getting harder and harder to find it and feel confident you’re consuming facts.”

The misinformation, she said, is also creeping into videos. With modern editing tools, it has become too easy for people with little technical know-how and minimal equipment to produce videos that appear to have high production value. Often, real video clips are stripped of context and spliced together to tell a different story.

The rise of false news is bad news for all of us. Misinformation can be a detriment to our well-being at a time when people are desperately seeking information, such as health guidelines about the coronavirus, to share with their loved ones. It can also stoke anger and incite real-world violence. Also important: it could mislead us about voting in a pandemic that has turned our world upside down.

How do we adapt to avoid being manipulated and spreading false information to the people we care about? Past methods of spotting untruthful news, like checking articles for typos and phony web addresses that resemble those of trusted publications, are now less relevant. We have to employ more sophisticated methods of consuming information, like doing our own fact-checking and choosing reliable news sources.

Here’s what we can do.

Get used to this keyboard shortcut: Ctrl+T (or Command+T on a Mac). That creates a new browser tab in Chrome and Firefox. You’re going to be using it a lot. The reason: It enables you to ask questions and hopefully get some answers with a quick web search.

It’s all part of an exercise that Ms. Byron calls lateral reading. While reading an article, Step 1 is to open a browser tab. Step 2 is to ask yourself these questions:

  • Who is behind the information?

  • What is the evidence?

  • What do other sources say?

From there, with that new browser tab open, you could start answering those questions. You could do a web search on the author of the content when possible. You could do another search to see what other publications are saying about the same topic. If the claim isn’t being repeated elsewhere, it may be false.

You could also open another browser tab to look at the evidence. With a meme, for example, you could do a reverse image search on the photo that was used in the meme. On Google, click Images and upload the photo or paste the web address of the photo into the search bar. The results will show where else the image has appeared on the web, helping you verify whether the version you saw has been manipulated.
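The lookup can even be scripted. Below is a minimal sketch in Python that builds a reverse-image-search link from a photo’s web address. The `searchbyimage` endpoint mirrors what the Google Images page does in the browser, but it is an assumption here, not an official, documented API.

```python
from urllib.parse import urlencode

def reverse_image_search_url(photo_url: str) -> str:
    # Build a link that runs the photo through a reverse image search.
    # The "searchbyimage" endpoint is an assumption based on how the
    # public Google Images page behaves, not a documented API.
    return "https://www.google.com/searchbyimage?" + urlencode({"image_url": photo_url})

print(reverse_image_search_url("https://example.com/meme.jpg"))
```

Opening the printed link in a browser should show everywhere else the image has surfaced, which is the whole point of the exercise.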

With videos, it’s trickier. A browser plug-in called InVID can be installed on Firefox and Chrome. When watching a video, you can click on the tool, click on the Keyframes button and paste in a video link (a YouTube clip, for example) and click Submit. From there, the tool will pull up important frames of the video, and you can reverse image search on those frames to see if they are legitimate or fake.
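InVID’s keyframe trick can also be approximated by hand: grab a frame every few seconds and reverse-search each one. The sampling arithmetic is sketched below in Python; note that this fixed-interval helper is a simplification of what InVID does, since real keyframe detection looks for scene changes rather than regular intervals.

```python
def keyframe_indices(total_frames: int, fps: float, every_seconds: float = 5.0) -> list:
    # Pick evenly spaced frame indices, one per `every_seconds` of video.
    # A simplified stand-in for a keyframe tool like InVID, which instead
    # detects scene changes.
    step = max(1, round(fps * every_seconds))
    return list(range(0, total_frames, step))

# A hypothetical 30 fps clip, 20 seconds long (600 frames):
print(keyframe_indices(600, 30.0))  # [0, 150, 300, 450]
```

Each sampled frame can then be fed to a reverse image search to check whether the footage predates the event it supposedly shows.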

Some of the tech steps above may not be for the faint of heart. But most important is the broader lesson: Take a moment to think.

“The No. 1 rule is to slow down, pause and ask yourself, ‘Am I sure enough about this that I should share it?’” said Peter Adams, a senior vice president of the News Literacy Project, a media education nonprofit. “If everybody did that, we’d see a dramatic reduction of misinformation online.”

While social media sites like Facebook and Twitter help us stay connected with the people we care about, there’s a downside: Even the people we trust may be unknowingly spreading false information, so we can be caught off guard. And with everything mashed together into a single social media feed, it gets tougher to distinguish good information from bad information, and fact from opinion.

What we can do is another exercise in mindfulness: Be deliberate about where you get your information, Mr. Adams said. Instead of relying solely on the information showing up in your social media feeds, choose a set of publications that you trust, like a newspaper, a magazine or a broadcast news program, and turn to those regularly.

Mainstream media is far from perfect, but it’s subjected to a standards process that is usually not seen in user-generated content, including memes.

“A lot of people fall into the trap of thinking no source of information is perfect,” Mr. Adams said. “That’s when people really start to feel lost and overwhelmed and open themselves up to sources they really should stay away from.”

The most frightening part about misinformation is when it transcends digital media and finds its way into the real world.

Mr. Duke of Lead Stories said he and his wife had recently witnessed protesters holding signs with the message “#SavetheChildren.” The signs alluded to a false rumor spread by supporters of the QAnon conspiracy about a child-trafficking network led by top Democrats and Hollywood elites. The pro-Trump conspiracy movement had effectively hijacked the child-trafficking issue, mixing facts with its own fictions to suit its narrative.

Conspiracy theories have driven some QAnon believers to serious crimes, leading to arrests in cases including a murder in New York and a conspiracy to kidnap a child.

“QAnon has gone from misinformation online to being out on the street corner,” he said. “That’s why I think it’s dangerous.”

Riled Up: Misinformation Stokes Calls for Violence on Election Day

In a video posted to Facebook on Sept. 14, Dan Bongino, a popular right-wing commentator and radio host, declared that Democrats were planning a coup against President Trump on Election Day.

For just over 11 minutes, Mr. Bongino talked about how bipartisan election experts who had met in June to plan for what might happen after people vote were actually holding exercises for such a coup. To support his baseless claim, he twisted the group’s words to fit his meaning.

“I want to warn you that this stuff is intense,” Mr. Bongino said, speaking into the camera to his 3.6 million Facebook followers. “Really intense, and you need to be ready to digest it all.”

His video, which has been viewed 2.9 million times, provoked strong reactions. One commenter wrote that people should be prepared for when Democrats “cross the line” so they could “show them what true freedom is.” Another posted a meme of a Rottweiler about to pounce, with the caption, “Veterans be like … Say when Americans.”

The coup falsehood was just one piece of misinformation that has gone viral in right-wing circles ahead of Election Day on Nov. 3. In another unsubstantiated rumor that is circulating on Facebook and Twitter, a secret network of elites was planning to destroy the ballots of those who voted for President Trump. And in yet another fabrication, supporters of Mr. Trump said that an elite cabal planned to block them from entering polling locations on Election Day.

All of the rumors appeared to be having the same effect: riling up Mr. Trump’s restive base, just as the president has publicly stoked the idea of election chaos. In comment after comment about the falsehoods, respondents said the only way to stop violence from the left was to respond in kind with force.

“Liberals and their propaganda,” one commenter wrote. “Bring that nonsense to country folks who literally sit in wait for days to pull a trigger.”

The misinformation, which has been amplified by right-wing media such as the Fox News host Mark Levin and outlets like Breitbart and The Daily Wire, adds contentiousness to an already powder-keg campaign season. Mr. Trump has repeatedly declined to say whether he would accept a peaceful transfer of power if he lost to his Democratic challenger, Joseph R. Biden Jr., and has urged his supporters “to go into the polls and watch very carefully.”

The falsehoods on social media are building support for the idea of disrupting the election. Election officials have said they fear voter harassment and intimidation on Election Day.

“This is extremely concerning,” said Megan Squire, a computer science professor at Elon University in Elon, N.C., who tracks extremists online. Combined with Mr. Trump’s comments, the false rumors are “giving violent vigilantes an excuse” that acting out in real life would be “in defense of democracy,” she said.

Tim Murtaugh, a Trump campaign spokesman, said Mr. Trump would “accept the results of an election that is free, fair and without fraud” and added that the question of violence was “better put to Democrats.”

In a text message, Mr. Bongino said the idea of a Democratic coup was “not a rumor” and that he was busy “exposing LIBERAL violence.”

Distorted information about the election is also flowing in left-wing circles online, though to a lesser degree, according to a New York Times analysis. Such misinformation includes a viral falsehood that mailboxes were being blocked by unknown actors to effectively discourage people from voting.

Other popular leftist sites, like Liberal Blogger and The Other 98%, have also twisted facts to push a critical narrative about Republicans, according to PolitiFact, a fact-checking website. In one inflammatory claim last week, for instance, the left-wing Facebook page Occupy Democrats asserted that President Trump had directly inspired a plot by a right-wing group to kidnap Gov. Gretchen Whitmer of Michigan.

Social media companies appear increasingly alarmed by how their platforms may be manipulated to stoke election chaos. Facebook and Twitter took steps last week to clamp down on false information before and after the vote. Facebook banned groups and posts related to the pro-Trump conspiracy movement QAnon and said it would suspend political advertising postelection. Twitter said it was changing some basic features to slow the way information flows on its network.

On Friday, Twitter executives urged people “to recognize our collective responsibility to the electorate to guarantee a safe, fair and legitimate democratic process this November.”

Trey Grayson, a Republican former secretary of state of Kentucky and a member of the Transition Integrity Project, said the idea that the group was preparing a left-wing coup was “crazy.” He said the group had explored many election scenarios, including a victory by Mr. Trump.

Michael Anton, a former national security adviser to President Trump, also published an essay on Sept. 4 in the conservative publication The American Mind, claiming, “Democrats are laying the groundwork for revolution right in front of our eyes.”

His article was the tipping point for the coup claim. It was posted more than 500 times on Facebook and reached 4.9 million people, according to CrowdTangle, a Facebook-owned analytics tool. Right-wing news sites such as The Federalist and DJHJ Media ramped up coverage of the idea, as did Mr. Bongino.

Mr. Anton did not respond to a call for comment.

The lie also began metastasizing. In one version, right-wing commentators claimed, without proof, that Mr. Biden would not concede if he lost the election. They also said his supporters would riot.

“If a defeated Biden does not concede and his party’s rioters take to the streets in a coup attempt against President Trump, will the military be needed to stop them?” tweeted Mr. Levin, the Fox News host, on Sept. 18. His message was shared nearly 16,000 times.

After The Times contacted him, Mr. Levin published a note on Facebook saying his tweet had been a “sarcastic response to the Democrats.”

Bill Russo, a spokesman for the Biden campaign, said in a statement that Mr. Biden would accept how the people voted. “Donald Trump and Mike Pence are the ones who refuse to commit to a peaceful transfer of power,” he said.

On YouTube, dozens of videos pushing the false coup narrative have collectively gathered more than 1.2 million views since Sept. 7, according to a tally by The Times. One video was titled, “RED ALERT: Are the President’s Enemies Preparing a COUP?”

The risk of misinformation translating to real-world action is growing, said Mike Caulfield, a digital literacy expert at Washington State University Vancouver.

“What we’ve seen over the past four years is an increasing capability” from believers to turn these conspiracy narratives “into direct physical actions,” he said.

Ben Decker contributed research.

Facebook Amps Up Its Crackdown on QAnon

Facebook, facing criticism that it hasn’t done enough to curb a fast-growing, fringe conspiracy movement, said on Tuesday that it would remove any group, page or Instagram account that openly identified with QAnon.

The change drastically hardens earlier policies outlined by the social media company. In August, Facebook unveiled its first attempt to limit the spread of QAnon, by establishing policies that barred QAnon groups that called for violence.

But hundreds of other QAnon groups and pages continued to spread on the platform, and the effort was considered a disappointment in many circles, including among Facebook employees.

On Tuesday, Facebook acknowledged that its previous policies had not gone far enough in addressing the popularity of the far-right conspiracy movement.

“We’ve been vigilant in enforcing our policy and studying its impact on the platform but we’ve seen several issues that led to today’s update,” Facebook said in a public post.

Since Facebook’s initial ban, QAnon followers have found ways to evade the rules. The movement dates back to October 2017 but has experienced its largest increase in followers since the start of the pandemic.

At its core, QAnon is a sprawling movement that believes, falsely, that the world is run by a cabal of Satan-worshiping pedophiles who are plotting against President Trump. It has branched into a number of other conspiracies, including casting doubt on medical advice for dealing with the pandemic, like wearing masks.

On Facebook, QAnon has attracted new followers by adopting tactics such as renaming groups and toning down the messaging to make it seem less jarring. A campaign by QAnon to co-opt health and wellness groups as well as discussions about child safety drew thousands of new people into its conspiracies in recent months.

Researchers who study the group said that QAnon’s shifting tactics had initially helped it skirt Facebook’s new rules, but that the policies announced on Tuesday were likely to tighten the screws on the conspiracists.

“Facebook has been instrumental in the growth of QAnon. I’m surprised it has taken the company this long to take this type of action,” said Travis View, a host of “QAnon Anonymous,” a podcast that seeks to explain the movement.

Since QAnon has become a key source of misinformation on a number of topics, Mr. View said, the action announced by Facebook is likely to have a far-reaching impact in “slowing the spread of misinformation on Facebook and more generally across social media.”

Nearly 100 Facebook groups and pages, some with tens of thousands of followers, have already been affected by the changes, according to a survey conducted by The New York Times using CrowdTangle, a Facebook-owned analytics tool.

Facebook said that it had begun to enforce the changes on Tuesday, and that it would take a more proactive approach to finding and removing QAnon content, rather than relying on people to report content.

Facebook Tried to Limit QAnon. It Failed.

OAKLAND, Calif. — Last month, Facebook said it was cracking down on activity tied to QAnon, a vast conspiracy theory that falsely claims that a satanic cabal runs the world, as well as other potentially violent extremist movements.

Since then, a militia movement on Facebook that called for armed conflict on the streets of U.S. cities has gained thousands of new followers. A QAnon Facebook group has also added hundreds of new followers while questioning common-sense pandemic medical practices, like wearing a mask in public and staying at home while sick. And a campaign that claimed to raise awareness of human trafficking has steered hundreds of thousands of people to conspiracy theory groups and pages on the social network.

Perhaps the most jarring part? At times, Facebook’s own recommendation engine — the algorithm that surfaces content for people on the site — has pushed users toward the very groups that were discussing QAnon conspiracies, according to research conducted by The New York Times, despite assurances from the company that that would not happen.

None of this was supposed to take place under new Facebook rules targeting QAnon and other extremist movements. The Silicon Valley company’s inability to quash extremist content, despite frequent flags from concerned users, is now renewing questions about the limits of its policing and whether it will be locked in an endless fight with QAnon and other groups that see it as a key battleground in their online war.

The stakes are high ahead of the Nov. 3 election. QAnon groups, which have cast President Trump as the hero in their baseless conspiracy, have spread and amplified misinformation surrounding the election. Among other things, they have shared false rumors that widespread voter fraud is already taking place and have raised questions about the competency of the Postal Service with mail-in ballots.

“In allowing QAnon groups to get to this point and continue to grow, Facebook has created a huge problem for themselves and for society in a more general sense,” said Travis View, a host of QAnon Anonymous, a podcast that seeks to explain the movement.

The QAnon movement has proved extremely adept at evading detection on Facebook under the platform’s new restrictions. Some groups have simply changed their names or avoided key terms that would set off alarm bells. The changes were subtle, like changing “Q” to “Cue” or to a name including the number 17, reflecting that Q is the 17th letter of the alphabet. Militia groups have changed their names to phrases from the Bible, or to claims of being “God’s Army.”
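The “17” substitution is easy to verify: Q really does sit 17 positions into the alphabet, as a one-line check confirms.

```python
# Confirm the claim above: "Q" is the 17th letter of the alphabet,
# which is why evasive group names traded "Q" for the number 17.
position = ord("Q") - ord("A") + 1
print(position)  # 17
```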

Others simply tweaked what they wrote to make it more palatable to the average person. Facebook communities that had otherwise remained insulated from the conspiracy theory, like yoga groups or parenting circles, were suddenly filled with QAnon content disguised as health and wellness advice or concern about child trafficking.

A Facebook spokeswoman said the company was continuing to evaluate its best practices. “Our specialists are working with external experts on ways to disrupt activity designed to evade our enforcement,” said the spokeswoman.

Facebook and other social media companies began taking action against the extremist groups this summer, prompted by rapid growth in QAnon and real-world violence linked to the group and militia-style movements on social media.

Twitter moved first. On July 21, the company announced that it was removing thousands of QAnon accounts and blocking trends and key phrases related to the movement from appearing in its search and Trending Topics section. But many of the QAnon accounts on Twitter returned within weeks of the initial ban, according to researchers who study the platform.

In a statement on Thursday, Twitter said that impressions, or views, of QAnon content had dropped by 50 percent since it had rolled out its restrictions.

Then on Aug. 19, Facebook followed. The social network said it was removing 790 QAnon groups from its site and was introducing new rules to clamp down on movements that discuss “potential violence.” The effect would be to restrict groups, pages and accounts belonging to extremist groups, in the company’s most sweeping action against QAnon and other such groups that had used Facebook to call for violence.

About 100 QAnon groups on Facebook tracked by The Times in the month since the rules were instituted continued to grow at a combined pace of over 13,600 new followers a week, according to an analysis of data from CrowdTangle, a Facebook-owned analytics platform.

That was down from the period before the new restrictions, when the same groups added between 15,000 and 25,000 new members a week. Even so, it indicated that QAnon was still recruiting new followers.
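Taken at face value, those figures imply a meaningful but incomplete slowdown. A back-of-the-envelope calculation follows, with one assumption of ours: the midpoint of the reported 15,000-to-25,000 range stands in for the “before” pace, since the data gives only a range.

```python
# Compare weekly follower growth for the tracked QAnon groups
# before and after Facebook's new rules, per the figures above.
before = (15_000 + 25_000) / 2   # midpoint of the pre-restriction range (our assumption)
after = 13_600                   # reported pace after the rules took effect
drop_pct = (before - after) / before * 100
print(round(drop_pct))  # 32
```

A roughly one-third slowdown, in other words, while thousands of new followers still arrived every week.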

Members of those groups were also more active than before. Comments, likes and posts within the QAnon groups grew to over 600,000 a week after Facebook’s rules went into effect, according to CrowdTangle data. Previous weeks had seen an average of less than 530,000 interactions a week.

“The groups, including QAnon, feel incredibly passionate about their cause and will do whatever they can to attract new people to their conspiracy movement. Meanwhile, Facebook has nowhere near the same type of urgency or mandate to contain them,” Mr. View said. “Facebook is operating with constraints and these extremist movements are not.”

Researchers who study QAnon said the movement’s continued growth was partly related to Facebook’s recommendation engine, which pushes people to join groups and pages related to the conspiracy theory.

Marc-André Argentino, a Ph.D. candidate at Concordia University who is studying QAnon, said he had identified 51 Facebook groups that branded themselves as anti-child trafficking organizations, but which were actually predominantly sharing QAnon conspiracies. Many of the groups, which were formed at the start of 2020, spiked in growth in the weeks after Facebook and Twitter began enforcing new bans on QAnon.

The groups previously added dozens to hundreds of new members each week. Following the bans, they attracted tens of thousands of new members weekly, according to data published by Mr. Argentino.

Facebook said it was studying the groups but had not yet taken action on them.

The company is increasingly facing criticism, including from Hollywood celebrities and civic rights groups. On Wednesday, celebrities including Kim Kardashian West, Katy Perry and Mark Ruffalo said they were freezing their Instagram accounts for 24 hours to protest Facebook’s policies. (Instagram is owned by Facebook.)

The Anti-Defamation League also said it was pressing Facebook to take action on militia groups and other extremist organizations. “We have been warning Facebook safety teams literally for years about the problem of dangerous and potentially violent extremists using their products to organize and to recruit followers,” Jonathan Greenblatt, the chief executive of the A.D.L., said.

The A.D.L., which has been meeting with Facebook for months about its concerns, has publicly posted lists of hate groups and conspiracy organizations present on the social network. David L. Sifry, the vice president of A.D.L.’s Center for Technology and Society, said that the A.D.L. has had similar conversations about extremist content with other platforms like Twitter, Reddit, TikTok and YouTube, which have been more receptive.

“The response we get back is markedly different with Facebook,” he said. “There are people of good conscience at every single one of these platforms. The core difference is leadership.”

Sheera Frenkel reported from Oakland, Calif., and Tiffany Hsu from Hoboken, N.J. Davey Alba contributed reporting from New York and Ben Decker from Boston.

What Is QAnon, the Viral Pro-Trump Conspiracy Theory?

If you’re spending a lot of time online these days — and thanks to the pandemic, many of us are — you’ve probably heard of QAnon, the sprawling internet conspiracy theory that has taken hold among some of President Trump’s supporters.

But unless you’re very online, you likely still have questions about what exactly is going on.

QAnon was once a fringe phenomenon — the kind most people could safely ignore. But in recent months, it’s gone mainstream. Twitter, Facebook and other social networks have been flooded with QAnon-related false information about Covid-19, the Black Lives Matter protests and the 2020 election. QAnon supporters have also been trying to attach themselves to other activist causes, such as the anti-vaccine and anti-child-trafficking movements, in an effort to expand their ranks.

QAnon has also seeped into the offline world, with some believers charged with violent crimes, including one QAnon follower accused of murdering a mafia boss in New York last year and another who was arrested in April and accused of threatening to kill Joseph R. Biden Jr., the presumptive Democratic presidential nominee. The Federal Bureau of Investigation has warned that QAnon poses a potential domestic terror threat.

Last week, QAnon reached a new milestone when Marjorie Taylor Greene, an avowed QAnon supporter from Georgia, won a Republican primary in a heavily conservative district, setting her up for a near-certain election to Congress in November. After Ms. Greene’s win, Mr. Trump called her a “future Republican star.”

QAnon is an incredibly convoluted theory, and you could fill an entire book explaining its various tributaries and sub-theories. But here are some basic things you should know.

QAnon is the umbrella term for a sprawling set of internet conspiracy theories that allege, falsely, that the world is run by a cabal of Satan-worshiping pedophiles who are plotting against Mr. Trump while operating a global child sex-trafficking ring.

QAnon followers believe that this clique includes top Democrats including Hillary Clinton, Barack Obama and George Soros, as well as a number of entertainers and Hollywood celebrities like Oprah Winfrey, Tom Hanks, Ellen DeGeneres and religious figures including Pope Francis and the Dalai Lama. Many of them also believe that, in addition to molesting children, members of this group kill and eat their victims in order to extract a life-extending chemical from their blood.

According to QAnon lore, Mr. Trump was recruited by top military generals to run for president in 2016 in order to break up this criminal conspiracy, end its control of politics and the media, and bring its members to justice.

Is QAnon just one conspiracy theory?

Not by a long shot. Since it began, QAnon has incorporated elements of many other conspiracy theory communities, including claims about the assassination of John F. Kennedy, the existence of U.F.O.s, and the 9/11 “truther” movement.

QAnon Anonymous, a podcast about the QAnon movement, calls QAnon a “big tent conspiracy theory” because it is constantly evolving and adding new features and claims. But the existence of a global pedophile cabal is the core tenet of QAnon, and the one that most, if not all, of its followers believe.

In October 2017, a post appeared on 4chan, the notoriously toxic message board, from an anonymous account calling itself “Q Clearance Patriot.” This poster, who became known simply as “Q,” claimed to be a high-ranking intelligence officer with access to classified information about Mr. Trump’s war against the global cabal.

Q predicted that this war would soon culminate in “The Storm” — an appointed time when Mr. Trump would finally unmask the cabal, punish its members for their crimes and restore America to greatness.

Where did the idea of “The Storm” come from?

It’s a reference to a cryptic remark Mr. Trump made during an October 2017 photo op. Posing alongside military generals, Mr. Trump said, “You guys know what this represents? Maybe it’s the calm before the storm.”

QAnon believers pointed to this moment as proof that Mr. Trump was sending coded messages about his plans to break up the global cabal, with the help of the military.

Q’s identity is still unknown, although there have been hints and speculation about it for years. Some speculate that a single internet troll has been posting as Q the entire time; others say that multiple people are involved in posting as Q, or that Q’s identity has changed over time.

Making things more complicated is that Q’s online home base has changed several times. Q’s posts originally appeared on 4chan. Then they moved to 8chan, where they stayed until that site was taken offline last year after the El Paso mass shooting. They now live on 8kun, a site run by the former owner of 8chan. Each of these sites uses a system of identity verification known as a “tripcode” — essentially, a unique digital signature that proves that a series of anonymous posts were written by the same person or people.
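The tripcode idea is simple enough to sketch in a few lines of code. The snippet below is a minimal illustration of the concept only: the chan boards actually use an older DES-based Unix crypt function with a salt derived from the password, not SHA-256, and the function name here is my own.

```python
import hashlib

def tripcode(secret: str) -> str:
    # Derive a short, stable signature from a secret passphrase.
    # Anyone who knows the secret can reproduce the code, but the
    # code itself does not reveal the secret.
    digest = hashlib.sha256(secret.encode("utf-8")).hexdigest()
    return "!" + digest[:10]

# The same secret always produces the same tripcode, so a series of
# anonymous posts carrying it can be attributed to a single author.
assert tripcode("some passphrase") == tripcode("some passphrase")

# A different secret produces a different code.
assert tripcode("some passphrase") != tripcode("another passphrase")
```

Because the mapping runs one way only, readers can verify continuity of authorship across posts without ever learning who the author is, which is the property that allowed “Q” to move between 4chan, 8chan and 8kun while remaining anonymous.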

“Drops” are what QAnon followers call Q’s posts. There have been nearly 5,000 of them so far, and most take the form of a cryptic coded message.

Here’s an example of a Q drop from September 2018:

[LL] talking = TRUTH reveal TARMAC [BC]?

[LL] talking = TRUTH reveal COMEY HRC EMAIL CASE?

[LL] talking = TRUTH reveal HUSSEIN instructions re: HRC EMAIL CASE?

In this post, you can see coded references to “LL” (Loretta Lynch, President Obama’s former attorney general), “BC” (Bill Clinton) and “HRC” (Hillary Rodham Clinton), as well as “HUSSEIN” (President Obama); other lines of the same drop referred to John Brennan, the former director of the Central Intelligence Agency, the Foreign Intelligence Surveillance Act, and “POTUS” — President Trump.

Many QAnon followers use “Q Drop” apps that collect all of Q’s posts in one place and alert them every time a new post arrives. (One of these apps hit the top 10 paid apps in Apple’s App Store before it was pulled down for violating the company’s guidelines.) They then post these drops in Facebook groups, Discord chat rooms and Twitter threads, and begin discussing and debating what it all means.

Is QAnon the same thing as Pizzagate?

Yes and no. QAnon has been described as a “big-budget sequel” to Pizzagate, because it takes the original Pizzagate conspiracy theory — which alleged, falsely, that Mrs. Clinton and her cronies were operating a child sex-trafficking ring out of the basement of a Washington, D.C., pizza restaurant — and adds many more layers of narrative on top of it. But many people believe in both theories, and for many QAnon believers, Pizzagate represented a kind of conspiracy theory on-ramp.

One new element in QAnon is a number of clear and specific predictions about when and how “The Storm” would play out. For years, Q has predicted that mass arrests of cabal members would occur on certain days, that certain government reports would reveal the cabal’s misdeeds, and that Republicans would win numerous seats in the 2018 midterm elections.

None of those predictions came true. But most QAnon believers didn’t care. They simply found ways to reframe the narrative and ignore the discrepancies, and moved on.

How many people believe in QAnon?

It’s hard to say, because there’s no official membership directory, but the number is not small. Even if you count only the hard-core QAnon believers — excluding “QAnon-lite” adherents who might believe in a deep state plot against Mr. Trump, but not a cabal of child-eating Satanists — the number may be at least in the hundreds of thousands.

Some of the most popular QAnon groups on Facebook have more than 100,000 members apiece, and Twitter recently announced it was taking actions to limit the reach of more than 150,000 QAnon-associated accounts. A recent report by NBC News found that Facebook had conducted an internal study of QAnon’s presence on its platform, and it concluded that there were thousands of QAnon groups, with millions of members between them.

That number has probably grown during the pandemic, as people stuck indoors turn to the internet for entertainment and socializing and wind up being pulled into the QAnon community. A recent article in The Wall Street Journal found that membership in 10 large Facebook groups devoted to QAnon had grown by more than 600 percent since the start of lockdowns.

A common misconception is that QAnon is purely a political movement. But it functions, for people who believe in it, as both a social community and a source of entertainment.

Some people have compared QAnon to a massively multiplayer online game, because of the way it invites participants to co-create a kind of shared reality filled with recurring characters, shifting story lines and intricate puzzle-solving quests. QAnon has also been compared to a church, in that it provides its followers with a social support structure as well as an organizing narrative for their everyday lives.

Adrian Hon, a game designer who has written about QAnon’s similarity to alternate-reality games, says that believers “open a fascinating fantasy world of secret wars and cabals and Hillary Clinton controlling things, and it offers convenient explanations for things that feel inexplicable or wrong about the world.”

Even though Q’s posts appear on fringe message boards, the QAnon phenomenon owes much of its popularity to Twitter, Facebook and YouTube, which have amplified QAnon messages and recommended QAnon groups and pages to new people through their algorithms.

In addition, QAnon believers have used social media to harass, intimidate and threaten their perceived enemies, and to seed other types of misinformation that wind up influencing public debate. Several of the most popular conspiracy theories on the internet this year — such as “Plandemic,” a documentary containing false and dangerous claims about Covid-19, and a viral conspiracy theory that falsely claimed that Wayfair, the online furniture company, was trafficking children — have been amplified and popularized by QAnon followers.

Some of these networks have started trying to remove QAnon content from their platforms. Twitter recently banned thousands of QAnon accounts, saying they had engaged in coordinated harassment. Facebook is reportedly coming up with its own QAnon containment strategy. But these interventions may be too little, too late.

Why is QAnon more worrying than earlier conspiracy theories?

It’s true that much of QAnon’s subject matter is recycled from earlier conspiracy theories. But QAnon is fundamentally an internet-based movement that operates in a different way, and at a different scale, than anything we’ve seen before.

For starters, QAnon is deeply participatory, in a way that few other popular conspiracy theories have been. Followers congregate in chat rooms and Facebook groups to decode the latest Q posts, discuss their theories about the news of the day, and bond with their fellow believers. The Atlantic has called it “the birth of a new religion.”

There’s also the basic danger of what QAnon followers actually believe. It’s one thing to have a polarized political discourse with heated disagreements; it’s another to have a faction of Americans who think, with complete sincerity, that the leaders of the opposition party are kidnapping and cannibalizing innocent children.

Combine those violent, paranoid fantasies with the fact that QAnon followers have been charged with committing serious crimes in Q’s name, and it’s no wonder people are worried.

What does Mr. Trump have to do with QAnon?

Mr. Trump is the central and heroic figure in QAnon’s core narrative — the brave patriot who was chosen to save America from the global cabal. As a result, QAnon believers parse Mr. Trump’s words and actions closely, looking for hidden meanings. When Mr. Trump says the number 17, they take it as a sign that he is sending secret messages to them. (Q is the 17th letter of the alphabet.) When he wears a pink tie, they interpret it as a sign that he is freeing trafficked children. (Some hospitals use “code pink” as a shorthand for a child abduction in progress.)

Mr. Trump has never directly addressed QAnon, but he recently declined to denounce or disavow the movement when asked about his support for Ms. Greene, the QAnon-affiliated congressional candidate. And he has shared posts from QAnon followers dozens of times on his social media accounts.


Is QAnon involved in the #SaveTheChildren movement?

Yes. For months, QAnon followers have been hijacking #SaveTheChildren — which started out as a fund-raising campaign for a legitimate anti-child-trafficking organization — as a recruiting tactic.

What they’re doing, basically, is using false and exaggerated claims about child trafficking to attract the attention of a new audience — in this case, worried parents. Then, they attempt to steer the conversation to QAnon talking points — saying that the reason children are being trafficked, for example, is because the global cabal wants to harvest a supposedly life-extending chemical from their blood.

This particular tactic has been especially problematic for legitimate anti-trafficking groups, which have had to deal with clogged hotlines and rampant misinformation as QAnon has latched on to their issue.

Merely posting #SaveTheChildren doesn’t mean your friends are QAnon believers. They could have just stumbled on a post about child trafficking that resonated with them and decided to share it. But they, and you, should know that those posts are part of a concerted QAnon strategy.

Think QAnon Is on the Fringe? So Was the Tea Party

Democrats dismissed it as a fringe group of conspiracy-minded zealots. Moderate Republicans fretted over its potential to hurt their party’s image, while more conservative lawmakers carefully sought to harness its grass-roots energy. Sympathetic media outlets covered its rallies, portraying it as an emerging strain of populist politics — a protest movement born of frustration with a corrupt, unaccountable elite.

Then, to everyone’s surprise, its supporters started winning elections.

That is a description of the Tea Party movement, which emerged in 2009 from the right-wing fringes and proceeded to become a major, enduring force in American conservatism.

But it could just as easily be a description of QAnon, the pro-Trump conspiracy theory that has emerged as a possible inheritor to the Tea Party’s mantle as the most potent grass-roots force in right-wing politics.

This week, QAnon most likely got its first member of Congress: Marjorie Taylor Greene, a Republican from Georgia who won a primary runoff in a heavily Republican district on Tuesday. Ms. Greene has publicly supported QAnon, appearing on QAnon shows and espousing the movement’s unfounded belief that President Trump is on the verge of breaking up a shadowy cabal of Satan-worshipping pedophiles. Other QAnon-affiliated candidates have won primaries at the federal and state level, though few in districts as conservative as Ms. Greene’s.

QAnon, which draws its beliefs from the cryptic message board posts of an anonymous writer claiming to have access to high-level government intelligence, lacks the leadership structure and the dark-money connections of the early Tea Party. It also lacks realistic goals or anything resembling a coherent policy agenda. Its followers are internet vigilantes gripped by paranoid and violent revenge fantasies, not lower-my-taxes conservatives or opponents of the Affordable Care Act.

But following Ms. Greene’s primary win, some Washington insiders have begun to wonder if QAnon’s potential influence is being similarly underestimated. They worry that, just as the Tea Party gave cover to a racist “birther” movement that propelled conspiracy theories about President Barack Obama into the Republican mainstream, QAnon’s extreme views may prove difficult to contain.

“They’re delusional to dismiss it as a powerless fringe,” said Steve Schmidt, a longtime G.O.P. strategist and campaign veteran who has become a Trump critic. “The Republican Party is becoming the home to an amalgam of conspiracy theorists, fringe players, extremists and white nationalists that is out in the open in a startling way.”

To be clear: QAnon’s ideas are far more extreme than the Tea Party’s ever were. Tea Party supporters objected to Wall Street bailouts and the growing federal deficit; QAnon adherents believe that Hillary Clinton and George Soros are drinking the blood of innocent children. While Tea Party supporters generally sought to oust their political opponents at the ballot box, QAnon supporters cheer for top Democrats to be either imprisoned at Guantánamo Bay or rounded up and executed.

But there are more parallels than you’d think, especially when it comes to how the political establishments of their times reacted to each group’s rise.

When the Tea Party emerged in early 2009, many commentators mocked the idea that it could ever achieve political power, calling it a “display of hysteria” by “frothing right-wingers.” Michael R. Bloomberg, then the mayor of New York, characterized the Tea Party as a passing fad, comparing it to the burst of support for Ross Perot’s 1992 presidential campaign. Republican party leaders took it more seriously, but they, too, seemed to think that they could harness its energy without indulging its more extreme elements.

Then, in January 2010, Scott Brown, a little-known Republican lawmaker from Massachusetts, won a Senate seat in a shock upset over his Democratic opponent, Martha Coakley, partly because of support from the Tea Party. And it became clear to members of both parties that they had been wrong to underestimate the Tea Party’s potential.


Today, pundits tend to portray QAnon as an extreme but marginal movement — a kind of John Birch Society for the 4chan age. And some polling has suggested that the movement remains broadly unpopular.

But QAnon followers have left the dark corners of the internet and established a large and growing presence on mainstream social media platforms. Twitter recently announced it was removing or limiting the visibility of more than 150,000 QAnon-related accounts, and NBC News reported this week that a Facebook internal investigation into QAnon’s presence on its platform found thousands of active QAnon groups and pages, with millions of followers among them.

Even after Ms. Greene’s primary victory this week, few lawmakers have acknowledged QAnon directly. (One Republican lawmaker, Representative Adam Kinzinger of Illinois, called it “a fabrication” that has “no place in Congress” on Wednesday.) But its followers have routinely used social media to push extreme views — including opposition to mask-wearing, false fears about child exploitation, and the “Spygate” conspiracy theory — into conservative media. At least one Fox News commentator has spoken approvingly of the movement. And dozens of QAnon candidates are running as anti-establishment outsiders in Republican primaries this year, just as Tea Party candidates did in the 2010 midterm elections.

The similarities between QAnon and the Tea Party aren’t just historical. Some of the same activists are involved in both movements, and organizations like the Tea Party Patriots have provided fodder for QAnon’s social media campaigns, such as a recent viral video of doctors making false claims about Covid-19.

One notable difference is that while the Tea Party gained influence during a period when Republicans were out of power, QAnon is growing during the Trump administration, with the president’s tacit blessing. On Wednesday, Mr. Trump congratulated Ms. Greene on her primary win, calling her a “future Republican star.” (He made no mention of the video in which she called Mr. Trump’s presidency a “once-in-a-lifetime opportunity to take this global cabal of Satan-worshipping pedophiles out.”)


Vanessa Williamson, a senior fellow at the Brookings Institution and co-author of “The Tea Party and the Remaking of Republican Conservatism,” said that QAnon represented, in some ways, an extension of the Tea Party’s skepticism of mainstream authorities.

“The movement of conspiratorial thinking to the center of the Republican Party isn’t totally new,” Ms. Williamson said. “But the centrality of that conspiratorial thinking was something striking about the Tea Party, and it’s something even more striking about QAnon.”

One advantage QAnon has over earlier insurgent movements is improved technology. John Birch Society members had to resort to pamphleteering and newspaper ads, and the Tea Party — which kicked off with a CNBC anchor’s televised rant — relied heavily on the existing conservative media apparatus to spread its message.

But QAnon is native to the internet, and moves at the speed of social media. Since 2017, QAnon followers have built out an impressive media ecosystem encompassing Facebook groups, YouTube channels and Discord servers. These spaces serve both as sources of news and virtual water-coolers where followers socialize, trade new theories and memes, and strategize about growing their ranks.

The other big difference, of course, is who’s in the Oval Office. Mr. Trump has not directly addressed QAnon, but he has conspicuously avoided denouncing it, and has shared dozens of posts from believers on his social media accounts.

Geoffrey Kabaservice, director of political studies at the Niskanen Center, a libertarian think tank, said that while QAnon would likely not take over the Republican Party as thoroughly as the Tea Party did in 2010, it could continue growing if top Republicans were unwilling or unable to contain it.

“It won’t naturally be flushed out of the system,” he said. “The Republican Party would have to take active steps to flush it out of the system. And that likely won’t happen under President Donald Trump.”

Bill Kristol, the conservative commentator and critic of Mr. Trump, was more skeptical about QAnon’s influence on the Republican Party. He pointed out that there had always been extreme outliers in both parties in Congress, whose influence tended to be diluted by more moderate voices over time.

But that was in the pre-Trump era, he admitted. Who knew what QAnon might become, with a presidential stamp of approval?

“Trump’s embrace is what makes this different, and more worrisome,” Mr. Kristol said. “If Trump is the president, and he’s embracing this, are we so confident that it’s not the future?”

Google, Facebook and Others Broaden Group to Secure U.S. Election

SAN FRANCISCO — Facebook, Google and other major tech companies said on Wednesday that they had added new partners and met with government agencies in their efforts to secure the November election.

The group, which is seeking to prevent the kind of online meddling and foreign interference that sullied the 2016 presidential election, previously consisted of major tech firms including Twitter and Microsoft, in addition to Facebook and Google. Among the new participants is the Wikimedia Foundation.

The group met on Wednesday with representatives from agencies like the F.B.I., the Office of the Director of National Intelligence and the Department of Homeland Security to share insights about disinformation campaigns and emerging deceptive behavior across their services.

Discussions between the tech companies and government agencies have occurred periodically over the past four years. While some of the companies have made a practice of sharing leads about disinformation campaigns and other election threats, the efforts have been haphazard. The effort has broadened as the November election approaches, and the tech companies and agencies have tried to coordinate more frequently.

“In preparation for the upcoming election, we regularly meet to discuss trends with U.S. government agencies tasked with protecting the integrity of the election,” a spokesman for the group said in a statement. “For the past several years, we have worked closely to counter information operations across our platforms.”

The group emerged from meetings that began between the tech companies and government agencies last fall. The companies have since taken action to ward off threats in elections around the world. Facebook, for instance, has monitored election behavior in Brazil, Mexico, Germany and France. Last year, the social network said it was strengthening how it verified which groups and people placed political advertising on its site.

At Wednesday’s meeting, the group and agencies updated one another on the behavior and illicit activities that the companies were seeing on their platforms.

“We discussed preparations for the upcoming conventions and scenario planning related to election results,” the group’s spokesman said. “We will continue to stay vigilant on these issues and meet regularly ahead of the November election.”

In addition to the Wikimedia Foundation, the group has expanded to involve LinkedIn, Pinterest, Reddit and Verizon Media. The government participants also include the Cybersecurity and Infrastructure Security Agency and the Department of Justice’s National Security Division.

Several social media companies have reported an increase in disinformation efforts as the election approaches. Last month, Twitter removed thousands of accounts that promoted the QAnon conspiracy theory. This week, NBC News reported that millions of QAnon conspiracy theory adherents were hidden in private groups and pages throughout Facebook.

The efficacy of the coalition remains unclear. While the group will discuss active threats, it is still the responsibility of each company to mitigate election interference on its platform.

QAnon Followers Are Hijacking the #SaveTheChildren Movement

Recently, an acquaintance posted a photo on her Instagram story showing a map of the United States, filled with bright red dots.

“This is not a map of Covid,” the caption read. “It is a map of human trafficking.”

Under the photo was a hashtag: #SaveTheChildren.

A few days later, I saw the same hashtag trending on Twitter. This time, it was being posted by followers of QAnon, the sprawling pro-Trump conspiracy theory. These people were also disturbed about human trafficking, but with a dark twist: Many of them believed that President Trump was on the verge of exposing “Pizzagate” or “Pedogate,” their terms for a global conspiracy involving a ring of Satan-worshiping, child-molesting criminals led by prominent Democrats.

My acquaintance is not a QAnon believer. And she certainly doesn’t think, as some QAnon adherents do, that Hillary Clinton and her cronies are kidnapping and eating children (yes, eating them) in order to harvest a life-extending chemical from their blood.

But like many social media users in recent weeks, she had been drawn in by the latest QAnon outreach strategy.

QAnon first surfaced in 2017 with a series of anonymous posts on the internet forum 4chan claiming to reveal high-level government intelligence about crimes by top Democrats. It has since spawned one of the most disturbing and consequential conspiracy theory communities in modern history. Its followers have committed serious crimes, and its online vigilantes have made a sport of harassing and doxxing their perceived enemies. The F.B.I. has cited QAnon as a potential domestic terror threat, and social networks have begun trying to pull QAnon groups off their platforms. Dozens of QAnon-affiliated candidates are running for office this year. One of them, Marjorie Taylor Greene, won a primary runoff Tuesday for a House seat in Georgia, drawing a congratulatory tweet from Mr. Trump.

Like any movement, QAnon needs to win over new members. And its most recent growth strategy involves piggybacking on the anti-human-trafficking movement.

The idea, in a nutshell, is to create a groundswell of concern by flooding social media with posts about human trafficking, joining parenting Facebook groups and glomming on to hashtag campaigns like #SaveTheChildren, which began as a legitimate fund-raising campaign for the Save the Children charity. Then followers can shift the conversation to baseless theories about who they believe is doing the trafficking: a cabal of nefarious elites that includes Tom Hanks, Oprah Winfrey and Pope Francis.

Part of the strategy’s perverse brilliance is that child sex trafficking is a real, horrible thing, and some politically connected people, including the financier Jeffrey Epstein, have been credibly accused of exploiting underage girls. And speaking out against child exploitation, no matter your politics, is far from an objectionable stance.

“It’s probably one of the key things that’s attractive about QAnon,” said Marc-André Argentino, a doctoral student at Concordia University who studies QAnon’s social media presence. “Everyone agrees that child trafficking is very bad, and the argument QAnon makes is, ‘If you’re against us talking about this, you’re in favor of child trafficking.’”

Sometimes, QAnon followers spin factual information in a way that serves their aims. Last week, an Associated Press article about a $35 million Trump administration grant to organizations that house trafficking survivors became one of the most-shared stories on Facebook, after QAnon groups picked it up and cited it as evidence that President Trump’s secret crusade against elite pedophiles was underway.

Other times, the strategy involves latching on to conspiracy theories and inserting QAnon talking points. Weeks ago, influencers on TikTok and Instagram began speculating about baseless allegations that Wayfair, an online furniture site, was trafficking children under the guise of selling expensive cabinets. The conspiracy theory went viral, and QAnon believers began sprinkling in their own supposedly incriminating details. They claimed, falsely, that a Wayfair employee had once been photographed with Ghislaine Maxwell, who has been charged with recruiting underage girls for Mr. Epstein.

These allegations merged in the popular imagination, and soon unsuspecting people were sharing wild conspiracy theories that came straight from QAnon orthodoxy.

“With Wayfair, both accounts on the left and right were amplifying the content,” Mr. Argentino said. “A lot of the yoga moms and juice-cleanse-type circles were sharing it.”

The strategy of seeding QAnon talking points with different audiences appears to be working. In recent weeks, Facebook engagement on human-trafficking-related content has surged, according to an analysis of data from CrowdTangle, a Facebook-owned data platform. (Interactions on posts with the #SaveTheChildren hashtag, for example, have grown more than 500 percent since early July.)

Prominent “mommy bloggers” and Instagram fitness influencers have begun posting anti-trafficking memes to their millions of followers. Even the Trump campaign has begun sharing more anti-trafficking content to its millions of Facebook and Twitter followers.

The QAnon strategy of pushing some unobjectionable, often factual content about human trafficking in addition to wild conspiracy theories has blurred the lines between legitimate anti-trafficking activism and partisan conspiracy mongering. Recently, some activists have marched in cities around the country demanding an end to child exploitation. Among them were QAnon believers, toting signs with messages like “Hollywood Eats Babies.”

For established anti-trafficking groups, the surge of support from internet conspiracy theorists has been a mixed blessing. Some activists, such as Tim Ballard, the founder of the anti-trafficking group Operation Underground Railroad, see an opportunity to reach a new, hyper-engaged online audience.

“Some of these theories have allowed people to open their eyes,” Mr. Ballard said. “So now it’s our job to flood the space with real information so the facts can be shared.”

Others worry that QAnon will divert valuable resources from legitimate groups trying to stop trafficking. After the Wayfair incident, the Polaris Project, a nonprofit organization that runs the National Human Trafficking Hotline, issued a news release saying its hotline had been overwhelmed with false reports. It later published a blog post warning that “unsubstantiated claims and accusations about child sex trafficking can spin out of control and mislead well-meaning people into doing more harm than good.”


I spoke to a number of longtime anti-trafficking activists who were alarmed by QAnon’s recent incursion onto their turf. They had worked for years to expose facts about child trafficking, only to see them distorted and misused by partisan opportunists. And they worried that in addition to clogging hotlines, QAnon believers could undermine the movement’s bipartisan credibility.

Erin Williamson, the U.S. programs director for Love146, an anti-trafficking group, said that in the weeks after the Wayfair incident, the group’s social media traffic had spiked by 30 percent, and that new donations had come in. But it had also been forced to spend time debunking online rumors and myths.

“It’s great that we have an increase in donations,” Ms. Williamson said. “But we don’t want to exploit disinformation for fund-raising purposes.”

The truth about child sex trafficking, these experts told me, is much less salacious than QAnon would have you believe. Many victims are trafficked by relatives, teachers or other people they know. Trafficking usually doesn’t involve kidnapping or physically forcing minors into sex.

“This is not happening in some secret cabal. It’s happening in every single community,” said Lori Cohen, the executive director of ECPAT-USA, an anti-trafficking organization. “But it’s easier to focus on public figures than to think about the reality that trafficking is happening in our midst, among people we know, to children we know.”

Some anti-trafficking experts worried that social networks, in an attempt to clamp down on QAnon, might inadvertently hurt the legitimate organizations working to end trafficking. Recently, Facebook briefly disabled the #SaveTheChildren hashtag after it was flooded with pro-QAnon content. (A Facebook spokesman said: “We temporarily blocked the hashtag as it was surfacing low-quality content. The hashtag has since been restored, and we will continue to monitor for content that violates our community standards.”)

And TikTok has been blocking searches for QAnon-related hashtags. A TikTok spokeswoman said the company was “working to proactively remove misinformation that we find associated with that hashtag.”

Mostly, anti-trafficking activists are just incredulous that QAnon has made their cause its own.

“When I talk to my friends in the anti-trafficking movement, we’ll say, ‘Oh, it’s Pizzagate all over again,’” Ms. Williamson of Love146 said. “And this time, it’s even worse.”

Posted on

Misleading Hydroxychloroquine Video, Pushed by the Trumps, Spreads Online

In a video posted online Monday, a group of people calling themselves “America’s Frontline Doctors,” wearing white medical coats and standing against the backdrop of the Supreme Court in Washington, shared misleading claims about the virus, including that hydroxychloroquine was an effective coronavirus treatment and that masks did not slow the spread of the virus.

The video did not appear to be anything special. But within six hours, President Trump and his son Donald Trump Jr. had tweeted versions of it, and the right-wing news site Breitbart had shared it. It went viral, shared largely through Facebook groups dedicated to anti-vaccination movements and conspiracy theories such as QAnon, racking up tens of millions of views. Multiple versions of the video were uploaded to YouTube, and links were shared through Twitter.

Facebook, YouTube and Twitter worked feverishly to remove it, but by the time they had, the video had already become the latest example of coronavirus misinformation to spread widely before the platforms could react.

That was because the video had been designed specifically to appeal to internet conspiracists and conservatives eager to see the economy reopen, with a setting and characters to lend authenticity. It showed that even as social media companies have sped up response time to remove dangerous virus misinformation within hours of its posting, people have continued to find new ways around the platforms’ safeguards.

“Misinformation about a deadly virus has become political fodder, which was then spread by many individuals who are trusted by their constituencies,” said Lisa Kaplan, founder of Alethea Group, a start-up that helps fight disinformation. “If just one person listened to anyone spreading these falsehoods and they subsequently took an action that caused others to catch, spread or even die from the virus — that is one person too many.”

One of the speakers in the video, who identified herself as Dr. Stella Immanuel, said, “You don’t need masks” to prevent spread of the coronavirus. She also claimed to be treating hundreds of patients infected with the coronavirus with hydroxychloroquine, and asserted that it was an effective treatment. The claims have been repeatedly disputed by the medical establishment.

President Trump repeatedly promoted hydroxychloroquine, a malaria drug, in the early months of the crisis. In June, he said he was taking it himself. But that same month, the Food and Drug Administration revoked emergency authorization for the drug for Covid-19 patients and said it was “unlikely to be effective” and carried potential risks. The National Institutes of Health halted clinical trials of the drug.

In addition, studies have repeatedly shown that masks are effective in curbing the spread of the coronavirus.

The trajectory of Monday’s video mirrored that of “Plandemic,” a slickly produced 26-minute video that spread widely in May and falsely claimed that a shadowy cabal of elites was using the virus and a potential vaccine to profit and gain power. In just over a week, “Plandemic” was viewed more than eight million times on YouTube, Facebook, Twitter and Instagram before it was taken down.

But the video posted Monday had more views than “Plandemic” within hours of being posted online, even though it was removed much faster. At least one version of the video, viewed by The Times on Facebook, was watched over 16 million times.

Facebook, YouTube, and Twitter deleted several versions of the video on Monday night. All three companies said the video violated their policies on sharing misinformation related to the coronavirus.

On Tuesday morning, Twitter also took action against Donald Trump Jr. after he shared a link to the video. A spokesman for Twitter said the company had required Mr. Trump to delete the misleading tweet and would “limit some account functionality for 12 hours.” Twitter took a similar action against Kelli Ward, the Arizona Republican Party chairwoman, who also tweeted the video.

No action was taken against the president, who retweeted multiple clips of the same video to his 84.2 million followers Monday night. The original posts have since been removed.

When asked about the video on Tuesday, Mr. Trump continued to defend the doctors involved and the treatments they were backing.

“For some reason the internet wanted to take them down and took them off,” the president said. “I think they are very respected doctors. There was a woman who was spectacular in her statements about it, that she’s had tremendous success with it and they took her voice off. I don’t know why they took her off. Maybe they had a good reason, maybe they didn’t.”

Facebook and YouTube did not answer questions about multiple versions of the video that remained online on Tuesday afternoon. Twitter said it was “continuing to take action on new and existing tweets with the video.”

The members of the group behind Monday’s video say they are physicians treating patients infected with the coronavirus. But it was unclear where many of them practice medicine or how many patients they had actually seen. As early as May, the anti-Obamacare conservative group Tea Party Patriots Action reportedly worked with some of them to advocate loosening states’ restrictions on elective surgeries and nonemergency care. On July 15, the group registered a website called “America’s Frontline Doctors,” domain registration records show.

One of the first copies of the video that appeared on Monday was posted to the Tea Party Patriots’ YouTube channel, alongside other videos featuring the members of “America’s Frontline Doctors.”

The doctors have also been promoted by conservatives like Brent Bozell, founder of the Media Research Center, a nonprofit media organization.