What Is QAnon, the Viral Pro-Trump Conspiracy Theory?

If you’re spending a lot of time online these days — and thanks to the pandemic, many of us are — you’ve probably heard of QAnon, the sprawling internet conspiracy theory that has taken hold among some of President Trump’s supporters.

But unless you’re very online, you likely still have questions about what exactly is going on.

QAnon was once a fringe phenomenon — the kind most people could safely ignore. But in recent months, it’s gone mainstream. Twitter, Facebook and other social networks have been flooded with QAnon-related false information about Covid-19, the Black Lives Matter protests and the 2020 election. QAnon supporters have also been trying to attach themselves to other activist causes, such as the anti-vaccine and anti-child-trafficking movements, in an effort to expand their ranks.

QAnon has also seeped into the offline world, with some believers charged with violent crimes, including one QAnon follower accused of murdering a mafia boss in New York last year and another who was arrested in April and accused of threatening to kill Joseph R. Biden Jr., the presumptive Democratic presidential nominee. The Federal Bureau of Investigation has warned that QAnon poses a potential domestic terror threat.

Last week, QAnon reached a new milestone when Marjorie Taylor Greene, an avowed QAnon supporter from Georgia, won a Republican primary in a heavily conservative district, setting her up for a near-certain election to Congress in November. After Ms. Greene’s win, Mr. Trump called her a “future Republican star.”

QAnon is an incredibly convoluted theory, and you could fill an entire book explaining its various tributaries and sub-theories. But here are some basic things you should know.

What is QAnon?

QAnon is the umbrella term for a sprawling set of internet conspiracy theories that allege, falsely, that the world is run by a cabal of Satan-worshiping pedophiles who are plotting against Mr. Trump while operating a global child sex-trafficking ring.

QAnon followers believe that this clique includes top Democrats such as Hillary Clinton, Barack Obama and George Soros; entertainers and Hollywood celebrities like Oprah Winfrey, Tom Hanks and Ellen DeGeneres; and religious figures including Pope Francis and the Dalai Lama. Many of them also believe that, in addition to molesting children, members of this group kill and eat their victims in order to extract a life-extending chemical from their blood.

According to QAnon lore, Mr. Trump was recruited by top military generals to run for president in 2016 in order to break up this criminal conspiracy, end its control of politics and the media, and bring its members to justice.

Is that the whole theory?

Not by a long shot. Since it began, QAnon has incorporated elements of many other conspiracy theory communities, including claims about the assassination of John F. Kennedy, the existence of U.F.O.s, and the 9/11 “truther” movement.

QAnon Anonymous, a podcast about the QAnon movement, calls QAnon a “big tent conspiracy theory” because it is constantly evolving and adding new features and claims. But the existence of a global pedophile cabal is the core tenet of QAnon, and the one that most, if not all, of its followers believe.

Where did QAnon come from?

In October 2017, a post appeared on 4chan, the notoriously toxic message board, from an anonymous account calling itself “Q Clearance Patriot.” This poster, who became known simply as “Q,” claimed to be a high-ranking intelligence officer with access to classified information about Mr. Trump’s war against the global cabal.

Q predicted that this war would soon culminate in “The Storm” — an appointed time when Mr. Trump would finally unmask the cabal, punish its members for their crimes and restore America to greatness.

What is “The Storm”?

It’s a reference to a cryptic remark Mr. Trump made during an October 2017 photo op. Posing alongside military generals, Mr. Trump said, “You guys know what this represents? Maybe it’s the calm before the storm.”

QAnon believers pointed to this moment as proof that Mr. Trump was sending coded messages about his plans to break up the global cabal, with the help of the military.

Who is Q?

Q’s identity is still unknown, although there have been hints and speculation about it for years. Some speculate that a single internet troll has been posting as Q the entire time; others say that multiple people are involved in posting as Q, or that Q’s identity has changed over time.

Complicating matters further, Q’s online home base has changed several times. Q’s posts originally appeared on 4chan. Then they moved to 8chan, where they stayed until that site was taken offline last year after the El Paso mass shooting. They now live on 8kun, a site run by the former owner of 8chan. Each of these sites uses a system of identity verification known as a “tripcode” — essentially, a unique digital signature that proves that a series of anonymous posts were written by the same person or people.
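
Here is a rough illustration, in Python, of how a tripcode-style signature works. This is a simplified sketch of the concept only, not the actual algorithm imageboards use (which is based on the Unix crypt(3) function): a secret passphrase is hashed into a short code that stays constant across posts but cannot be forged without knowing the secret.

import hashlib

def tripcode(secret: str) -> str:
    """Simplified tripcode: hash a secret passphrase into a short,
    stable signature. Illustrative only; real imageboards use a
    crypt(3)-based scheme rather than SHA-256."""
    digest = hashlib.sha256(secret.encode("utf-8")).hexdigest()
    return "!" + digest[:10]  # displayed alongside the anonymous post

# The same secret always yields the same code, so readers can tell that
# two anonymous posts share an author, without learning who that is.
assert tripcode("my passphrase") == tripcode("my passphrase")
assert tripcode("another passphrase") != tripcode("my passphrase")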

What are “drops”?

“Drops” are what QAnon followers call Q’s posts. There have been nearly 5,000 of them so far, and most take the form of a cryptic coded message.

Here’s an example of a Q drop from September 2018:

PANIC IN DC

[LL] talking = TRUTH reveal TARMAC [BC]?

[LL] talking = TRUTH reveal COMEY HRC EMAIL CASE?

[LL] talking = TRUTH reveal HUSSEIN instructions re: HRC EMAIL CASE?

[LL] talking = TRUTH reveal BRENNAN NO NAME COORD TO FRAME POTUS?

FISA = START

FISA BRINGS DOWN THE HOUSE.

WHEN DO BIRDS SING?

Q

In this post, you can see coded references to “LL” (Loretta Lynch, President Obama’s former attorney general), “BC” (Bill Clinton), “HRC” (Hillary Rodham Clinton), and “HUSSEIN” (President Obama), along with references to John Brennan, the former director of the Central Intelligence Agency, the Foreign Intelligence Surveillance Act, and “POTUS” — President Trump.

Many QAnon followers use “Q Drop” apps that collect all of Q’s posts in one place and alert them every time a new post arrives. (One of these apps hit the top 10 paid apps in Apple’s App Store before it was pulled down for violating the company’s guidelines.) They then post these drops in Facebook groups, Discord chat rooms and Twitter threads, and begin discussing and debating what it all means.
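
The mechanics of such an app are straightforward. A minimal sketch in Python, assuming a hypothetical JSON feed of posts (the URL and field names below are invented for illustration): poll the feed on a timer, compare it against the post IDs already seen, and announce anything new.

import json
import time
import urllib.request

FEED_URL = "https://example.com/drops.json"  # hypothetical endpoint

def fetch_posts(url: str) -> list:
    """Download the current list of posts as JSON."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def watch(url: str = FEED_URL, interval_seconds: int = 300) -> None:
    """Poll the feed and announce any post ID not seen before."""
    seen = set()
    while True:
        for post in fetch_posts(url):
            if post["id"] not in seen:
                seen.add(post["id"])
                print(f"New post {post['id']}: {post['text'][:80]}")
        time.sleep(interval_seconds)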

Is QAnon the same as Pizzagate?

Yes and no. QAnon has been described as a “big-budget sequel” to Pizzagate, because it takes the original Pizzagate conspiracy theory — which alleged, falsely, that Mrs. Clinton and her cronies were operating a child sex-trafficking ring out of the basement of a Washington, D.C., pizza restaurant — and adds many more layers of narrative on top of it. But many people believe in both theories, and for many QAnon believers, Pizzagate represented a kind of conspiracy theory on-ramp.

One new element in QAnon is a number of clear and specific predictions about when and how “The Storm” would play out. For years, Q has predicted that mass arrests of cabal members would occur on certain days, that certain government reports would reveal the cabal’s misdeeds, and that Republicans would win numerous seats in the 2018 midterm elections.

None of those predictions came true. But most QAnon believers didn’t care. They simply found ways to reframe the narrative and ignore the discrepancies, and moved on.

How many people believe in QAnon?

It’s hard to say, because there’s no official membership directory, but the number is not small. Even if you count only the hard-core QAnon believers — excluding “QAnon-lite” adherents who might believe in a deep state plot against Mr. Trump, but not a cabal of child-eating Satanists — the number may be at least in the hundreds of thousands.

Some of the most popular QAnon groups on Facebook have more than 100,000 members apiece, and Twitter recently announced it was taking action to limit the reach of more than 150,000 QAnon-associated accounts. A recent report by NBC News found that an internal Facebook study of QAnon’s presence on its platform had identified thousands of QAnon groups, with millions of members among them.

That number has probably grown during the pandemic, as people stuck indoors turn to the internet for entertainment and socializing and wind up being pulled into the QAnon community. A recent article in The Wall Street Journal found that membership in 10 large Facebook groups devoted to QAnon had grown by more than 600 percent since the start of lockdowns.

Why do people believe in QAnon?

A common misconception is that QAnon is purely a political movement. But it functions, for people who believe in it, as both a social community and a source of entertainment.

Some people have compared QAnon to a massively multiplayer online game, because of the way it invites participants to co-create a kind of shared reality filled with recurring characters, shifting story lines and intricate puzzle-solving quests. QAnon has also been compared to a church, in that it provides its followers with a social support structure as well as an organizing narrative for their everyday lives.

Adrian Hon, a game designer who has written about QAnon’s similarity to alternate-reality games, says that believers “open a fascinating fantasy world of secret wars and cabals and Hillary Clinton controlling things, and it offers convenient explanations for things that feel inexplicable or wrong about the world.”

What is social media’s role?

Even though Q’s posts appear on fringe message boards, the QAnon phenomenon owes much of its popularity to Twitter, Facebook and YouTube, which have amplified QAnon messages and recommended QAnon groups and pages to new people through their algorithms.

In addition, QAnon believers have used social media to harass, intimidate and threaten their perceived enemies, and to seed other types of misinformation that wind up influencing public debate. Several of the most popular conspiracy theories on the internet this year — such as “Plandemic,” a documentary containing false and dangerous claims about Covid-19, and a viral conspiracy theory that falsely claimed that Wayfair, the online furniture company, was trafficking children — have been amplified and popularized by QAnon followers.

Some of these networks have started trying to remove QAnon content from their platforms. Twitter recently banned thousands of QAnon accounts, saying they had engaged in coordinated harassment. Facebook is reportedly coming up with its own QAnon containment strategy. But these interventions may be too little, too late.

Why is QAnon different from other conspiracy theories?

It’s true that much of QAnon’s subject matter is recycled from earlier conspiracy theories. But QAnon is fundamentally an internet-based movement that operates in a different way, and at a different scale, than anything we’ve seen before.

For starters, QAnon is deeply participatory, in a way that few other popular conspiracy theories have been. Followers congregate in chat rooms and Facebook groups to decode the latest Q posts, discuss their theories about the news of the day, and bond with their fellow believers. The Atlantic has called it “the birth of a new religion.”

There’s also the basic danger of what QAnon followers actually believe. It’s one thing to have a polarized political discourse with heated disagreements; it’s another to have a faction of Americans who think, with complete sincerity, that the leaders of the opposition party are kidnapping and cannibalizing innocent children.

Combine those violent, paranoid fantasies with the fact that QAnon followers have been charged with committing serious crimes in Q’s name, and it’s no wonder people are worried.

What does Mr. Trump have to do with QAnon?

Mr. Trump is the central and heroic figure in QAnon’s core narrative — the brave patriot who was chosen to save America from the global cabal. As a result, QAnon believers parse Mr. Trump’s words and actions closely, looking for hidden meanings. When Mr. Trump says the number 17, they take it as a sign that he is sending secret messages to them. (Q is the 17th letter of the alphabet.) When he wears a pink tie, they interpret it as a sign that he is freeing trafficked children. (Some hospitals use “code pink” as a shorthand for a child abduction in progress.)

Mr. Trump has never directly addressed QAnon, but he recently declined to denounce or disavow the movement when asked about his support for Ms. Greene, the QAnon-affiliated congressional candidate. And he has shared posts from QAnon followers dozens of times on his social media accounts.

Is QAnon connected to #SaveTheChildren?

Yes. For months, QAnon followers have been hijacking #SaveTheChildren — which started out as a fund-raising campaign for a legitimate anti-child-trafficking organization — as a recruiting tactic.

What they’re doing, basically, is using false and exaggerated claims about child trafficking to attract the attention of a new audience — in this case, worried parents. Then, they attempt to steer the conversation to QAnon talking points — saying that the reason children are being trafficked, for example, is because the global cabal wants to harvest a supposedly life-extending chemical from their blood.

This particular tactic has been especially problematic for legitimate anti-trafficking groups, which have had to deal with clogged hotlines and rampant misinformation as QAnon has latched on to their issue.

Merely posting #SaveTheChildren doesn’t mean your friends are QAnon believers. They could have just stumbled on a post about child trafficking that resonated with them and decided to share it. But they, and you, should know that those posts are part of a concerted QAnon strategy.

41 Cities, Many Sources: How False Antifa Rumors Spread Locally

In recent weeks, as demonstrations against racism spread across the country, residents in at least 41 U.S. cities and towns became alarmed by rumors that the loose collective of anti-fascist activists known as antifa was headed to their area, according to an analysis by The New York Times. In many cases, they contacted their local law enforcement for help.

In every case, the rumored threat never materialized.

President Trump has spread some unfounded rumors about antifa to a national audience — including his accusation, without evidence, that a 75-year-old Buffalo protester who was hospitalized after being knocked down by police could be “an antifa provocateur.”

But on the local level, the source of the false information has usually been more subtle, and it shows how difficult stopping misinformation online can be. The bad information often first appears in a Twitter or Facebook post or a YouTube video. It then gets shared in online spaces like local Facebook groups, the neighborhood social networking app Nextdoor and community texting networks, where it can slip under the radar of the tech companies and online fact checkers.

“The dynamic is tricky because many times these local groups don’t have much prior awareness of the body of conspiratorial content surrounding some of these topics,” said Renée DiResta, a disinformation researcher at the Stanford Internet Observatory. “The first thing they see is a trusted fellow community member giving them a warning.”

Here are four ways that antifa falsehoods spread in local communities.

On the last weekend in May, the police in Sioux Falls, S.D., decided to investigate whether busloads of antifa protesters were headed to town. The episode shows what a single tweet can set in motion.

They were responding to a rumor that was spreading quickly among local residents online and that was first posted to Twitter by the local Chamber of Commerce.

“We’re being told that buses are en route from Fargo for today’s march downtown…,” the group posted on Twitter. “Please bring in any furniture, signs, etc. that could be possibly thrown through windows.”

The tweet was later deleted, but not before the rumor spread verbatim on Facebook, where it was even translated into Spanish. On Facebook, screenshots of the tweet and other posts about the group’s message collected more than 4,600 likes and shares, according to CrowdTangle, a Facebook-owned tool that analyzes interactions across social media.

These included shares by the Facebook pages of three local news outlets with a combined reach of 36,238 followers, and two posts in Spanish-language local Facebook groups, which reached 2,611 followers.

Twitter said it had taken down “hundreds of groups” under its violent extremist group policy and “continues to enforce our policies against hateful conduct every day across the world.” Facebook said its fact-checking partners rate many false claims about the protests, including about antifa.

The rumor led dozens of people to reach out to the local police that Sunday, according to Sam Clemens, the public information officer at the Sioux Falls Police Department.

“But on the day of the protests, we didn’t have any evidence of any buses coming from out of town carrying people,” Mr. Clemens said. The majority of protesters were local residents, he said.

The Greater Sioux Falls Chamber of Commerce said it got the information from sources it knew and believed to be credible.

“We received information that led us to believe there was a cause for concern. As such, we wanted to encourage local business owners to take responsible, precautionary steps for their businesses,” said Jeff Griffin, the group’s president. “We removed the post when we realized it was contributing to a different message that we did not intend.”

A false rumor about antifa protesters in Yucaipa, Calif., a city about 70 miles from Los Angeles, started with a single viral YouTube video. Before long, it had reached a national audience.

A YouTube video posted on June 2 featured scenes of masked men holding guns, purportedly residents of the city preparing for “potential antifa looting ahead of a planned BLM protest.” It has collected 17,200 views in the days since. Facebook posts of photos claiming to show the Yucaipa residents defending their town were posted at least 587 times in Facebook groups and amassed over 24,000 likes and shares, according to the Times analysis. They were shared in pro-Trump and far-right Facebook groups, as well as other local community groups.

Farshad Shadloo, a YouTube spokesman, said that, like Facebook, the video service uses fact-checking panels to flag false information, and that the company aims to promote videos from authoritative sources about the protests.

On the same day, the conservative commentator and former Fox News host Todd Starnes published a blog post titled, “TOWN FIGHTS antifa: ‘They Just Beat the Ever-Loving Snot Out of Them.’” It collected over 48,000 likes and shares, and reached three million followers on Facebook.

A day later, the conspiracy website Infowars posted an article about the false narrative, which spread it further among followers of conspiracy groups and several Facebook groups dedicated to praising Mr. Trump.

A representative for Mr. Starnes said he was unavailable to respond.

The Yucaipa Police Department confirmed on Twitter that it had responded to reports of fights in public on June 1, but did not mention the involvement of antifa. A public information officer for the department pointed to a YouTube video posted last week, in which a Yucaipa police lieutenant, Julie Brumm-Landen, said that the city had not experienced looting or destruction from protests against racism.

“The information about antifa or planned criminal activity in Yucaipa is nothing more than internet speculation and false rumors,” Lt. Brumm-Landen added. “Any peaceful protests that takes place will have the full support and protection of the Yucaipa Police Department.” That video was viewed just 100 times.

A congressional candidate more than 2,000 miles from Yucaipa began spreading a similar message. The episode highlights how, even when a tech company removes false local information, the removal can come too late.

Marjorie Taylor Greene, a Republican congressional candidate in northwest Georgia and a professed believer in the fringe QAnon conspiracy theory, tweeted an ad for her campaign showing her holding an AR-15-style rifle and threatening antifa activists. “You won’t burn our churches, loot our businesses or destroy our homes,” she said in the ad. It was retweeted 20,000 times.

That same campaign ad was removed from Facebook two days later, but not before it racked up over 1.2 million views on the site. “We removed it because it advocates the use of deadly weapons against a clearly defined group of people, which violates our policies against inciting violence,” said Andrea Vallone, a Facebook spokeswoman.

No group of antifa activists ever arrived in Georgia. But that didn’t seem to hurt Ms. Greene’s political campaign. One week after her ad was posted, she finished first in her primary, winning 41 percent of the vote in the strongly Republican 14th Congressional District, and she has a strong chance of winning a runoff vote in August.

Ms. Greene, who has a history of making offensive remarks about Black people, Jews and Muslims, appears to have no regrets about spreading unfounded rumors of antifa coming to town.

“I’m sick and tired of watching establishment Republicans play defense while the Fake News Media cheers on antifa terrorists, BLM rioters, and the woke cancel culture, as they burn our cities, loot our businesses, vandalize our memorials, and divide our nation,” Ms. Greene said in an emailed statement.

In late May and early June, a rumor spread that “two bus loads of antifa” were heading to Locust, N.C., about 25 miles east of Charlotte. It circulated in text messages among people in the area, far out of sight of any fact-checking organization.

On June 1, the rumor surfaced in Facebook groups with names like DeplorablePride.org and Albemarle News and Weather.

That same evening, the police in Locust posted a screenshot of a text that had been circulating in the community over the weekend. The text falsely claimed that police officers had been knocking on doors to warn that “a black organization is bringing 2 bus loads of people to walmart in locust with intentions on looting and burning down the suburbs.” The post, made on Facebook, assured residents that the police department had not been spreading the rumor.

Jeffrey Shew, the assistant chief of police at the Locust Police Department, said that all the residents who reached out to the police department to report the buses “had no direct knowledge” of violent protesters coming to town. He said they were only sharing what they saw on social media. By midnight on June 1, Mr. Shew said, it was clear that the rumors were untrue.

“No protests, groups looking to protest, or groups looking to riot occurred,” he said.

On June 2, the police posted another message on Facebook emphasizing that the rumors had no substance. The episode showed that local community members and institutions are often the ones on the front lines of debunking false rumors.

“We had absolutely zero confirmed credible information related to these activities however out of an abundance of caution we did arrange or stage extra resources and officers in Locust in the event there was any legitimacy to the posts,” the post by the Locust Police Department read. “Now in the morning after, we can 100% confirm there was zero truth to any of the posts that we observed.”

Posts containing the original rumor reached 27,855 followers on Facebook, according to the Times analysis. The police’s posts reached 2,966 followers on Facebook.
