On Election Day, Facebook and Twitter Did Better by Making Their Products Worse




That gust of wind you felt coming from Silicon Valley on Wednesday morning was the social media industry’s tentative sigh of relief.

For the last four years, executives at Facebook, Twitter, YouTube and other social media companies have been obsessed with a single, overarching goal: to avoid being blamed for wrecking the 2020 U.S. election, as they were in 2016, when Russian trolls and disinformation peddlers ran roughshod over their defenses.

So they wrote new rules. They built new products and hired new people. They conducted elaborate tabletop drills to plan for every possible election outcome. And on Election Day, they charged huge, around-the-clock teams with batting down hoaxes and false claims.

So far, it appears those efforts have averted the worst. Despite the frantic (and utterly predictable) attempts by President Trump and his allies to undermine the legitimacy of the vote in the states where he is losing, no major foreign interference campaigns have been unearthed this week, and Election Day itself was relatively quiet. Fake accounts and potentially dangerous groups have been taken down quickly, and Facebook and Twitter have been unusually proactive about slapping labels and warnings in front of premature claims of victory. (YouTube was a different story, as evidenced by the company’s slow, tepid response to a video that falsely claimed that Mr. Trump had won the election.)

The week is young, of course, and there’s still plenty of time for problems. Election-related disinformation is already trending up — some of it targeted at Latinos — and will only increase as votes are challenged in the courts, and conspiracy theorists capitalize on all the uncertainty to undermine confidence in the eventual results.

But the platforms’ worst fears haven’t yet materialized. That’s a good thing, and a credit to the employees of those companies who have been busy enforcing their rules.

At the same time, it’s worth examining how Twitter, Facebook and YouTube are averting election-related trouble, because it sheds light on the very real problems they still face.

For months, nearly every step these companies have taken to safeguard the election has involved slowing down, shutting off or otherwise hampering core parts of their products — in effect, defending democracy by making their apps worse.

They added friction to processes, like political ad-buying, that had previously been smooth and seamless. They brought in human experts to root out extremist groups and manually intervened to slow the spread of sketchy stories. They overrode their own algorithms to insert information from trusted experts into users’ feeds. And as results came in, they relied on the calls made by news organizations like The Associated Press, rather than trusting that their systems would naturally bring the truth to the surface.


Nowhere was this shift more apparent than at Facebook, which for years envisioned itself as a kind of post-human communication platform. Mark Zuckerberg, the company’s chief executive, often spoke about his philosophy of “frictionless” design — making things as easy as possible for users. Other executives I talked to seemed to believe that ultimately, Facebook would become a kind of self-policing machine, with artificial intelligence doing most of the dirty work and humans intervening as little as possible.

But in the lead-up to the 2020 election, Facebook went in the opposite direction. It put in place a new, cumbersome approval process for political advertisers, and blocked new political ads in the period after Election Day. It throttled false claims, and put in place a “virality circuit-breaker” to give fact-checkers time to evaluate suspicious stories. And it temporarily shut off its recommendation algorithm for certain types of private groups, to lessen the possibility of violent unrest. (On Thursday, The New York Times reported that the company was taking other temporary measures to tamp down election-related misinformation, including adding more friction to the process of sharing posts.)
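The “circuit-breaker” idea is easier to picture as code. The sketch below is purely illustrative and says nothing about how Facebook’s actual systems work; the class name, the share-rate threshold and the review window are all invented. It shows only the general pattern: when a post’s sharing velocity spikes, algorithmic amplification is paused for a fixed window so that human fact-checkers have time to catch up.

```python
# Hypothetical sketch of a "virality circuit-breaker." Names and thresholds
# are invented for illustration and do not reflect any real platform's code.
import time

SHARE_RATE_LIMIT = 500        # shares per minute that trips the breaker (made up)
REVIEW_WINDOW_SECONDS = 3600  # how long distribution stays throttled pending review


class ViralityCircuitBreaker:
    """Pause algorithmic amplification of a fast-spreading post until
    reviewers have had a chance to look at it."""

    def __init__(self):
        self.tripped_at = {}  # post_id -> timestamp when the breaker tripped

    def record_shares(self, post_id, shares_last_minute):
        # Trip the breaker when sharing velocity exceeds the threshold.
        if shares_last_minute > SHARE_RATE_LIMIT and post_id not in self.tripped_at:
            self.tripped_at[post_id] = time.time()

    def is_throttled(self, post_id):
        # While throttled, the post can still be viewed directly, but it is
        # excluded from recommendations and demoted in ranked feeds.
        tripped = self.tripped_at.get(post_id)
        if tripped is None:
            return False
        return (time.time() - tripped) < REVIEW_WINDOW_SECONDS

    def clear(self, post_id):
        # Called once reviewers clear the post; normal distribution resumes.
        self.tripped_at.pop(post_id, None)
```

The point of the pattern is that nothing is deleted: the post keeps existing, but the machinery that would normally accelerate it is switched off for a while, which is exactly the kind of deliberate friction described above.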

All of these changes may, in fact, make Facebook safer. But they also involve dialing back the very features that have powered the platform’s growth for years. It’s a telling act of self-awareness, as if Ferrari had realized that it could only stop its cars from crashing by replacing the engines with go-kart motors.


YouTube didn’t act nearly as aggressively this week, but it has also changed its platform in revealing ways. Last year, it tweaked its vaunted recommendation algorithm to slow the spread of so-called borderline content. And it started promoting “authoritative sources” during breaking news events, to prevent cranks and conspiracy theorists from filling up the search results.
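To make that kind of tweak concrete, here is a minimal, hypothetical reranking pass showing what “promoting authoritative sources” and demoting borderline content can look like in principle. The function, the weights and the sample data are assumptions made up for illustration, not a description of YouTube’s ranking system.

```python
# Purely illustrative reranking sketch; the "authoritative" labels, weights and
# score formula are assumptions, not any real platform's ranking logic.
def rerank_for_breaking_news(results, authoritative_channels, boost=2.0, demote=0.5):
    """Boost results from vetted news channels during a breaking-news query
    and demote anything flagged as borderline, without removing it."""
    reranked = []
    for item in results:
        score = item["relevance"]
        if item["channel"] in authoritative_channels:
            score *= boost    # surface vetted outlets first
        elif item.get("borderline"):
            score *= demote   # push borderline content down the list
        reranked.append({**item, "score": score})
    return sorted(reranked, key=lambda r: r["score"], reverse=True)


# Example usage with made-up data: the vetted channel outranks the
# higher-relevance but borderline upload.
results = [
    {"channel": "RandomUploader", "relevance": 0.9, "borderline": True},
    {"channel": "AP", "relevance": 0.7},
]
print(rerank_for_breaking_news(results, authoritative_channels={"AP"}))
```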

All of this raises the critical question of what, exactly, will happen once the election is over and the spotlight has swiveled away from Silicon Valley. Will the warning labels and circuit-breakers be retired? Will the troublesome algorithms get turned back on? Do we just revert to social media as normal?

Camille François, the chief innovation officer of Graphika, a firm that investigates disinformation on social media, said it was too early to say whether these companies’ precautions had worked as intended. But she conceded that this level of hypervigilance might not last.

“There were a lot of emergency processes put in place at the platforms,” she said. “The sustainability and the scalability of those processes is a fair question to ask.”

Eli Pariser, the activist and author of “The Filter Bubble,” said that the platforms’ work to prevent election interference this year raised bigger questions about how they will respond to other threats.

“These platforms are used for really important conversations every day,” Mr. Pariser said. “If you do this for U.S. elections, why not other countries’ elections? Why not climate change? Why not acts of violence?”

These are the right questions to ask. The social media companies may have gotten through election night without a disaster. But as with the election itself, the real fights are still ahead.
