A Journey Into the Misinformation Fever Swamps


“Fake news” has gone from a hot buzzword popularized during the 2016 presidential campaign to an ever-present phenomenon known more formally as misinformation or disinformation.

Whatever you call it, sowing F.U.D. — fear, uncertainty and doubt — is now a full-time and often lucrative occupation for the malign foreign actors and even ordinary U.S. citizens who try to influence American politics by publishing information they know to be false.

Several of my colleagues here at The New York Times track the trends and shifting tactics of these fraudsters on their daily beats. So I exchanged messages this week with Sheera Frenkel, Tiffany Hsu and Stuart A. Thompson, all three of whom spend their days swimming in the muck brewed by fake news purveyors here and abroad.

Our conversation, lightly edited for length and clarity:

This is a political newsletter, so let me ask my first question this way: What are you seeing out there that is new during this election cycle, in terms of tactics or topics?

Sheera Frenkel: I’d say it’s the way misinformation has shifted slightly, in that you don’t have the same type of superspreaders on platforms like Twitter and Facebook that you did in the 2020 election cycle. Instead, you have lots of smaller-scale accounts spreading misinformation across a dozen or more platforms. It is more pervasive and more deeply entrenched than in previous elections.

The most popular topics are largely rehashes of what was spread in the 2020 election cycle. There are a lot of false claims about voter fraud that we first saw made as early as 2016 and 2018. Newspapers, including The New York Times, have debunked many of those claims. That doesn’t seem to stop bad actors from spreading them or people from believing them.

Then there are new claims, or themes, that are being spread by more fringe groups and extremist movements that we have started to track.

Tiffany Hsu: Sheera first noticed a while back that there was a lot of chatter about “civil war.” And, quickly, we started to see it everywhere — this strikingly aggressive rhetoric that intensified after the F.B.I. searched Mar-a-Lago and with the passage of a bill that will give more resources to the I.R.S.

For example, after the F.B.I. search, someone said on Truth Social, the social media platform started by former President Donald J. Trump, that “sometimes clearing out dangerous vermin requires a modicum of violence, unfortunately.”

We have seen a fair amount of “lock and load” chatter. But there is also pushback on the right, with people claiming without evidence that federal law enforcement or the Democrats are planting violent language to frame conservative patriots as extremists and insurrectionists.

Stuart A. Thompson: I’m always surprised by how much organization is happening around misinformation. It’s not just family members sharing fake news on Facebook anymore. There’s a lot of money sloshing around. There are lots of very well-organized groups that are trying to turn the attention paid to voter fraud and other conspiracy theories into personal income and political results. It’s a very organized machine at this point, after two years of organizing around the 2020 election. This feels different from previous moments when disinformation seemed to take hold in the country. It’s not just a fleeting interest spurred by a few partisan voices. It’s an entire community and social network and pastime for millions of people.

Sheera, you’ve covered Silicon Valley for years. How much progress would you say the big social media players — Facebook/Meta, Twitter and Google, which owns YouTube — have made in tackling the problems that arose during the 2016 election? What’s working and what’s not?

Sheera: When we talk about 2016, we are largely talking about foreign election interference. In that case, Russia tried to interfere with U.S. elections by using social media platforms to sow divisions among Americans.

Today, the problem of foreign election interference hasn’t been solved, but it is nowhere near the scale it once was. Companies like Meta, which owns Facebook, and Twitter announce regular takedowns of networks run by Russia, Iran and China that aim to spread disinformation or influence people online. Millions of dollars have been spent on security teams at those companies to stop foreign actors from spreading disinformation.

And while the job is never finished (bad actors are always innovating!), they’ve made a huge amount of progress in taking down these networks. This week, they even announced for the first time that they had removed an influence operation promoting U.S. interests abroad.

What has been harder is what to do about Americans’ spreading misinformation to other Americans, and what to do with fringe political movements and conspiracies that continue to spread under the banner of free speech.

Many of these social media companies have ended up in exactly the position they hoped to avoid: making one-off decisions about when to remove movements like QAnon or voter fraud misinformation that begins to go viral.




Tiffany, you’re coming to this beat with fresh eyes. What have you found most surprising since you began reporting on this subject?

Tiffany: The speed with which rumors and conspiracy theories are created and spread was stunning to me. I remember scrambling to report my first official story on the beat, with Sheera and Stuart, about the viral falsehoods that circulated after the Uvalde shooting. I heard about the attack within an hour of it beginning and quickly began checking social networks and online forums. By then, false narratives about the situation had begun to mutate and dozens of copycat accounts pretending to belong to the gunman had already appeared.

Stuart, what do you think we in the political journalism world miss or get wrong on your beat? I know some reporters privately think some of the breathless claims about how Russia affected the 2016 election were overblown. Is there a disconnect between how tech types and political types see the problems?

Stuart: My sense from the public (and maybe some political reporters) is that this is a momentary problem and one we will solve. Russia had a significant role in spreading disinformation in 2016, which got a lot of attention — maybe too much compared to the even more significant role that Americans played in spreading falsehoods that year.

America’s own disinformation problem has only gotten much worse. About 70 percent of Republicans suspect fraud in the 2020 presidential election. That’s millions and millions of people. They are extremely devoted to these theories, based on hardly any evidence, and will not be easily swayed to another perspective. That belief created a cottage industry of influencers, conferences and organizations devoted to converting the conspiracy theory into political results, including running candidates in races from election board to governor and passing laws that limit voting access.

And it’s working. In Arizona, Michigan, Nevada and Pennsylvania, Republicans who back the voter-fraud myth won primary races for governor, attorney general or secretary of state, often trouncing more establishment candidates who accepted the 2020 results. If they win in the general election, they could effectively control how elections are run in their states.

So, say whatever you will about Russia in 2016. Despite major efforts by social media companies to crack down on falsehoods, the disinformation problem is much worse today than it was then. And that’s not going away.

Have any of you detected a sense, after Covid, that sometimes the social media companies went too far in censoring views that were contrarian or outside the mainstream? Or is the conventional wisdom that they didn’t go far enough?

Stuart: No one envies the position that social media companies find themselves in now. Misinformation does real damage, especially with Covid, and social media companies bear a responsibility to limit its spread.

Do they go too far sometimes? Maybe. Do they not go far enough sometimes? Maybe. Moderating disinformation isn’t a perfect science. Right now, the most reasonable thing we can hope for is that social media companies invest deeply in their moderation practices and continue to refine their approaches so that false information does less damage.

Thank you for reading On Politics, and for being a subscriber to The New York Times. We’ll see you on Monday. — Blake



