
A Journey Into Misinformation on Social Media

“Fake news” has gone from a hot buzzword popularized during the 2016 presidential campaign to an ever-present phenomenon known more formally as misinformation or disinformation.

Whatever you call it, sowing F.U.D. — fear, uncertainty and doubt — is now a full-time and often lucrative occupation for the malign foreign actors and even ordinary U.S. residents who attempt to influence American politics by publishing information they know to be false.

Several of my colleagues here at The New York Times track the trends and shifting tactics of those fraudsters on their daily beats. So I exchanged messages this week with Sheera Frenkel, Tiffany Hsu and Stuart A. Thompson, all three of whom spend their days swimming in the muck brewed by fake news purveyors here and abroad.

Our conversation, lightly edited for length and clarity:

This is a political newsletter, so let me ask my first question this way: What are you seeing out there that’s new during this election cycle, in terms of tactics or topics?

Sheera Frenkel: I’d say it’s the way misinformation has shifted slightly, in that you don’t have the same kind of superspreaders on platforms like Twitter and Facebook that you did in the 2020 election cycle. Instead, you have a lot of smaller-scale accounts spreading misinformation across a dozen or more platforms. It’s more pervasive and more deeply entrenched than in previous elections.

The most popular topics are largely rehashes of what was spread in the 2020 election cycle. There are a lot of false claims about voter fraud that we first saw made as early as 2016 and 2018. Newspapers, including The New York Times, have debunked many of those claims. That doesn’t appear to stop bad actors from spreading them or people from believing them.

Then there are new claims, or themes, that are being spread by more fringe groups and extremist movements that we have started to track.

Tiffany Hsu: Sheera first noticed some time back that there was a lot of chatter about “civil war.” And, quickly, we started to see it everywhere — this strikingly aggressive rhetoric that intensified after the F.B.I. searched Mar-a-Lago and with the passage of a bill that will give more resources to the I.R.S.

For instance, after the F.B.I. search, someone said on Truth Social, the social media platform started by Trump, that “sometimes clearing out dangerous vermin requires a modicum of violence, unfortunately.”

We’ve seen a fair amount of “lock and load” chatter. But there’s also pushback on the right, with people claiming without evidence that federal law enforcement or the Democrats are planting violent language to frame conservative patriots as extremists and insurrectionists.

Stuart A. Thompson: I’m always surprised by how much organization is happening around misinformation. It’s not just relatives sharing fake news on Facebook anymore. There’s a lot of money sloshing around. There are a lot of very well-organized groups that are trying to turn the attention around voter fraud and other conspiracy theories into personal income and political outcomes. It’s a very organized machine at this point, after two years of organizing around the 2020 election. This feels different from previous moments when disinformation seemed to take hold in the country. It’s not just a fleeting interest spurred by a few partisan voices. It’s an entire community and social network and hobby for millions of people.

Sheera, you’ve covered Silicon Valley for years. How much progress would you say the big social media players — Facebook/Meta, Twitter and Google, which owns YouTube — have made in tackling the problems that arose during the 2016 election? What’s working and what’s not?

Sheera: When we talk about 2016, we’re largely talking about foreign election interference. In that case, Russia tried to interfere with U.S. elections by using social media platforms to sow divisions among Americans.

Today, the problem of foreign election interference hasn’t been solved, but it is nowhere near the scale it once was. Companies like Meta, which owns Facebook, and Twitter announce regular takedowns of networks run by Russia, Iran and China aiming to spread disinformation or influence people online. Millions have been spent on security teams at those companies to make sure they’re keeping foreign actors from spreading disinformation.

And while it is not a done deal (bad actors are always innovating!), they’ve made an enormous amount of progress in taking down these networks. This week, they even announced for the first time that they had removed a foreign influence operation promoting U.S. interests abroad.

What has been harder is what to do about Americans’ spreading misinformation to other Americans, and what to do with fringe political movements and conspiracies that continue to spread under the banner of free speech.

Many of these social media companies have ended up exactly in the position they hoped to avoid — making one-off decisions on when they remove movements like the QAnon conspiracy group or voter fraud misinformation that begins to go viral.

How Times reporters cover politics.
We rely on our journalists to be independent observers. So while Times staff members may vote, they are not allowed to endorse or campaign for candidates or political causes. This includes participating in marches or rallies in support of a movement or giving money to, or raising money for, any political candidate or election cause.

Tiffany, you’re coming to this beat with fresh eyes. What have you found most surprising since you started reporting on this subject?

Tiffany: The speed with which rumors and conspiracy theories are created and spread was stunning to me. I remember scrambling to report my first official story on the beat, with Sheera and Stuart, about the viral falsehoods that circulated after the Uvalde shooting. I heard about the attack within an hour of it starting and quickly started checking social networks and online forums. By then, false narratives about the situation had begun to mutate and dozens of copycat accounts pretending to belong to the gunman had already appeared.

Stuart, what do you think we in the political journalism world miss or get wrong on your beat? I know some reporters privately think some of the breathless claims about how Russia affected the 2016 election were overblown. Is there a disconnect between how tech types and political types see the issues?

Stuart: My sense from the general public (and maybe some political reporters) is that this is a momentary problem and one we will solve. Russia had a significant role in spreading disinformation in 2016, which got a lot of attention — maybe too much compared with the far more significant role that Americans played in spreading falsehoods that year.

America’s own disinformation problem has only gotten much worse. About 70 percent of Republicans suspect fraud in the 2020 presidential election. That’s millions and millions of people. They’re extremely dedicated to these theories, based on hardly any evidence, and won’t be easily swayed to another point of view. That belief created a cottage industry of influencers, conferences and organizations dedicated to converting the conspiracy theory into political outcomes, including running candidates in races from election board to governor and passing laws that limit voting access.

And it’s working. In Arizona, Michigan, Nevada and Pennsylvania, Republicans who back the voter-fraud myth won primary races for governor, attorney general or secretary of state — often trouncing more establishment candidates who generally supported the 2020 results. If they win in the general election, they would effectively control how elections are run in their states.

So, say what you will about Russia in 2016. Despite major efforts by social media companies to crack down on falsehoods, the disinformation problem is far worse today than it was then. And that’s not going away.

Have any of you detected a sense, after Covid, that sometimes the social media companies went too far in censoring views that were contrarian or outside the mainstream? Or is the conventional wisdom that they didn’t go far enough?

Stuart: Nobody envies the position that social media companies find themselves in now. Misinformation does real damage, especially with Covid, and social media companies bear responsibility to limit its spread.

Do they go too far sometimes? Probably. Do they not go far enough sometimes? Probably. Moderating disinformation isn’t a perfect science. Right now, the most reasonable thing we can hope for is that social media companies invest deeply in their moderation practices and continue to refine their approaches so that false information does less damage.

Thanks for reading On Politics, and for being a subscriber to The New York Times. We’ll see you on Monday. — Blake

Read past editions of the newsletter here.

If you’re enjoying what you’re reading, please consider recommending it to others. They can sign up here. Browse all of our subscriber-only newsletters here.

Have feedback? Ideas for coverage? We’d love to hear from you. Email us at onpolitics@nytimes.com.
