Don’t Be So Certain That Social Media Is Undermining Democracy

If you just read The Atlantic to get your technology news, you might get the impression that social media is a Leviathan on an unshakable path to destroying democracy.

The headlines scream that Facebook is a “Doomsday Machine” and an “authoritarian, hostile foreign power” that made American life “uniquely stupid.” Recently, The Atlantic’s title for an article by Jonathan Haidt made its stance clear: “Yes, Social Media Really Is Undermining Democracy.”

Whatever the magazine’s editorial stance, these claims lack a firm empirical basis, and they are unlikely to go away any time soon. Scary stories have a way of going viral and capturing attention in ways that “we don’t know yet” never will.

Professor Haidt, a social psychologist at New York University, notes in The Atlantic that “contradictory studies are common in social science research.” However, his “undermining democracy” essay downplays the weight of contradictory evidence and the critical studies of concepts such as, to cite one example, filter bubbles.

A team of researchers at the University of Amsterdam asked in 2016 whether we should worry about filter bubbles and, after considering the empirical evidence, answered “no.”

Professor Axel Bruns, a media scholar at Queensland University of Technology and author of “Are Filter Bubbles Real?”, reviewed the data that support the concepts of echo chambers and filter bubbles as well as the data that do not. He concluded that the concepts rest on a “fragile foundation” of mainly anecdotal evidence. Bruns makes a valid point: focusing on those theories prevents us from properly confronting the deeper causes of division in politics and society.

In an in-depth response to Haidt’s Atlantic piece, New Yorker writer Gideon Lewis-Kraus found that the researchers he interviewed acknowledged there is far less scientific consensus about the positive or negative impact of social media than many people think. “Research shows that most of us are actually exposed to a wider range of views on social media than we are in real life, where our social networks, in the original usage of the term, are rarely heterogeneous,” wrote Lewis-Kraus, adding that Haidt later told him he no longer thought the echo chamber was “as widespread a problem as he had imagined.”

More views, more hostility

Being exposed to more views, though, poses a different problem. According to Professor Michael Bang Petersen, a political scientist at Aarhus University, that is where much of the perceived hostility of social media comes from: not because the sites make us behave differently, but because they show us many things we don’t normally encounter in our daily lives.

While the media and activists have been obsessed with disinformation and misinformation on social media since the 2016 US presidential election, researchers from Harvard University analyzed the extent of both mainstream and social media coverage of that election and concluded: “The wave of attention paid to fake news has a basis in fact, but at least in the 2016 election, it seems to have played a relatively minor role in the overall scheme of things.”

Facebook COO Sheryl Sandberg and Twitter CEO Jack Dorsey testify during a 2018 Senate Intelligence Committee hearing in Washington, D.C.

Illustrated by Luis G. Rendon / The Daily Beast / Getty

Accompanying the debate is a massive Google Doc, compiled by Haidt and Chris Bail, a professor of sociology and public policy at Duke University, titled “Social Media and Political Dysfunction: A Collaborative Review.” It includes studies that found a negative influence of social media on democracy alongside other studies that found no such effect or were inconclusive.

Bail has shown that the number of people exposed to fake news is quite low: only about two percent of Twitter users regularly see fake news. More importantly, they tend not to believe what they read when they do see it.

But it is Haidt, who has argued that “social media may not be the main cause of polarization, but it is an important cause,” who repeatedly highlights the role of social media in society’s ills. His entire thesis focuses on the 2010s, the decade when social media became practically ubiquitous, and since polarization also increased during that period, he draws extremely broad negative generalizations. Many of them simply don’t hold up to scrutiny.

Yet political segregation online predates the social media decade. One study that examined how political blogs interacted with each other during the 2004 US presidential election found a highly segregated blogging world: “Liberals and conservatives linking primarily within their separate communities, with far fewer cross-links exchanged between them. The division extended into their discussions, with liberal and conservative blogs focusing on different news articles, topics, and political figures.” No wonder the study is titled “Divided They Blog.” Its striking visualization shows one red cluster and one blue cluster, with very little overlap.

Then there is the concern about “rabbit holes,” where algorithms are said to take ordinary, everyday people and turn them into radical extremists. Professor Brendan Nyhan, a political scientist at Dartmouth College, found that apart from “some anecdotal evidence that this happens,” the more frequent and larger problem is that people “deliberately seek out vile content” through subscriptions, not through recommendation algorithms. That means they don’t fall into the radicalization hole; they choose it.

This is particularly worrisome at the fringes, where radicalized individuals find extremist content that reinforces their tendencies. The focus should be on this small segment of the population. It isn’t the average user, but it can be dangerous.

Overall, many of the popular stories about social media (filter bubbles, echo chambers, fake news, algorithmic radicalization) are simply unfounded. Correlational studies cannot determine which direction the effects of interest are running.

It is the confusion of correlation with causation that makes these stories so popular.

Techlash Era

Because the research is complex and ongoing, “it is difficult to say anything on this topic with absolute certainty,” Bail concludes. But we continue to see headlines written with absolute certainty, as if the threat posed by social media were an undeniable fact. It is not. And while scientists are still asking questions, the media keeps raising exclamation points.

The tech backlash against social media has been rapidly intensifying since 2017, the year Trump took office. There is a stark gap between the empirical evidence and the media’s doom-laden claims, which consistently exaggerate the harms. As the Techlash grows, this gap widens.

This is where the escalating rhetoric from the tech world’s hardline critics comes into play.

… journalists should beware of overconfident techies who brag about their innovations AND overconfident critics who denounce those innovations as atrocities.

For example, Haidt’s theme is destruction: America’s tech companies have “created products that now appear to be corrosive to democracy” and have brought us to “the brink of self-destruction.” It suits the nature of the media (long predating social media): hearing that we’re all doomed is far more interesting than a nuanced discussion.

The news media often cover social media through this Techlash filter. While Instagram filters make their subjects look shiny and beautiful, Techlash filters use hyperbolic metaphors (like “Doomsday Machine”) to make social networks look scarier.

In general, the media sets the topics for discussion (“what to think about”) and the framing of those topics (“how to think about these issues”). For all the concern about echo chambers on social networks, there is a familiar herd behavior in the news media, where journalists look at their colleagues’ work and cover the same topics from the same angles.

However, this mimicry often goes overboard and generates false outrage, leading tech companies to dismiss criticism (even when it’s valid) as uninformed mockery. Exaggerated tech coverage plays into their hands: when scary theories about the evils of social media are overblown, that becomes apparent in the companies’ PR pushback.

Facebook CEO Mark Zuckerberg talks to Senator John Thune (R-SD) after a joint hearing on Capitol Hill.

Illustrated by Luis G. Rendon / The Daily Beast / Getty

For example, internally at Facebook, Nick Clegg told employees to “listen and learn from criticism when it is fair, and push back strongly when it is not.” Citing “bad and polarizing” content on private messaging apps like Telegram and WhatsApp, Clegg wrote: “None of those apps deploy content or ranking algorithms. It’s just humans talking to humans without any machine getting in the way. We need to look at ourselves in the mirror, and not wrap ourselves in the false comfort that we have simply been manipulated by machines.”

Haidt accuses Meta (Facebook’s parent company) of cherry-picking studies in the company’s blog response to one of his recent Atlantic posts. But Haidt does the same thing in his own articles, in the opposite direction. The conflicting findings mean the debate isn’t over and there’s still a lot we don’t know.

In a recent Techdirt post, I argued that journalists should “beware of overconfident techies who brag about their innovations AND overconfident critics who denounce those innovations as atrocities.” Readers should adopt the same healthy skepticism.

There is a positive side to Techlash pressure. It makes big tech companies think harder about building in safeguards up front, asking “what could go wrong?” and deploying resources to combat potential harms. That is definitely a good thing.

However, when legislators push big bills to “fix” social media, the devil is always in the details, and we must think harder about how those bills take shape. There are real costs when regulators waste their time on overly simplistic solutions based on inconclusive evidence.

A legislative battle worth fighting would be lifting the shroud of tech companies’ notorious secrecy. As long as platforms remain black boxes while relying ever more heavily on recommendation algorithms, suspicion will only grow. Here, greater transparency is crucial.

Independent researchers should be given more data by big tech companies. That way, we can broaden the discussion and leave more room for a society-wide approach in which experts make informed arguments.

Haidt, in The Atlantic, admits that we “shouldn’t expect the social sciences to ‘solve’ the problem until the 2030s.”

If that’s the case, let’s avoid definitive headlines, because we have nothing close to definitive evidence.
