Researchers Reveal Startling Trend: Most Shares on Social Media Happen Without Clicking the Links

A new Penn State study reveals that more than 75% of Facebook shares occur without users clicking the links, highlighting a significant issue in how misinformation proliferates on social media.

If you've read this far, you're in the minority. According to a new study led by Penn State researchers, the vast majority of social media users share links without reading the articles first. The trend emerged from an analysis of more than 35 million public Facebook posts shared between 2017 and 2020, which revealed that over 75% of links were shared without being clicked.

"It was a big surprise to find out that more than 75% of the time, the links shared on Facebook were shared without the user clicking through first," corresponding author S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State, said in a news release. "I had assumed that if someone shared something, they read and thought about it, that they're supporting or even championing the content. You might expect that maybe a few people would occasionally share content without thinking it through, but for most shares to be like this? That was a surprising, very scary finding."

The study, published in Nature Human Behaviour, shows a worrying pattern: politically charged content from both liberal and conservative sources is shared without clicking more often than politically neutral content. This kind of superficial engagement contributes significantly to the spread of misinformation.

To gauge the political character of content, the researchers used machine learning to classify political terms and assigned each link a score reflecting how often it was shared by users across the political spectrum. They validated these scores against independent rating systems and manually sorted thousands of links to train their algorithm.
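The article doesn't detail the paper's exact pipeline, but a minimal sketch can illustrate the underlying idea: if each sharer carries a page-affinity score (here, hypothetically, from -2 for very liberal to +2 for very conservative, echoing the "political page affinity score" described below), a link's political affinity can be estimated by aggregating the scores of everyone who shared it. All names and data in this sketch are invented for illustration:

```python
from collections import defaultdict
from statistics import mean

# Hypothetical share records: (url, sharer's page-affinity score).
# Scores run from -2 (very liberal) to +2 (very conservative); the
# data is invented, not drawn from the study.
shares = [
    ("example.com/a", -2), ("example.com/a", -1), ("example.com/a", -2),
    ("example.com/b",  2), ("example.com/b",  1),
    ("example.com/c",  0), ("example.com/c", -1), ("example.com/c",  1),
]

def content_affinity(shares):
    """Estimate each link's political affinity as the mean
    affinity score of the users who shared it."""
    by_url = defaultdict(list)
    for url, affinity in shares:
        by_url[url].append(affinity)
    return {url: mean(scores) for url, scores in by_url.items()}

print(content_affinity(shares))
# {'example.com/a': -1.67 (liberal-leaning),
#  'example.com/b': 1.5 (conservative-leaning),
#  'example.com/c': 0.0 (neutral)}  -- approximate values
```

In the study itself, scores like these were then checked against independent rating systems and thousands of manually sorted links, rather than taken at face value.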

"We created this new variable of political affinity of content based on 35 million Facebook posts during election season across four years. This is a meaningful period to understand macro-level patterns behind social media news sharing," co-author Eugene Cho Snyder, an assistant professor at New Jersey Institute of Technology, said in the news release.

The research used data provided by Meta, Facebook's parent company, through Social Science One and Harvard University's Institute for Quantitative Social Science. It included demographic and behavioral metrics such as the "political page affinity score," which categorized users based on the pages they followed.

The implications are sobering. Sundar pointed out that users might share content without realizing it's false, assuming it's credible because it looks familiar or aligns with their beliefs.

Meta's third-party fact-checking service flagged nearly 3,000 of the URLs in the study as false. Those links were shared more than 41 million times without clicks, 82% of the time by conservative users and 14.25% by liberal users.

"A pattern emerged that was confirmed at the level of individual links," Snyder added. "The closer the political alignment of the content to the user -- both liberal and conservative -- the more it was shared without clicks. ... They are simply forwarding things that seem on the surface to agree with their political ideology, not realizing that they may sometimes be sharing false information."

To counter this phenomenon, Sundar suggests social media platforms might introduce friction to slow the sharing process, such as prompts to read before sharing.

"Superficial processing of headlines and blurbs can be dangerous if false data are being shared and not investigated," added Sundar.

He emphasized that this wouldn't stop intentional disinformation campaigns but could encourage users to be more discerning.
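To make that suggestion concrete, here is a minimal, hypothetical sketch of read-before-sharing friction. The function names and flow are invented for illustration and do not reflect any platform's actual API:

```python
# Hypothetical sketch: withhold the share action until the user has
# either opened the link or explicitly confirmed sharing it unread.
def attempt_share(url: str, user_clicked_link: bool, confirm) -> bool:
    """Allow the share immediately if the user opened the link;
    otherwise interpose a prompt before completing the share."""
    if user_clicked_link:
        return True  # no friction needed
    # The prompt itself is the friction: one extra, deliberate step.
    return confirm(f"You haven't opened {url}. Share it anyway?")

# Example: auto-decline the prompt to simulate a user who reconsiders.
shared = attempt_share("example.com/story", user_clicked_link=False,
                       confirm=lambda msg: False)
print("Shared" if shared else "Share cancelled")  # Share cancelled
```

The design point is simply that the extra step interrupts reflexive forwarding; as Sundar notes, it would not deter deliberate disinformation campaigns.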

At the heart of this finding is a push for greater media literacy and responsibility. With misinformation having played a notable role in the 2016 and 2020 elections, there's a critical need for users to vet content before sharing.

"The reason this happens may be because people are just bombarded with information and are not stopping to think through it," Sundar added. "In such an environment, misinformation has more of a chance of going viral. Hopefully, people will learn from our study and become more media literate, digitally savvy and, ultimately, more aware of what they are sharing."

The findings not only shed light on how users interact with content in the information age but also underscore the need for more responsible sharing practices to curb the spread of misinformation.
