Congratulations. Reading this far into the story is a feat not many will accomplish, especially if the story was shared on Facebook, according to a team led by Penn State researchers.
In an analysis of more than 35 million public posts containing links that were shared extensively on the social media platform between 2017 and 2020, the researchers found that around 75% of the shares were made without the posters clicking the link first. Political content from both ends of the spectrum was shared without clicking more often than politically neutral content.
The findings, which the researchers said suggest that social media users tend to merely read headlines and blurbs rather than fully engage with core content, were published today (Nov. 19) in Nature Human Behaviour. While the data were limited to Facebook, the researchers said the findings likely extend to other social media platforms and help explain why misinformation can spread so quickly online.
“It was a big surprise to find out that more than 75% of the time, the links shared on Facebook were shared without the user clicking through first,” said corresponding author S. Shyam Sundar, Evan Pugh University Professor and the James P. Jimirro Professor of Media Effects at Penn State.
“I had assumed that if someone shared something, they read and thought about it, that they’re supporting or even championing the content. You might expect that maybe a few people would occasionally share content without thinking it through, but for most shares to be like this? That was a surprising, very scary finding.”
Access to the Facebook data was granted via Social Science One, a research consortium hosted by Harvard University’s Institute for Quantitative Social Science focused on obtaining and sharing social and behavioral data responsibly and ethically. The data were provided in collaboration with Meta, Facebook’s parent company, and included user demographics and behaviors, such as a “political page affinity score.”
This score was determined by external researchers identifying the pages users follow—like the accounts of media outlets and political figures. The researchers used the political page affinity score to assign users to one of five groups—very liberal, liberal, neutral, conservative and very conservative.
To determine the political content of shared links, the researchers in this study used machine learning, a form of artificial intelligence, to identify and classify political terms in the link content. They scored the content on a similar five-point political affinity scale, from very liberal to very conservative, based on how many times each affinity group shared the link.
“We created this new variable of political affinity of content based on 35 million Facebook posts during election season across four years. This is a meaningful period to understand macro-level patterns behind social media news sharing,” said co-author Eugene Cho Snyder, assistant professor of humanities and social sciences at New Jersey Institute of Technology.
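The paper derives this content-level affinity from the affinity groups of the users who shared each link. The sketch below illustrates one simple way such a score could be computed and binned onto the five-point scale; the group weights, share counts and cutoffs are illustrative assumptions, not the study's actual formula or data.

```python
# Hypothetical sketch: score one link's political affinity from the affinity
# groups of the users who shared it, then bin the score back onto the same
# five-point scale. The weights and thresholds below are assumptions made for
# illustration, not values taken from the study.

GROUP_WEIGHTS = {
    "very_liberal": -2, "liberal": -1, "neutral": 0,
    "conservative": 1, "very_conservative": 2,
}

def content_affinity(share_counts):
    """Weighted mean of sharer affinities for one link."""
    total = sum(share_counts.values())
    if total == 0:
        return 0.0
    return sum(GROUP_WEIGHTS[g] * n for g, n in share_counts.items()) / total

def to_five_point(score):
    """Map a continuous score back onto the five labels."""
    if score <= -1.5:
        return "very_liberal"
    if score <= -0.5:
        return "liberal"
    if score < 0.5:
        return "neutral"
    if score < 1.5:
        return "conservative"
    return "very_conservative"

# Example: a link shared mostly by conservative-leaning users (made-up counts).
counts = {"very_liberal": 50, "liberal": 120, "neutral": 300,
          "conservative": 900, "very_conservative": 600}
score = content_affinity(counts)
print(round(score, 2), to_five_point(score))  # 0.95 conservative
```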
The team validated the political affinity of news domains, such as CNN or Fox, based on the media bias chart produced by AllSides, an independent company focused on helping people understand the biases of news content, and a ratings system developed by researchers at Northeastern University.
With these rating systems, the team manually sorted 8,000 links, first identifying them as political or non-political content. Then the researchers used this dataset to train an algorithm that assessed 35 million links shared more than 100 times on Facebook by users in the United States.
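The study does not specify which model it used, but the workflow of training on a small, manually labeled set of links and then scoring millions more can be sketched with a standard text-classification pipeline. The scikit-learn model, features and stand-in examples below are assumptions for illustration only.

```python
# Minimal sketch of a supervised political/non-political link classifier:
# train on a small hand-labeled set, then apply it to unseen link text.
# The study labeled ~8,000 links; these few examples are stand-ins.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

texts = [
    "senate passes sweeping election reform bill",
    "new smartphone camera review and battery tests",
    "governor signs controversial immigration order",
    "local team wins championship in overtime thriller",
]
labels = ["political", "non_political", "political", "non_political"]

# TF-IDF features over unigrams/bigrams feeding a logistic regression.
clf = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    LogisticRegression(max_iter=1000),
)
clf.fit(texts, labels)

# Score previously unseen link text.
print(clf.predict(["president announces new tariff policy"]))
```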
“A pattern emerged that was confirmed at the level of individual links,” Snyder said. “The closer the political alignment of the content to the user—both liberal and conservative—the more it was shared without clicks. … They are simply forwarding things that seem on the surface to agree with their political ideology, not realizing that they may sometimes be sharing false information.”
The findings support the theory that many users read news stories only superficially, based on headlines and blurbs, Sundar said. He added that Meta also provided data from its third-party fact-checking service, which identified 2,969 of the shared URLs as linking to false content.
The researchers found that these links were shared more than 41 million times without being clicked. Of those no-click shares, 76.94% came from conservative users and 14.25% from liberal users. The researchers noted that the vast majority, up to 82%, of the links to false information in the dataset originated from conservative news domains.
To cut down on sharing without clicking, Sundar said that social media platforms could introduce “friction” to slow the share, such as requiring people to acknowledge that they have read the full content prior to sharing.
“Superficial processing of headlines and blurbs can be dangerous if false data are being shared and not investigated,” Sundar said, explaining that social media users may feel that content has already been vetted by those in their network sharing it, but this work shows that is unlikely. “If platforms implement a warning that the content might be false and make users acknowledge the danger in doing so, that might help people think before sharing.”
This wouldn’t stop intentional misinformation campaigns, Sundar said, and individuals still have a responsibility to vet the content they share.
“Disinformation or misinformation campaigns aim to sow the seeds of doubt or dissent in a democracy—the scope of these efforts came to light in the 2016 and 2020 elections,” Sundar said. “If people are sharing without clicking, they’re potentially playing into the disinformation and unwittingly contributing to these campaigns staged by hostile adversaries attempting to sow division and distrust.”
So why do people share without clicking in the first place?
“The reason this happens may be because people are just bombarded with information and are not stopping to think through it,” Sundar said. “In such an environment, misinformation has more of a chance of going viral. Hopefully, people will learn from our study and become more media literate, digitally savvy and, ultimately, more aware of what they are sharing.”
More information: S. Shyam Sundar et al, Sharing without clicking on news in social media, Nature Human Behaviour (2024). DOI: 10.1038/s41562-024-02067-4
Provided by Ashley WennersHerron
Citation: Social media users probably won’t read beyond this headline, researchers say (2024, November 19), retrieved 19 November 2024 from https://phys.org/news/2024-11-social-media-users-wont-headline.html