Patil part of international group studying tools to curb spread of misinformation


By: Ken Bikoff

Apr 28, 2020



Sameer Patil, an assistant professor at the Luddy School of Informatics, Computing, and Engineering, is part of an international group of researchers that studied how the spread of misinformation on social media can be curbed with tools that rate the credibility of shared items.

In “Effects of Credibility Indicators on Social Media News Sharing Intent,” a study published at the Association for Computing Machinery CHI Conference on Human Factors in Computing Systems, Patil and his colleagues explored how various credibility indicators affect users’ propensity to spread misinformation, or “fake news.” The indicators referred to four common information sources—fact checkers, the news media, the general public, and artificial intelligence—and the researchers found that their effectiveness varied. Effectiveness also differed by demographics and political orientation.

Patil has long been interested in the spread of misinformation, and a colleague at New York University, Nasir Memon, put him in touch with two researchers at NYU Abu Dhabi, Waheeb Yaqub and Otari Kakhidze. The pair was interested in conducting a study to understand people’s motivations for sharing misinformation on social media while exploring ways to curb it, and Patil saw an opportunity to collaborate. He recruited an IU undergraduate student double majoring in political science and informatics, Morgan Brockman, to help with the effort.

The researchers used a set of 72 news headlines—24 each in the categories of true, false, and satire—and each study participant was presented with 12 headlines in random order. True stories received no indicator. For false or satirical headlines, a red credibility indicator informed the participant that a fact checker, the news media, the general public, or an AI algorithm had flagged the story as less than credible. Participants were then asked whether they would be likely to share the corresponding article with their friends on social media, such as Facebook.

The indicators made participants 2.5 times less likely to share non-true headlines compared with a control group that saw no indicators. The fact-checker indicator was the most effective at dissuading participants from sharing non-true headlines, while the AI indicator had the least impact. The news media and general public indicators performed only slightly better than the AI.

The researchers also collected information on participants’ political affiliation in one of three categories: Democrat, Republican, or Independent. Republicans and Independents were more likely than Democrats to share non-true headlines relative to the control group, and all three groups responded most strongly to the fact-checker indicator, with the news media and the general public having a lesser effect. The AI indicator, however, actually increased willingness to share a non-true story relative to the control group.

“Fact checkers appeared to have the biggest impact on reducing people’s inclination to share non-true stories,” Patil said. “This finding applied across demographics and political affiliations. However, the AI method rated poorly compared to other indicators; in fact, it had the opposite of the intended effect when it came to Republicans.”

Although the findings are a promising step, identifying misinformation and alerting users to it remains a critical challenge, one that will require future work building on research such as Patil’s.

“The biggest danger posed by fake news is the collateral damage where pretty much everything is viewed with distrust as fake news,” Patil said. “That can greatly undermine the distribution and impact of legitimate and factual information, which is critical in a democracy.”

The CHI 2020 conference was scheduled to be held April 25-30 in Honolulu, Hawai’i, but was canceled because of the ongoing COVID-19 pandemic.

“Identifying ways to solve real-world problems is one of our main priorities at the Luddy School,” said Kay Connelly, the associate dean of research at Luddy. “Sameer’s research provides fantastic insight into one of the more important issues facing our society, and it shows how much of an impact collaboration can have on research. It’s also great to have an undergraduate student as part of the work, which highlights the research opportunities available for students at the school.”

Media Contact

Ken Bikoff
Communications Specialist
Phone: (812) 856-6908

Reposted from Luddy News with permission