Carnegie Mellon University: CMU Misinformation Researchers Zero in on Climate Change

To more effectively combat misinformation, climate change communicators need to focus their messaging on key conspiracy theories that gain the most traction, according to two researchers at Carnegie Mellon University. In a recent paper, CMU’s Aman Tyagi and Kathleen M. Carley look at the polarization of climate change beliefs on social media, and offer targeted approaches toward reshaping beliefs.

This week, the Intergovernmental Panel on Climate Change released a report outlining faster and more irreversible changes due to carbon dioxide emissions than previously feared. Tyagi, who earned his Ph.D. in engineering and public policy at CMU, chose to focus on climate misinformation in order to apply his knowledge of computer science to this societal problem.

“There’s no lack of scientific evidence for climate change, but it’s still unfortunately being debated on social media as a partisan issue,” Tyagi said. “If people wrongly believe, then, that it isn’t man-made, we probably won’t see any major policies which successfully curb climate change. That’s the motivation behind this work.”

Tyagi and Carley collected a data set of 38 million unique tweets over 100 weeks. Using a state-of-the-art stance detection method, they were able to sort the data set into groups of believers and disbelievers of climate change science. Of the seven major conspiracy theories they identified, disbelievers primarily shared two: the chemtrails theory, which claims that the trails following high-altitude jets are chemical agents being sprayed for nefarious purposes; and the geoengineering theory, which claims that government experiments are causing climate change.

“Public policy researchers and communicators can largely shift their messaging to focus on these particular conspiracy theories. That will more efficiently target the problem,” Tyagi said.

According to Carley, a professor in the School of Computer Science, climate change conspiracy messaging is more cohesive and effective than the fact-based responses aimed at dispelling those myths.

“On the believers’ side, there’s no agreement upon consistent messaging and not a lot of coordination among the actors. This paper is a starting point to come together and make a concerted effort to counter this stream of false information,” Carley said.

The research also examines the role bots — automated accounts that can appear to be actual people — play in the spread of conspiracy theories. As explained by Carley, they aren’t the main drivers of misinformation, but they do contribute.

“Bots are not the central story here, but the role they play in these disinformation campaigns and spreading conspiracies is as amplifiers of the misinformation,” Carley said. “They’re building bridges between people, particularly in the anti-climate change side.”

The paper recommends that social media platforms take steps toward removing bot accounts.

After the study uncovered how conspiracy theories spread across Twitter, Tyagi said that social media platforms have a responsibility to adjust policies for the sake of global climate efforts.

“We were easily able to find so many conspiracy theories on one social media platform. These companies need to do more to stop the spread of disinformation on their sites,” Tyagi said. “They need to be more transparent, especially regarding the recommender systems they are using that lead to this kind of polarization.”

Tyagi looks to his home country, India, where climate change is already causing drinking water supply issues as it alters the monsoon. No matter what conspiracy comes next, Tyagi knows that climate change is all too real.

“We are seeing the same thing in vaccinations right now. So many who don’t believe in vaccines aren’t taking them, and that’s affecting everyone. It’s the same with climate change. If we aren’t all on board, we will all experience the consequences,” Tyagi said.

Stance Detection
Tyagi and Carley used ORA-PRO, a network analytic tool originally developed by the Center for Computational Analysis of Social and Organizational Systems (CASOS) at CMU that is capable of analyzing extremely large amounts of data. They were able to determine what stance (believer or disbeliever) an individual had taken on climate change based on their tweets. The tool’s algorithm used input labels (like hashtags or URL links that indicated a person’s stance) to break down the 7 million unique users identified in the study into two groups: 3.1 million disbelievers and 3.9 million believers.
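The general approach described above — seeding users with stance labels from telltale hashtags, then spreading those labels through the interaction network — can be sketched in a few lines of Python. This is an illustrative simplification, not the paper’s actual algorithm or the ORA-PRO implementation; the seed hashtags, the simple averaging rule, and all function names are assumptions for the example.

```python
from collections import defaultdict

# Hypothetical seed hashtags for illustration; the study's real label set is not given here.
BELIEVER_TAGS = {"#actonclimate", "#climateaction"}
DISBELIEVER_TAGS = {"#climatehoax", "#climatescam"}

def seed_stances(user_tweets):
    """Assign an initial stance score to users whose tweets contain seed hashtags."""
    stance = {}
    for user, tweets in user_tweets.items():
        score = 0
        for text in tweets:
            tags = {w.lower() for w in text.split() if w.startswith("#")}
            score += len(tags & BELIEVER_TAGS) - len(tags & DISBELIEVER_TAGS)
        if score > 0:
            stance[user] = 1.0    # believer
        elif score < 0:
            stance[user] = -1.0   # disbeliever
    return stance

def propagate(edges, seeds, iterations=10):
    """Spread seed stance scores over an undirected interaction (e.g. retweet) network."""
    neighbors = defaultdict(set)
    for u, v in edges:
        neighbors[u].add(v)
        neighbors[v].add(u)
    scores = defaultdict(float, seeds)
    for _ in range(iterations):
        new_scores = defaultdict(float)
        for user, nbrs in neighbors.items():
            if user in seeds:
                new_scores[user] = seeds[user]  # seed labels stay fixed
            elif nbrs:
                # Unlabeled users take the average stance of their neighbors.
                new_scores[user] = sum(scores[n] for n in nbrs) / len(nbrs)
        scores = new_scores
    return {u: ("believer" if s > 0 else "disbeliever")
            for u, s in scores.items() if s != 0}
```

For example, a user with no stance-bearing hashtags still gets classified if they interact with labeled users:

```python
tweets = {"a": ["great news #ActOnClimate"], "b": ["#climatehoax again"], "c": ["no hashtags here"]}
edges = [("a", "c"), ("b", "d")]
result = propagate(edges, seed_stances(tweets))
# "c" inherits believer stance from "a"; "d" inherits disbeliever stance from "b"
```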
