UTS Researcher Receives Top Defence Industry Award
Dr Marian-Andrei Rizoiu's work countering misinformation and disinformation online, including state-backed information operations and digital propaganda, has been recognised at the 2023 Australian Defence Industry Awards, where he received the prestigious Excellence Award as well as Academic of the Year.
“My main goal is to understand how information flows online, and what makes it viral. I develop tools to study misinformation and disinformation: to detect it, predict its importance, and design countermeasures,” said Dr Rizoiu, who leads the Behavioural Data Science Lab at the University of Technology Sydney (UTS).
“Winning this award is a recognition of my research’s value. It confirms that my interdisciplinary approach, methods, tools and data are on the right path to solving real-world problems for Defence,” he said.
Professor Peta Wyeth, Dean of the UTS Faculty of Engineering and IT, congratulated Dr Rizoiu on this significant achievement.
“These awards recognise the significant contributions of Marian-Andrei’s research, as he collaborates with Defence partners to address the escalating challenge of misinformation detection,” she said.
Dr Rizoiu said that while online social media was intended to provide equal access to information, it has instead become a breeding ground for harmful content such as misinformation, conspiracy theories and extremist narratives.
“We witnessed protests against mask-wearing during COVID-19 and a rise in unfounded fears surrounding vaccinations. Past experiences show that our deliberative democratic processes, such as elections, are vulnerable to online influence and opinion manipulation,” Dr Rizoiu said.
His research combats these issues across the full lifecycle of misinformation. His team has built a software system designed for law enforcement, security and intelligence agencies, which allows them to monitor, detect and react to information operations (IO).
“I wanted to apply my research to a problem with real-world impact. The more I dug into the topic of mis- and disinformation, the more I realised it is a genuine societal issue that is here to stay,” he said.
Dr Rizoiu developed a four-step process to detect and react to online misinformation. The first was to build tools that could monitor online discussion spaces. The second was to employ advanced AI techniques to detect extremist ideology, foreign interference, and their corresponding narratives.
The third step was to use machine learning tools to forecast the effectiveness of these narratives, predicting how many users they will reach and how long they will persist. This allows those monitoring online media to prioritise which operations to counter.
Lastly, Dr Rizoiu developed mitigation techniques, both reactive (debunking) and proactive (‘prebunking’ or helping people recognise false claims before encountering them), to combat the spread of misinformation. He also developed a unique approach to detecting IO agents.
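The article describes this four-step pipeline only at a high level. The sketch below is a minimal illustration of how such a monitor, detect, forecast and prioritise flow could be wired together; the data model, the keyword-based “detector”, the branching-factor reach estimate and all thresholds are hypothetical placeholders, not Dr Rizoiu's actual system.

```python
from dataclasses import dataclass, field
from typing import List

# Step 1 (monitoring): hypothetical data model for a collected post.
@dataclass
class Post:
    author: str
    text: str
    reshares: int  # reshares observed so far


@dataclass
class Narrative:
    label: str
    posts: List[Post] = field(default_factory=list)


# Step 2 (detection): a real system would use trained AI/NLP models;
# a keyword match stands in here purely for illustration.
SUSPECT_KEYWORDS = {"hoax", "cover-up", "they don't want you to know"}


def detect_narratives(posts: List[Post]) -> List[Narrative]:
    flagged = [p for p in posts
               if any(k in p.text.lower() for k in SUSPECT_KEYWORDS)]
    return [Narrative("suspect-narrative", flagged)] if flagged else []


# Step 3 (forecasting): a toy proxy for predicted reach, assuming each
# reshare triggers further reshares at a constant rate -- a crude
# geometric-series stand-in for self-exciting cascade dynamics.
def forecast_reach(narrative: Narrative, branching_factor: float = 0.8) -> float:
    observed = sum(p.reshares for p in narrative.posts)
    return observed / (1.0 - branching_factor)


# Step 4 (prioritisation/mitigation): counter the narratives predicted
# to reach the most users first.
def prioritise(narratives: List[Narrative]) -> List[Narrative]:
    return sorted(narratives, key=forecast_reach, reverse=True)


if __name__ == "__main__":
    posts = [
        Post("acct1", "Vaccines are a hoax, share before it's deleted!", reshares=120),
        Post("acct2", "Local council opens a new library next week.", reshares=3),
    ]
    for n in prioritise(detect_narratives(posts)):
        print(n.label, "predicted reach:", round(forecast_reach(n)))
```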
“Given the sophisticated tactics employed by IO agents, I recognised that traditional methods of analysing messaging content were not sufficient. As a result, I shifted the paradigm and proposed a new methodology that investigates the reaction patterns of the social systems and target users.
“By studying the specific reactions elicited by IO agents, I could identify patterns indicative of inauthentic accounts, commonly known as sock-puppets or trolls.
“This approach proved to be a significant breakthrough, as it circumvented the language barriers and the diversified discussion topics employed by these actors. Rather than analysing content, the focus shifted to understanding the responses generated, thus enabling scalable detection of IO agents.”
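As an illustration only: the quotes above suggest scoring accounts on how the surrounding social system reacts to them rather than on what they post. A minimal sketch of that idea, with entirely hypothetical features and thresholds rather than the learned models a real system would use, might look like this.

```python
from statistics import mean, pstdev
from typing import Dict, List

# Hypothetical reaction log: for each account, the delays (in seconds)
# between its posts and the replies/reshares those posts received.
ReactionLog = Dict[str, List[float]]


def reaction_features(delays: List[float]) -> Dict[str, float]:
    """Content-free features describing how the audience reacts to an account."""
    return {
        "mean_delay": mean(delays),
        "delay_spread": pstdev(delays),
    }


def looks_inauthentic(delays: List[float],
                      max_mean_delay: float = 30.0,
                      max_spread: float = 5.0) -> bool:
    # Illustrative heuristic: reactions that are suspiciously fast AND
    # suspiciously uniform (e.g. coordinated accounts boosting each other)
    # raise a flag. Real detection would rely on learned models.
    f = reaction_features(delays)
    return f["mean_delay"] < max_mean_delay and f["delay_spread"] < max_spread


if __name__ == "__main__":
    logs: ReactionLog = {
        "organic_user": [40.0, 300.0, 12.0, 900.0, 75.0],
        "suspected_sockpuppet": [10.0, 11.0, 9.5, 10.2, 10.8],
    }
    for account, delays in logs.items():
        print(account, "flagged:", looks_inauthentic(delays))
```

Because the features describe reactions rather than content, the same scoring works regardless of the language or topic of the posts, which mirrors the scalability argument made in the quote.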
Dr Rizoiu’s work has contributed to shaping policy and regulation around mis- and disinformation, and he currently leads research projects valued at over $1.8 million from the Department of Home Affairs, Defence Science and Technology and Facebook.
The future direction of his research includes continuing to collaborate with stakeholders on practical tools and prototypes, as well as building a better theoretical understanding of how to ‘prebunk’ information operations and how to assist people who are drawn into misinformation narratives.