University of the Free State contributes to elections through social listening for health impact

As we approach the exercise of our right to vote on 29 May 2024, the role of social media in elections cannot be overstated: it not only amplifies voices but also influences perceptions, sways opinions, and mobilises movements. Each platform offers a unique ecosystem – from Twitter (X) to Facebook, Instagram, WhatsApp, and TikTok.

The University of the Free State (UFS) Interdisciplinary Centre for Digital Futures (ICDF) will join other research groups – such as the Data Science for Social Impact Research Group (DSFSI), the Council for Scientific and Industrial Research (CSIR), Research ICT Africa, and the NLP Research Group – in diligently monitoring various issues to address the authenticity of information and misinformation on different digital platforms.

“The ICDF will be focusing on social media and health analysis. To disseminate the insights from our findings, we have created the Social Listening, Health Intelligence, and Pandemic Economics (SLHIPE) Misinformation Portal. The SLHIPE (also pronounced ‘Sleep’) portal categorises health misinformation according to its level of harm. Included in this portal is health misinformation that can potentially impact people’s health-seeking and information-seeking behaviour, as well as valuable insight into the public perception of health concepts and concerns such as the National Health Insurance (NHI) and infectious diseases,” said Prof Katinka de Wet, Interim Co-Director of the ICDF.

Impact on various organisations and departments

Prof De Wet said that the data gathered through the portal follows the social listening methodology, which is benchmarked against the World Health Organisation’s (WHO) guidelines featured in the national Risk Communication and Community Engagement (RCCE) framework, as well as other public health communication structures in the provincial and national Departments of Health.

“This data enables postgraduate studies and academic insight into social media posts that may cause potential harm to health, health-seeking behaviour, and information-seeking behaviour. Once the content is identified and captured in the database, this data is shared among the research group to understand the broader context of misinformation. Potential harm is then classified and escalated to the relevant authorities (such as Africawatch, Africa Check, and Real411, among others) within the research group. Once the content is shared, the necessary procedures can be followed to mitigate the potential harm.”

The collaborative effort between research groups from academia and the public sector underscores a commitment to leveraging digital tools to improve societal discourse and democratic practices – an integral facet of the UFS’ Vision 130, which aims to promote impact across diverse communities, this time extending to the nation.