Study Aims to Eliminate Gender Bias in Future AI Healthcare Systems
Researchers are setting out to help ensure that the artificial intelligence behind the healthcare monitoring systems of the future is capable of providing the best possible care for both men and women.
Recent advances in radar sensing technology could underpin a new generation of vital sign monitoring, experts say.
A number of AI-enhanced vital sign monitoring systems, including the University of Glasgow’s £5.5m Healthcare QUEST, are currently in development.
The projects are exploring the potential of cutting-edge sensors to keep track of the delicate rhythms of patients’ hearts and lungs without requiring them to wear monitoring devices or be tracked on video cameras.
These less-invasive vital sign monitoring systems will be supported by artificial intelligence technology. The AI will spot the signs of an unexpected change in heart rate or respiration. If it decides medical intervention is required, it can send an alert for help.
The technology could help vulnerable groups like older people live more independently at home or in assisted accommodation. It could provide additional insight into the wellbeing of patients staying in hospital wards.
A critically important consideration for any future radar-based health monitoring system is ensuring that its artificial intelligence component is properly trained and equally capable of making correct judgements for male and female patients alike, without bias towards either gender.
A team from the University of Glasgow’s James Watt School of Engineering have won new funding for a project that will examine the potential for gender bias in healthcare AI and find ways to ensure that AI-supported treatment remains equitable.
Dr Nour Ghadban, a research fellow in electronic and nanoscale engineering at the University of Glasgow, is the project’s principal investigator. She said: “New sensors linked with artificial intelligence could offer potentially transformational opportunities to improve the way that we monitor patient wellbeing.
“However, we can only reap those benefits if we can be sure that the AI systems we use to achieve them are up to the task. We know that all kinds of human bias across race, class, gender and more can be unwittingly incorporated into AI decision-making tools if the proper care isn’t taken when they are being trained on real-world data.
“It’s vitally important that we try to tackle these potential issues as early as possible to ensure that patient safety can be guaranteed, and male and female patients will receive the same high quality of care.”
Over the course of the next 18 months, the research team will work to develop a new framework to balance gender-related behaviour in an AI monitoring system.
The research team will collect healthcare data from 30 male and 30 female study volunteers using radar sensors. The data will be used to train a newly developed AI architecture, which will analyse the results of the radar monitoring.
Separate models will be trained on the male and female data, and their performance compared to highlight any biases, which can then be adjusted for using statistical models and mitigation techniques.
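The bias check described above can be illustrated with a minimal sketch. The code below uses synthetic prediction scores, not the project's actual radar data or AI architecture; the score shift, thresholds and group sizes are all assumptions chosen purely to show how a per-group accuracy gap can be measured and then reduced with a simple per-group threshold adjustment, one of many possible mitigation techniques.

```python
import random

random.seed(0)

def simulate_scores(n, shift):
    # Synthetic (label, score) pairs: scores for positive cases sit above
    # scores for negative cases, offset by a group-specific shift to mimic
    # a model that is better calibrated for one group than the other.
    data = []
    for _ in range(n):
        label = random.random() < 0.5
        base = 0.7 if label else 0.3
        score = min(1.0, max(0.0, random.gauss(base + shift, 0.15)))
        data.append((label, score))
    return data

def accuracy(data, threshold):
    # Fraction of cases where thresholding the score matches the label.
    correct = sum(1 for label, score in data if (score >= threshold) == label)
    return correct / len(data)

# Hypothetical per-group data: one group's scores are shifted downwards,
# standing in for a model trained mostly on the other group's data.
male = simulate_scores(500, 0.0)
female = simulate_scores(500, -0.15)

# Step 1: evaluate both groups with a single shared threshold.
# The accuracy gap exposes the bias.
shared = 0.5
gap = accuracy(male, shared) - accuracy(female, shared)

# Step 2: a simple mitigation — pick a per-group threshold that
# maximises each group's own accuracy, then re-measure the gap.
def best_threshold(data):
    return max((t / 100 for t in range(1, 100)),
               key=lambda t: accuracy(data, t))

gap_adjusted = (accuracy(male, best_threshold(male))
                - accuracy(female, best_threshold(female)))

print(f"accuracy gap, shared threshold:    {gap:.3f}")
print(f"accuracy gap, per-group threshold: {gap_adjusted:.3f}")
```

In a real deployment the comparison would be run over clinically meaningful metrics (missed alerts, false alarms) rather than raw accuracy, but the structure is the same: disaggregate the evaluation by group, quantify the gap, and verify that the chosen mitigation narrows it.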
Dr Julien Le Kernec, from the James Watt School of Engineering, is the project’s supervisor. He said: “By the end of our research, we expect to have created a robust demonstration of how gender balance can be embedded in the heart of any future AI healthcare monitoring systems.
“We hope the outputs from this project will help guide the future development of this very promising technology and ensure that the AI which supports it is fair and balanced for the many diverse groups of patients who will benefit from it.”
The project is supported by €9,500 (£8,200) in new funding from the Women and Science Chair at Université Paris Dauphine-PSL, supported by the L’Oréal Foundation, Generali France, La Poste, Amundi and the Talan Group.
Dr Ghadban, originally from Syria, joined the University of Glasgow’s James Watt School of Engineering in August 2022 with the support of the Cara Fellowship Programme. Cara (the Council for At-Risk Academics) provides urgently needed help to academics in immediate danger.
The team leading the research are Dr Nour Ghadban, Mostafa Elsayed, Prof Jonathan Cooper and Dr Julien Le Kernec.