The Relationship Between ChatGPT and Rehabilitation: A Subject Requiring Further Investigation
Whether it’s analyzing medical images, detecting drug interactions or creating brain-computer interfaces, the potential applications of artificial intelligence (AI) in healthcare seem endless.
Could an AI chatbot like ChatGPT be a useful tool for healthcare professionals?
More specifically, could ChatGPT’s new features, which let it interpret images and converse fluently, make it useful in rehab?
Already part of our lives
These technologies are already part of our lives and could help people overcome various disabilities in several ways, according to Dahlia Kairy and Joseph Omer Dyer, physiotherapy professors at Université de Montréal’s School of Rehabilitation.
They believe that ChatGPT’s voice-command feature could make it easier for people with mobility or sensory impairments to communicate and access information. Individuals recovering from a brain injury or stroke could also use these technologies’ vocal and linguistic abilities to have a conversation.
According to Dyer, ChatGPT could be used to plan schedules, book appointments and figure out when different kinds of medication should be taken. Similarly, Kairy thinks ChatGPT could suggest exercises suited to each patient’s unique needs (socioeconomic situation, age, and physical, mental and neurological condition) and explain how and why they should be performed.
“In short, patients could use ChatGPT to free themselves from some of the restrictions imposed by their condition, as well as to find information they need,” said Dyer. “But anyone who uses this software needs to think critically, in addition to understanding the nitty-gritty of how it works and any risks it may pose.”
Need to be cautious
Both professors agree that we still need to be cautious and make sure ChatGPT doesn’t have any unexpected repercussions. “We’re flying blind,” said Dyer. “We don’t have any conclusive data to show that ChatGPT can help or at least prove that it’s harmless,” added Kairy.
As a physiotherapist and researcher at UdeM’s Groupe interdisciplinaire de recherche sur la cognition et raisonnement professionnel, Dyer is concerned about the ethical and professional issues raised by ChatGPT.
“If I share personal information with it to make a diagnosis or a decision, am I violating my patient’s confidentiality?” he asked. “And if I suggest a patient use it to overcome their disability, I don’t have any scientific data that confirms it will actually be useful. I also don’t know what the side effects could be.”
Kairy, a researcher with the Centre for Interdisciplinary Research in Rehabilitation of Greater Montreal (CRIR), wonders whether ChatGPT could keep patients from progressing with their rehab because they would no longer be forced to gradually develop their physical skills.
“Since the AI ultimately ends up doing certain things for us, we could permanently lose the ability to do them ourselves,” suggested Kairy, who’s also an expert in telerehabilitation. “We still don’t have any evidence, but we have reasonable grounds to believe that, at least neurologically speaking, it really is a question of use it or lose it.”
The professors also warned that using ChatGPT for rehab could lead to issues such as digital addiction or social isolation.
More research needed
Before using ChatGPT in a clinical setting, Kairy and Dyer want to be sure that regulations are in place to protect privacy and ensure that the software is used ethically. They also want scientific evidence that it is in fact useful.
They’re counting on universities to research these questions, dig deeper and eventually answer them. And they hope colleagues from other fields will work on the issue.
“As academics, our role is to put our tools and resources to good use in our respective fields to move science and society forward,” said Dyer.