App uses AI to track caloric intake for cancer patients
Sylvester Comprehensive Cancer Center specialists, in collaboration with University of Miami innovators, have developed a new app that visually identifies and digitally catalogs food intake, a critical advancement for cancer patients whose caloric intake is often fundamental to their recovery.
One of the first questions cancer patients ask their doctors is what they can do—especially in terms of modifying diet—to support their recovery. Yet despite their eagerness to help, when asked to monitor their food intake, patients often struggle to remember details and fail to chart consistently.
The “digital assistant to estimate caloric intake” app, powered by artificial intelligence, holds tremendous promise to resolve this dilemma and improve patients’ recovery.
Dr. William Jin, a staff physician with the University of Miami Medical Group, and Dr. Brandon Mahal, a cancer specialist who serves as assistant director for community outreach with Sylvester, teamed with Max Cacchione, director of innovation with University of Miami Information Technology, and several student researchers during the past six months to develop the “Calorie Checker: An Image Analysis App for Caloric Estimation in Real Time.”
“This app can help us capture folks’ diets and assess which diets might be potentially healthy and protective against certain kinds of cancer, and others that might increase risk factors and tend toward adverse outcomes,” explained Mahal. “This has immense potential for our community and the patients we serve.”
Mahal credited Jin with generating the original idea.
“I’ve been super interested for a long time in understanding how diet and exercise—both modifiable behaviors—impact prostate cancer incidence and development and the way they respond to treatment,” said Jin, a radiation oncology specialist.
Yet as a researcher, he was less than satisfied with the existing methodologies.
“The way the research is normally done is that you ask for dietary recalls. Yet you and I know that it’s hard to remember what you ate yesterday or last week, let alone last month,” Jin said. “And the most common questions we would get from patients in our clinic were: ‘Does diet impact my recovery?’ ‘Is there anything I shouldn’t be eating?’ ”
Jin sensed that emerging technologies could improve and accelerate the process to quantify and qualify diet.
His supposition was motivated in large part by his initial medical research project years ago where he was tasked with counting the number of dots that showed up on a slide, which represented the number of axons, or nerve fibers, in a cell.
“My job was to do this for 40 hours a week, and the project would take a year,” Jin remembered. His roommate, a software engineer, offered to write a script to automate the process.
The assistance prompted him later, during his residency, to pursue a patent for an idea: capturing a camera view, whether in augmented or mixed reality, and sending the visual information to a processor that could then generate meaningful information.
In 2019, he learned of Cacchione’s involvement as director of Innovate, a group which supports and implements innovative technology initiatives across the University, and reached out via email as his idea began to take shape. The two soon began to collaborate.
Cacchione noted that, months into the development, the team has cleared its biggest hurdle: training the model on labeled food data.
“Training the data to identify food takes a long time and is very laborious,” he noted. Writing a few scripts helped, as did the participation of three Innovate students: one master’s student and two doctoral students.
Cacchione explained that the image-recognition data could be monetized using blockchain-based protocols, notably one called Ocean, which lets end users buy and sell AI training data directly via smart contracts, bypassing data brokers. In the future, such technologies could shorten this typically lengthy training-data step.
Initially the software was able to identify only a few items: sushi, a sandwich, and French fries. But by early April, nearly 4,000 images had been integrated into the knowledge base.
“Now we’re connecting it all to the U.S. Department of Agriculture database so that whenever you scan something, it knows exactly what’s in it from a protein, fat, and carbohydrate standpoint,” he explained.
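The step Cacchione describes, taking a recognized food and deriving its protein, fat, and carbohydrate breakdown from a nutrient database, can be sketched in a few lines. The food labels and per-100-gram nutrient values below are hypothetical placeholders standing in for the USDA data, not the app's actual figures; the 4/4/9 kcal-per-gram conversion factors for protein, carbohydrate, and fat, however, are the standard Atwater values used in nutrition labeling.

```python
# Sketch of the lookup-and-estimate step, assuming the image classifier
# has already returned a food label. Nutrient values per 100 g are
# illustrative placeholders, not real USDA figures.
NUTRIENT_TABLE = {
    "french fries": {"protein": 3.4, "fat": 15.0, "carbs": 41.0},
    "sushi":        {"protein": 7.0, "fat": 1.0,  "carbs": 30.0},
}

# Standard Atwater factors: kcal per gram of each macronutrient.
KCAL_PER_GRAM = {"protein": 4, "carbs": 4, "fat": 9}

def estimate_calories(label: str, grams: float) -> float:
    """Estimate calories for `grams` of the recognized food `label`."""
    per_100g = NUTRIENT_TABLE[label]
    scale = grams / 100.0
    return sum(per_100g[m] * KCAL_PER_GRAM[m] for m in per_100g) * scale

print(round(estimate_calories("french fries", 150)))
```

In the real app, `NUTRIENT_TABLE` would be replaced by a query against the USDA database, and the portion size would itself be estimated from the image rather than supplied by the user.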
While the technology brings advantages for all, children in particular will benefit.
“Everyone has a phone, even children, so they can use their phones or the app to capture their nutrition automatically and keep track of it,” Cacchione said. He added that the app is widely accessible: no expensive AR device is required, since it works on almost any cell phone.
“This is a good example of AI put to good use, saving people’s lives by helping in the remission of cancer,” Cacchione said. “Dr. Jin and Dr. Mahal are really pushing the train in the right direction.”
The Calorie Checker app is now in a prototype stage, and the team expects to advance to testing in clinical trials soon.