University of Washington: A smartphone’s camera and flash could help people measure blood oxygen levels at home
First, pause and take a deep breath.
When we breathe in, our lungs fill with oxygen, which is distributed to our red blood cells for transportation throughout our bodies. Our bodies need a lot of oxygen to function, and healthy people maintain at least 95% oxygen saturation at all times.
Conditions like asthma or COVID-19 make it harder for bodies to absorb oxygen from the lungs. This leads to oxygen saturation percentages that drop to 90% or below, an indication that medical attention is needed.
In a clinic, doctors monitor oxygen saturation using pulse oximeters — those clips you put over your fingertip or ear. But monitoring oxygen saturation at home multiple times a day could help patients keep an eye on COVID symptoms, for example.
In a proof-of-principle study, University of Washington and University of California San Diego researchers have shown that smartphones are capable of detecting blood oxygen saturation levels down to 70%. This is the lowest value that pulse oximeters should be able to measure, as recommended by the U.S. Food and Drug Administration.
The technique involves participants placing their finger over the camera and flash of a smartphone, which uses a deep-learning algorithm to decipher the blood oxygen levels. When the team delivered a controlled mixture of nitrogen and oxygen to six subjects to artificially bring their blood oxygen levels down, the smartphone correctly predicted whether the subject had low blood oxygen levels 80% of the time.
The team published these results Sept. 19 in npj Digital Medicine.
“Other smartphone apps that do this were developed by asking people to hold their breath. But people get very uncomfortable and have to breathe after a minute or so, and that’s before their blood-oxygen levels have gone down far enough to represent the full range of clinically relevant data,” said co-lead author Jason Hoffman, a UW doctoral student in the Paul G. Allen School of Computer Science & Engineering. “With our test, we’re able to gather 15 minutes of data from each subject. Our data shows that smartphones could work well right in the critical threshold range.”
One way to measure oxygen saturation is to use pulse oximeters — those little clips you put over your fingertip (some shown here in gray and blue). Dennis Wise/University of Washington
Another benefit of measuring blood oxygen levels on a smartphone is that almost everyone has one.
“This way you could have multiple measurements with your own device at either no cost or low cost,” said co-author Dr. Matthew Thompson, professor of family medicine in the UW School of Medicine. “In an ideal world, this information could be seamlessly transmitted to a doctor’s office. This would be really beneficial for telemedicine appointments or for triage nurses to be able to quickly determine whether patients need to go to the emergency department or if they can continue to rest at home and make an appointment with their primary care provider later.”
The team recruited six participants ranging in age from 20 to 34. Three identified as female, three identified as male. One participant identified as African American, while the rest identified as Caucasian.
To gather data to train and test the algorithm, the researchers had each participant wear a standard pulse oximeter on one finger and then place another finger on the same hand over a smartphone’s camera and flash. Each participant wore this same setup on both hands simultaneously.
“The camera is recording a video: Every time your heart beats, fresh blood flows through the part illuminated by the flash,” said senior author Edward Wang, who started this project as a UW doctoral student studying electrical and computer engineering and is now an assistant professor at UC San Diego’s Design Lab and the Department of Electrical and Computer Engineering.
“The camera records how much that blood absorbs the light from the flash in each of the three color channels it measures: red, green and blue,” said Wang, who also directs the UC San Diego DigiHealth Lab. “Then we can feed those intensity measurements into our deep-learning model.”
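To make the pipeline Wang describes concrete, here is a minimal sketch (not the authors' code) of that kind of processing: average each video frame's red, green and blue intensity into a time series, then hand a window of that series to a small neural network that regresses oxygen saturation. The file name, window length and network architecture are illustrative assumptions, written here with OpenCV and PyTorch.

```python
# Illustrative sketch (not the study's implementation): turn a fingertip video
# into per-frame RGB intensity averages and regress SpO2 from a window of them.
import cv2
import numpy as np
import torch
import torch.nn as nn

def rgb_intensity_series(video_path):
    """Average red/green/blue intensity of every frame in the video."""
    cap = cv2.VideoCapture(video_path)
    series = []
    while True:
        ok, frame = cap.read()                 # frame is H x W x 3, BGR order
        if not ok:
            break
        b, g, r = frame.reshape(-1, 3).mean(axis=0)
        series.append([r, g, b])
    cap.release()
    return np.asarray(series, dtype=np.float32)  # shape: (n_frames, 3)

class SpO2Regressor(nn.Module):
    """Small 1D CNN mapping a window of RGB intensities to one SpO2 value."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(3, 16, kernel_size=7, padding=3), nn.ReLU(),
            nn.Conv1d(16, 32, kernel_size=7, padding=3), nn.ReLU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
            nn.Linear(32, 1),
        )

    def forward(self, x):                      # x: (batch, window, 3)
        return self.net(x.transpose(1, 2)).squeeze(-1)

# Usage: predict SpO2 for one ~10-second window (300 frames at 30 fps).
# "finger_video.mp4" is a hypothetical recording with at least 300 frames.
series = rgb_intensity_series("finger_video.mp4")
window = torch.from_numpy(series[:300]).unsqueeze(0)   # (1, 300, 3)
model = SpO2Regressor()
print(model(window))   # untrained here; would be fit to pulse-oximeter labels
```

The study's actual preprocessing and model are more involved; the point of the sketch is only that the network's input is three per-frame intensity channels rather than raw video.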
Each participant breathed in a controlled mixture of oxygen and nitrogen to slowly reduce oxygen levels. The process took about 15 minutes. For all six participants, the team acquired more than 10,000 blood oxygen level readings between 61% and 100%.
The researchers used data from four of the participants to train a deep learning algorithm to pull out the blood oxygen levels. The remainder of the data was used to validate the method and then test it to see how well it performed on new subjects.
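As an illustration of that evaluation setup, the sketch below shows one way to split readings by subject so that validation and test data come from people the model never saw during training. The subject IDs, array shapes and group assignments are hypothetical and are not the study's exact protocol.

```python
# Illustrative sketch: split readings by subject so evaluation uses
# people absent from training. All data here is synthetic stand-in data.
import numpy as np

rng = np.random.default_rng(0)
n = 600
ids = rng.integers(1, 7, size=n)     # hypothetical subject IDs 1..6
X = rng.random((n, 3))               # stand-in per-reading RGB features

def split_by_subject(subject_ids, train_subjects, val_subjects):
    """Boolean masks selecting training, validation and test rows by subject."""
    subject_ids = np.asarray(subject_ids)
    train = np.isin(subject_ids, train_subjects)
    val = np.isin(subject_ids, val_subjects)
    test = ~(train | val)
    return train, val, test

# e.g. train on subjects 1-4, validate on subject 5, test on subject 6
train, val, test = split_by_subject(ids, train_subjects=[1, 2, 3, 4],
                                    val_subjects=[5])
print(X[train].shape, X[val].shape, X[test].shape)
```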
“Smartphone light can get scattered by all these other components in your finger, which means there’s a lot of noise in the data that we’re looking at,” said co-lead author Varun Viswanath, a UW alumnus who is now a doctoral student advised by Wang at UC San Diego. “Deep learning is a really helpful technique here because it can see these really complex and nuanced features and helps you find patterns that you wouldn’t otherwise be able to see.”
The team hopes to continue this research by testing the algorithm on more people.
“One of our subjects had thick calluses on their fingers, which made it harder for our algorithm to accurately determine their blood oxygen levels,” Hoffman said. “If we were to expand this study to more subjects, we would likely see more people with calluses and more people with different skin tones. Then we could potentially have an algorithm with enough complexity to be able to better model all these differences.”
But, the researchers said, this is a good first step toward developing biomedical devices that are aided by machine learning.
“It’s so important to do a study like this,” Wang said. “Traditional medical devices go through rigorous testing. But computer science research is still just starting to dig its teeth into using machine learning for biomedical device development and we’re all still learning. By forcing ourselves to be rigorous, we’re forcing ourselves to learn how to do things right.”