Technical University of Denmark: Eye movements and machine learning reveal signs of illness
The ability to make eye contact with other people is an important part of our social interaction with each other. Among other things, we read joy, trust, and interest through eye contact, but we also look at the nose, mouth, and the other elements of the face in a dynamic process.
Researchers have found that people with autism and other neurological or psychiatric disorders tend to look at faces in a slightly different way, and for these people it can be difficult to maintain eye contact. These differences can be measured using sensors known as eye-trackers. Researchers from DTU have found a new and – in their words – more general way to examine data from eye-trackers.
“Our data-driven method is based on machine learning and can contribute to the diagnosis of patients and to determining whether a treatment or – further down the road – newly developed medicine has the desired effect,” says Paolo Masulli, a former postdoc at DTU Compute and now employed by iMotions, who has been involved in developing software for the analysis and modelling of the biometric data.
The research result is part of a recently completed project supported by the Innovation Fund Denmark, and it was published a few weeks ago in the international neuroscience journal Cortex.
Model analyzes heatmap
When patients are examined for neurological or psychiatric disorders using eye-tracking, researchers often present the patients with a series of photos or videos of faces on a screen. The eye-tracking sensor then records the patient’s eye movements across specific areas of the images. The results can be visualized as a so-called heatmap: the amount of time a patient spends looking at a given area of the image determines the colour of the heatmap at that point.
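The basic idea of a gaze heatmap can be sketched in a few lines of code. The sketch below is illustrative only, not the study’s actual pipeline: it assumes fixation data arrives as (x, y, duration) triples and deposits a duration-weighted Gaussian blob for each fixation, so regions viewed longer accumulate higher values. The function name and grid size are made up for the example.

```python
import numpy as np

def gaze_heatmap(fixations, width=64, height=64, sigma=3.0):
    """Accumulate gaze fixations (x, y, duration) into a smoothed heatmap.

    Each fixation deposits a Gaussian blob weighted by its duration,
    so areas the viewer looked at longer end up with higher values.
    """
    heat = np.zeros((height, width))
    ys, xs = np.mgrid[0:height, 0:width]
    for x, y, duration in fixations:
        heat += duration * np.exp(-((xs - x) ** 2 + (ys - y) ** 2)
                                  / (2 * sigma ** 2))
    # Normalize so heatmaps are comparable between patients
    return heat / heat.sum()

# Example: a viewer who dwells on the left-eye region, with brief
# glances elsewhere (coordinates are arbitrary grid positions)
fixations = [(20, 25, 1.2), (44, 25, 0.3), (32, 45, 0.2)]
hm = gaze_heatmap(fixations)
```

Rendered with a colour scale, such a grid is exactly the heatmap described above: the long fixation near (20, 25) produces the hottest region.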
“Researchers typically define and frame which areas are of interest. However, there is no exact science that defines the size and location of these fields in the images. There is a lot of subjectivity in it. Our method is different. We do not define specific areas in advance, but let the data speak and our method thereby provides us with a more objective assessment of the patient’s eye movements,” says Paolo Masulli.
DTU has had access to eye-tracking data from 111 outpatient psychiatric patients, recorded by the project’s Swedish university partner, the Gillberg Neuropsychiatry Center. The patients, aged 18 to 25, showed symptoms of autism, depression, or ADHD, and volunteered to participate in the research project and make their anonymized data available for research.
In the trial, patients first completed standard clinical tests, which place them on numerical scales according to the severity of their symptoms. They were then presented with a series of black-and-white images in which the person in the image expresses joy or anger, or looks neutral, while the eye-tracker collected data across the entire image. For example, one patient might spend more time looking at the left eye, while another spends time looking all around the face. This resulted in a heatmap for each patient.
Subsequently, all the heatmaps were analyzed using machine learning: the most important components (points on the face) were identified statistically from the entire data set, without preselecting data from specific areas as the conventional method does.
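A common way to extract such data-driven components is principal component analysis, which finds the directions in “heatmap space” that explain the most variance across patients. The sketch below illustrates that general technique, not the authors’ exact method, and uses random numbers as a stand-in for the real heatmap data.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 111 patients, each a flattened 64x64 heatmap.
# (The real study used recorded eye-tracking heatmaps; these are random.)
n_patients, n_pixels = 111, 64 * 64
heatmaps = rng.random((n_patients, n_pixels))

# Center the data and run an SVD. The rows of Vt are the principal
# components: gaze patterns learned from the data itself, rather than
# from hand-drawn regions of interest on the face.
centered = heatmaps - heatmaps.mean(axis=0)
U, S, Vt = np.linalg.svd(centered, full_matrices=False)

components = Vt[:5]               # top 5 data-driven gaze patterns
scores = centered @ components.T  # each patient's loading on each pattern
```

Each patient is thereby summarized by a handful of scores, which can then be compared against the numerical symptom scales from the clinical tests.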