University of Maryland to Convey Big Data With Sound and Touch


From functioning effectively at work to keeping up with the news, modern society increasingly expects people to be not only data-literate but also data-savvy. Charts and infographics can make a sea of numbers tangible, yet they often present a barrier for blind users, and existing accessibility tools like screen readers cannot yet provide an overview of a large dataset or chart, let alone support in-depth searches.

Supported by $433,000 from the National Science Foundation, two University of Maryland researchers are turning to sound and touch to help people analyze large-scale data.

“The fundamental impact that this work will have on the millions of people in the United States and around the world who are blind or have low vision cannot be overstated,” said Niklas Elmqvist, who is leading the two-year project with fellow College of Information Studies Professor Jonathan Lazar. Elmqvist, who also has an appointment in the University of Maryland Institute for Advanced Computer Studies, added that the research will have a broad impact on the general population as well, considering that most people experience vision loss as they age.

As the role of big data continues to grow in scale and significance, the accessibility gap could widen unless researchers intervene, they said. The pandemic highlighted this urgent need: one study showed that half of blind users rely on help from sighted people to access vital data about COVID-19.

The UMD team will work with the blindness community and technology organizations to assess popular accessibility tools and the contexts in which they are used, conducting the research through the lens of two real-world settings that depend on large datasets: higher education and employment.

Their overarching goal is to develop high-bandwidth data representations based on sound, touch, and physical computing that will enable blind users to view, analyze, and understand large datasets as easily as sighted users.

The underlying approach to this work is called “sensory substitution”: using assistive technology to functionally replace one sense with another, said Lazar, who directs UMD’s Trace Research and Development Center, which works to improve the accessibility of technology.

For example, Lazar and other UMD faculty are known for iSonic, a tool they built that creates an audio version of a map: the user sweeps from left to right and hears varying pitches that represent the underlying data.
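To make the sonification principle concrete, here is a minimal Python sketch, not iSonic’s actual implementation, that linearly maps data values to audible pitches and renders them as a left-to-right sweep of tones written to a WAV file. The pitch range, tone duration, and sample data are illustrative assumptions.

```python
import math
import struct
import wave

SAMPLE_RATE = 44100  # samples per second

def value_to_frequency(value, vmin, vmax, fmin=220.0, fmax=880.0):
    """Linearly map a data value onto an audible pitch range (Hz).
    The 220-880 Hz range is an illustrative choice, not iSonic's."""
    t = (value - vmin) / (vmax - vmin)
    return fmin + t * (fmax - fmin)

def sonify(values, duration=0.25, path="sweep.wav"):
    """Render a left-to-right 'sweep' of the data: one short sine tone
    per value, with higher values producing higher pitches."""
    vmin, vmax = min(values), max(values)
    samples = []
    for v in values:
        freq = value_to_frequency(v, vmin, vmax)
        for i in range(int(SAMPLE_RATE * duration)):
            # Plain sine tone; real tools also shape the amplitude
            # envelope to avoid audible clicks between tones.
            samples.append(0.5 * math.sin(2 * math.pi * freq * i / SAMPLE_RATE))
    with wave.open(path, "w") as f:
        f.setnchannels(1)            # mono
        f.setsampwidth(2)            # 16-bit samples
        f.setframerate(SAMPLE_RATE)
        f.writeframes(b"".join(
            struct.pack("<h", int(s * 32767)) for s in samples))

# Hypothetical example: sonify values along a west-to-east slice of a map.
sonify([12, 45, 80, 63, 30, 95, 70])
```

Listening to the resulting file, rising and falling pitch traces the shape of the data much as a line chart does visually, which is the core idea behind audio sweeps of maps and charts.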

Elmqvist also has extensive experience in this area: he helped lead the development of an essential oil diffuser that conveys data through smell, and he is currently working with the Maryland Department of Education to create a computer science data course that is accessible to blind high school students.

As Elmqvist discussed in a community TEDx talk at Montgomery Blair High School in Silver Spring, Md., many current sensory substitution techniques are not scalable because they take too much time to create, cost too much, or are not widely available.

Therefore, the iSchool professors say a fundamental goal will be designing inexpensive solutions that are compatible with blind users’ existing software and equipment.