Rice Students Design Augmented Reality System for NASA Spacewalks

As the NASA-led Artemis program prepares to send a crewed flight around the Moon, innovators are hard at work developing the technologies needed for successful exploration and science missions. That includes a student team from Rice University’s augmented reality (AR) and virtual reality club.

The team participated in the NASA Spacesuit User Interface Technologies for Students (NASA SUITS) program, a design challenge that invites college students from across the country to help produce user interface solutions that could be part of future spacesuits. This year’s challenge was to create an AR display for astronauts on lunar and Martian spacewalks, along with a web browser user interface for local mission control.

“As humanity pushes further into space, it’s essential that crew members on spacewalks be equipped with the appropriate new technologies necessary for the elevated demands of surface exploration on the Moon and Mars,” said senior Michelle Zheng, lead team member who is majoring in electrical and computer engineering.

Enhancing spacesuits

After the team submitted its concept to the program in fall 2023, Rice was one of only 11 teams selected to develop a prototype. The Rice team tested its device in a mock extravehicular activity (EVA) scenario May 19-23 at Lyndon B. Johnson Space Center, showcasing its work to NASA as well as university and industry partners.

During EVAs, or spacewalks, which are performed outside the spacecraft, astronauts wear spacesuits and exit the lander through an airlock to carry out tasks such as exploring surface terrain, conducting research and interacting with various payloads and lunar assets, including life support systems, rovers and habitats.

For future spacewalks, the team designed an AR display to streamline these routine tasks, Zheng said. “Our design also aims to be a memory aid to reduce the amount of information that the astronauts must remember and reduce overall errors during the EVA process,” she added.

Communicating with mission control

Currently, NASA’s Mission Control Center in Houston communicates with astronauts via a voice loop. But future missions to Mars will face communication delays of up to 20 minutes, necessitating greater crew autonomy.

The team’s AR system prioritizes usability and just-in-time instruction to enhance operational effectiveness, paired with a more efficient web browser interface for mission control. Key features include an intuitive user interface, real-time instructions that ease cognitive load and a dual-display system for improved task management.

Part of revolutionizing spaceflights

Although no prizes are awarded at the Johnson Space Center event, Rice’s team took home something more, said Philip Kortum, faculty adviser and professor of psychological sciences. “By participating in the NASA SUITS challenge, we got to play a small part in working toward revolutionizing the human spaceflight experience,” he said.

The Rice team also includes Jasmine Manansala, master’s student in computer science; Shrreya Aagarwal and Melissa Cloutier, doctoral students in psychological sciences; Benjamin Rubin, junior majoring in computer science and mathematics; Yining Zhang, doctoral student in computer science and psychological sciences; Mert Culcu, sophomore majoring in mechanical engineering; and junior Daniel Kuo, sophomore Chloe Park and freshman Justin Lee, who are majoring in computer science.

Their related research paper, “Augmented Reality in Extravehicular Activities: Optimizing Alert Detection and Cognitive Workload,” has been accepted for presentation at the Human Factors and Ergonomics Society International Annual Meeting Sept. 9-13 in Phoenix.