“Alexa and those sorts of tools are often presented as the ideal speech interaction but that’s a very limiting view,” says Axtell, who works with Cosmin Munteanu in U of T’s Technologies for Ageing Gracefully lab.
“I wanted to look at what we are not getting, and thought, ‘You know who’s been talking to computers since the ’80s? Star Trek.’”
Axtell points to automatic doors at supermarkets, flip phones and touchscreen tablets as examples of real-world tech inspired by the popular sci-fi television show.
“(Star Trek) is what we imagined as our ideal speech interaction,” says Axtell, who watched both Star Trek: The Next Generation and the original 1960s series in syndication with her family growing up.
“So, we should at least investigate it as a starting point to think about where we are going next.”
The study is an offshoot of Axtell’s master’s thesis, which examined how seniors interact with digital space. As digital assistants like Alexa (Amazon), Siri (Apple), Cortana (Microsoft) and Bixby (Samsung) become more common, Axtell says there is a hope that such technologies can be made more accessible to those who are less computer literate, including older adults.
To supervise her research, Axtell turned to her lab co-director Munteanu – a fellow sci-fi fan.
“We are the Trekkies in the lab,” says Munteanu, an assistant professor at U of T Mississauga’s Institute for Communication, Culture, Information and Technology whose area of expertise is human-computer interaction. “When Bennett floated this idea, I said, ‘Sure, let’s do this.’”
It wasn’t an excuse to watch TV, however. Instead, Axtell chose to work from transcripts of the show, reading 69,355 lines of dialogue in all.
Although the original 1960s Star Trek series also shows characters interacting with computers using voice commands, the researchers decided to focus on Star Trek: The Next Generation, which aired from 1987 to 1994, just as home computers and the internet were becoming mainstream.
Axtell divided the 1,372 individual exchanges between the show’s characters and the ship’s computers (including those operating the turbolift, holodeck and replicators) into seven categories: command, question, statement, password, “wake-up” words, comments and conversation.
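For illustration only, the sorting step described above could be sketched as a simple rule-based tagger. The keyword rules and example utterances below are assumptions made up for this sketch, not the coding scheme the researchers actually used:

```python
# Hypothetical sketch of sorting transcript lines into the study's seven
# categories (command, question, statement, password, wake-up word,
# comment, conversation). The keyword heuristics are illustrative
# assumptions, not the researchers' actual annotation method.

def categorize(utterance: str) -> str:
    """Assign a dialogue line to one interaction category via keyword rules."""
    text = utterance.strip().lower()
    if text.startswith("computer"):
        return "wake-up word"      # addressed to the computer by name
    if text.endswith("?"):
        return "question"
    if "authorization" in text or "access code" in text:
        return "password"
    if text.split()[0] in {"initiate", "run", "locate", "set", "deck"}:
        return "command"           # brief imperative, e.g. "Deck Five"
    return "statement"             # fallback; comments and conversation
                                   # would need richer context to detect

exchanges = [
    "Computer, locate Commander Data.",
    "Where is Captain Picard?",
    "Authorization: Picard four-seven-alpha-tango.",
]
print([categorize(u) for u in exchanges])
```

A real coding pass would, of course, rely on human judgment rather than keyword matching, but the sketch shows how brief, targeted utterances lend themselves to categorical sorting.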
The variety of interactions yielded a dataset similar to one that could be captured from in-the-wild conversations, according to Axtell.
“It’s such a huge series, over 100 episodes, 45 minutes each,” she says. “If we had just looked at one movie, it would be just [asking computers to do] really interesting things. With Star Trek, you also get mundane, practical things – like making a cup of tea.”
The data revealed that 95 per cent of interactions with the computer are brief and functional, not conversational.
“One of the big pushes in interaction like Alexa and Google Home is to get it as close to human conversation as possible, but maybe that’s not what we really want,” Axtell says. “People on the Enterprise aren’t having conversations with the computer. It’s very targeted interactions: They say ‘Deck Five’ in the turbolift – and they don’t say ‘thank you’ or ‘please.’
“It’s not an equal back and forth. They just get on with it.”
Axtell also compared how Star Trek characters and Alexa users interact with technology.
“They lined up amazingly well,” she says. “Entertainment is big for both – playing music, using the holodeck or VR – or smart home things like making tea, starting your car or adjusting your thermostat. What the (Enterprise computer) can do that we can’t is heavy lifting analysis, like scanning an alien life form. We aren’t there yet.”
That’s the downside to growing up seeing the utopian technology of Star Trek, Munteanu says.
“There’s an expectation that interfaces will work beyond their capabilities,” he says. “It gives us the dream of what is possible, but also a little disappointment that we aren’t there yet because even if someone is not a Trekkie, [the Star Trek influence is] still in the media exposure of what can be done, and what we are inspired to do when we work in these spaces.”
Axtell says this study was a high-level look at how Star Trek uses its voice user interface (VUI), but there are plenty of other facets to explore in the dataset, so the researchers are making it available online for others to examine.
In May, Axtell and Munteanu will virtually present their research at the Association for Computing Machinery’s CHI 2021 conference.