Eindhoven University of Technology: MachyTech enables humans and robots to work safely together
Autonomous robots. They are appearing in more and more places, for example in the sorting centers of companies. They lift objects out of boxes, move them around and put them down again. The robots take on the heavy, sometimes monotonous tasks, so that scarce labor can be used elsewhere. Convenient and efficient. But robots do not communicate with their surroundings. Will they turn left, turn right, or is something wrong? The MachyTech student team wants to fix that problem with the help of deep learning, projections, LiDAR systems and voice recognition, among other things.
Human aspects
“Our goal is to enable robots and humans to work together safely,” says Martijn Stoorvogel, co-founder of MachyTech. “Robots can take a lot of work off your hands. But there’s also something slightly scary about that. We want to combine that technical world with the human aspects.”
He started the student team together with Timo Thans. The two are friends and have known each other since birth. Their fathers both studied physics at the Eindhoven University of Technology (TU/e), became friends and kept in touch after their studies. Like their fathers, Stoorvogel and Thans opted for technical degrees: Stoorvogel chose biomedical engineering at TU/e, Thans electrical engineering at TU Delft. Both have since finished their bachelor’s degrees. Stoorvogel took a gap year and Thans is working on his master’s.
It is actually a project that grew well beyond its original scope, says Stoorvogel. “Timo had taken on an assignment from the company Prime Vision as a final project for his bachelor’s degree. This company builds autonomous robots, and Timo was tasked with researching how robots can communicate with their surroundings.” As soon as he started working on the assignment, Thans noticed that it involved more than just research, says Stoorvogel. “He wanted to take it further and asked me to join him.”
America
They set up a student team, MachyTech, and joined TU/e innovation Space. “We could use some help. That’s how our project got started.” At innovation Space, Stoorvogel and Thans took a course to draw up their business plan. They also wanted to get in touch with other students there so they could grow as a team. However, there was no time for any of that just then: the two were given the opportunity to join Prime Vision for three months in the United States.
Prime Vision has several sorting centers in America as customers. Stoorvogel and Thans went along to install the robots, but also to test their ideas and their prototype. The trip gave them a clear picture of how things work in practice.
“Those robots sort the packages, they drive around and know what needs to go where. The only problem is that those robots drive around, but they don’t let you know if they’re going to go left or right or stop. It was in America that our project really got off the ground.”
Room service
Prime Vision’s robots are currently driving around mainly in sorting centers. Stoorvogel expects that robots will play a broader role within society in the future. “For example, in a hotel to bring the towels to the hotel rooms, or to replace room service. Then it’s nice to know if that robot can see you and knows where it’s going.”
A black box about fifteen inches across houses a small computer. A camera is attached to the box, and the whole unit can be mounted on the robot. “As a kind of accessory,” says Stoorvogel. “The robot knows its path, that’s how it is programmed. There’s a projector inside our box that projects that path onto the ground as a one- or two-meter trajectory. That’s how an employee sees where the robot is going.”
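The article does not go into MachyTech’s software, but the idea of projecting the next stretch of the planned path onto the floor can be sketched roughly in code. The example below is a hypothetical Python sketch, not the team’s implementation: it assumes the upcoming waypoints are available in metres in the robot’s ground frame, that the projector resolution is known, and that a calibration homography H mapping floor coordinates to projector pixels has been measured beforehand.

import cv2
import numpy as np

PROJECTOR_W, PROJECTOR_H = 1280, 720  # assumed projector resolution

def render_path(waypoints_m: np.ndarray, H: np.ndarray, max_length_m: float = 2.0) -> np.ndarray:
    """Draw the next stretch of the planned path as a bright line for the projector."""
    # Keep only roughly the first max_length_m metres of the path.
    seg = np.linalg.norm(np.diff(waypoints_m, axis=0), axis=1)
    keep = int(np.searchsorted(np.cumsum(seg), max_length_m)) + 1
    pts = waypoints_m[: keep + 1].reshape(-1, 1, 2).astype(np.float32)

    # Map ground-plane metres to projector pixels with the calibration homography.
    pix = cv2.perspectiveTransform(pts, H).astype(np.int32)

    # Draw the trajectory on a black frame that the projector shines onto the floor.
    frame = np.zeros((PROJECTOR_H, PROJECTOR_W, 3), dtype=np.uint8)
    cv2.polylines(frame, [pix], isClosed=False, color=(0, 255, 0), thickness=12)
    return frame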
Another component of the prototype is the camera. “Deep learning algorithms are behind that,” Stoorvogel points out. An algorithm receives the images via a live stream and analyzes them. Two algorithms are running at the moment. One of them recognizes people: if someone stands in its path, the robot stops or finds another route. The other algorithm lets people communicate with the robot through hand signs. For example, a raised hand is a stop sign, and a thumbs up means ‘go ahead’.
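How those algorithms are built is not described in the article. Purely as an illustration of the “stop when a person is in the way” behaviour, the Python sketch below uses an off-the-shelf COCO-pretrained detector (Ultralytics YOLO) as a stand-in for whatever model the team actually runs; the camera index, the model file and the simple stop/go rule are assumptions. The second algorithm, for hand signs, would be a separate classifier on the same stream and is not shown here.

import cv2
from ultralytics import YOLO

model = YOLO("yolov8n.pt")   # small pretrained detector; class 0 in COCO is "person"
cap = cv2.VideoCapture(0)    # assumed index of the camera on the black box

while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    result = model(frame, verbose=False)[0]
    person_in_view = any(int(c) == 0 for c in result.boxes.cls)
    # A real system would check where the person stands relative to the planned path;
    # this sketch simply stops whenever anyone is in view.
    command = "STOP" if person_in_view else "GO"
    print(command)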
Enrichment
America was a great learning experience, Stoorvogel says. “We could work on our own parts on the spot and apply them in real life.” Among other things, the two discovered that what they had figured out behind the computer did not always work the same way on the work floor.
“In any case, the computer, which is inside that black box, has far less computational power than the computer in my laptop. So when you run the algorithms on such a small computer, other things happen. We did run into that a few times.”
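A common workaround on such low-power on-board computers, offered here only as an illustration and not as the team’s actual fix, is to shrink the incoming frames and run the neural network on every Nth frame only, reusing the previous result in between.

import cv2

INFER_EVERY = 3      # run the detector on every 3rd frame only
INFER_WIDTH = 640    # downscale frames before inference to save compute

def maybe_infer(frame, frame_idx, detector, last_result):
    """Return a detection result, but only pay for inference on a subset of frames."""
    if frame_idx % INFER_EVERY != 0:
        return last_result                  # reuse the previous detection
    h, w = frame.shape[:2]
    small = cv2.resize(frame, (INFER_WIDTH, int(h * INFER_WIDTH / w)))
    return detector(small)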
The two also discovered the importance of being able to control the small computer through an interface. “So that, for example, you can easily turn on the camera and see what’s going on inside that robot.” So along came an app with a live stream.
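Such an interface could, for instance, be a small web app on the on-board computer that exposes the camera as a live MJPEG stream. The sketch below is a hypothetical minimal version using Flask; the route name and port are invented for illustration and say nothing about MachyTech’s actual app.

import cv2
from flask import Flask, Response

app = Flask(__name__)
cap = cv2.VideoCapture(0)   # the camera on the robot's black box

def mjpeg():
    # Yield JPEG-encoded frames in the multipart format browsers render as a live stream.
    while True:
        ok, frame = cap.read()
        if not ok:
            continue
        ok, jpg = cv2.imencode(".jpg", frame)
        if not ok:
            continue
        yield b"--frame\r\nContent-Type: image/jpeg\r\n\r\n" + jpg.tobytes() + b"\r\n"

@app.route("/stream")
def stream():
    return Response(mjpeg(), mimetype="multipart/x-mixed-replace; boundary=frame")

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=8000)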
Stoorvogel is likely to continue his studies after his gap year. But first he will spend three months backpacking through South America. “We are definitely looking for new people. An industrial engineer, an electrical engineer or a data analyst, it doesn’t really matter. That’s the great thing about a student team: you can also get involved in something you are not studying, and that only enriches you.”