Technical University of Munich: Bird’s-eye view improves safety of autonomous driving
In the Providentia++ project, researchers at the Technical University of Munich (TUM) have worked with industry partners to develop a technology that complements the vehicle perspective, based on onboard sensor input, with a bird’s-eye view of traffic conditions. This improves road safety, including for autonomous driving.
The expectations for autonomous driving are clear: “Cars have to travel safely not only at low speeds, but also in fast-moving traffic,” says Jörg Schrepfer, the Head of Driving Advanced Research Germany at Valeo. When objects fall off a truck, for example, the “egocentric” perspective of a car will often be unable to detect the hazardous debris in time. “In these cases, it will be difficult to execute smooth evasive action,” says Schrepfer. Researchers in the Providentia++ project, which has received support over the past five and a half years from the Federal Ministry for Digital and Transport (BMDV) under a funding program for automated and connected driving, have developed a system to transmit an additional view of the traffic situation into vehicles. “Using sensors on overhead sign bridges and masts, we have created a reliable real-time digital twin of the traffic situation on our test route that functions around the clock,” says Prof. Alois Knoll of TUM, the project’s lead manager. “With this system, we can now complement the vehicle’s view with an external perspective – a bird’s-eye view – and incorporate the behavior of other road users into decisions.”
Transmitting the digital twin into the car: minimizing time lags
This is far from trivial: the digital twin needs to know the exact location of the vehicle into which the sensor station information is transmitted. To make this possible, the project partner Valeo used an IMU-GNSS system (Inertial Measurement Unit – Global Navigation Satellite System) consisting of an inertial measurement unit, a satellite navigation system and a real-time kinematic (RTK) kit. “In this way, we create a coordinate system in real time that is accurate to the nearest centimeter,” explains Valeo expert Jörg Schrepfer. To synchronize the information from the vehicles and the measurement stations for the digital twin, the researchers use the UTC standard, which provides a uniform time base for coordination.

Ideally, the digital mapping would be superimposed like a second layer over the car’s perspective. However, time lags (latencies) in the overall system cannot be entirely avoided. Time passes between the physical detection by the sensors, the processing of the data and the radio transmission to the vehicle: the data are packaged, coded and transmitted, and then decoded in the car. Other factors play a role as well, such as the distance of the vehicle from the transmitter tower on the test route and the traffic volume on the data transmission network. In a recent demonstration run, Valeo worked with the LTE (4G) wireless standard, which resulted in latencies of 100 to 400 milliseconds. “These latencies can never be completely eliminated. However, intelligent algorithms will help,” explains Schrepfer: “The results will be even better in the future when we have full coverage with the 5G or 6G telecommunications standards.”
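How such latencies can be bridged by “intelligent algorithms” is easiest to see in a small sketch. The following Python example is a minimal illustration under assumed names and message fields, not the project’s actual software: an object reported by the digital twin is extrapolated forward over the measured transmission delay, using its UTC timestamp and a constant-velocity assumption.

```python
import time
from dataclasses import dataclass


@dataclass
class TwinObject:
    """Object state as it might arrive from a roadside sensor station (hypothetical fields)."""
    obj_id: int
    x: float      # position in a shared metric frame, metres
    y: float
    vx: float     # velocity estimate, metres per second
    vy: float
    stamp: float  # UTC timestamp of the measurement, seconds


def compensate_latency(obj: TwinObject, now: float) -> tuple[float, float]:
    """Extrapolate the object's position to the current time.

    A simple constant-velocity prediction bridges the 100-400 ms that pass
    between detection at the sensor station and decoding in the car.
    """
    dt = now - obj.stamp  # elapsed time = processing + transmission latency
    return obj.x + obj.vx * dt, obj.y + obj.vy * dt


# Example: an object sensed 250 ms ago, moving at 30 m/s along the road axis.
obj = TwinObject(obj_id=7, x=120.0, y=3.5, vx=30.0, vy=0.0, stamp=time.time() - 0.25)
print(compensate_latency(obj, time.time()))  # roughly (127.5, 3.5)
```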
Prototype available for real-time digital twin
The Providentia++ research project has created the conditions for using these data in the vehicle. The goal was to create a scalable and highly available digital twin of the traffic situation with real-time capability. For this purpose, the team built a 3.5 kilometer test route in Garching, just outside Munich, consisting of seven sensor stations. The prototype was developed to permit series implementation if needed:
- The researchers are working with decentralized digital twins. This permits the test route to be scaled up or extended to any desired length.
- To handle data volumes of several gigabytes per second, they created a data processing concept that optimizes the load distribution across multiple CPUs and graphics cards (GPUs).
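As a rough illustration of this load distribution, the sketch below fans the streams of the seven sensor stations out to separate worker processes. The station names and the placeholder pipeline are hypothetical, not the project’s actual code; in practice each worker could additionally pin its detection step to a dedicated GPU.

```python
from multiprocessing import Pool

# Hypothetical identifiers for the seven sensor stations on the test route.
STATIONS = [f"station_{i}" for i in range(1, 8)]


def process_station(name: str) -> str:
    """Placeholder for one station's pipeline: decode the sensor stream,
    run detection (in a real system on a GPU assigned to this worker)
    and publish tracked objects to the local digital twin."""
    # ... decoding, inference and tracking would happen here ...
    return f"{name}: objects published"


if __name__ == "__main__":
    # One worker per station spreads the multi-gigabyte-per-second load
    # across CPU cores instead of funnelling everything through one process.
    with Pool(processes=len(STATIONS)) as pool:
        for result in pool.map(process_station, STATIONS):
            print(result)
```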
Special programming challenges were posed by the calibration of the sensors and the development of the tracking algorithms – tasks for which no software existed. “We are now using an automatic calibration process based on a high-resolution road map (HD map). It did not previously exist, so we had to develop it,” explains technical project leader Venkatnarayanan Lakshminarashiman from the TUM Chair of Robotics, Artificial Intelligence and Real-time Systems.
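The following sketch hints at what a map-based calibration can look like in principle; it is an illustrative example with invented correspondences and camera parameters, not the team’s actual process. Known 3D points from the map (for instance lane-marking corners) and their detected pixel positions are used to estimate a roadside camera’s pose with OpenCV’s solvePnP.

```python
import numpy as np
import cv2

# Hypothetical correspondences: 3D lane-marking corners taken from the HD map
# (local metric frame, metres, road plane at z = 0) and the pixel positions at
# which they were detected in a roadside camera image.
map_points = np.array([
    [0.0,  0.0, 0.0], [3.5,  0.0, 0.0],
    [0.0, 20.0, 0.0], [3.5, 20.0, 0.0],
    [0.0, 40.0, 0.0], [3.5, 40.0, 0.0],
], dtype=np.float64)
image_points = np.array([
    [412.0, 610.0], [861.0, 605.0],
    [505.0, 402.0], [772.0, 399.0],
    [548.0, 318.0], [735.0, 316.0],
], dtype=np.float64)

# Assumed intrinsic matrix of the camera (focal length and principal point in pixels).
K = np.array([[1400.0,    0.0, 640.0],
              [   0.0, 1400.0, 360.0],
              [   0.0,    0.0,   1.0]])

# Estimate the camera pose relative to the map frame from the correspondences;
# this is the kind of extrinsic calibration an automatic, map-based process yields.
ok, rvec, tvec = cv2.solvePnP(map_points, image_points, K, None)
print("converged:", ok)
print("rotation (Rodrigues vector):", rvec.ravel())
print("translation (m):", tvec.ravel())
```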
Consortium leader Prof. Alois Knoll from TUM is highly satisfied with the results: “The digital twin is ready for the product development stage. The concept is working reliably in 24/7 operation and is suitable not only for highways, but also for secondary roads and around intersections.”