Tomorrow’s automotive navigation will be highly connected, integrating real-time, high-resolution maps, and cloud-based vehicle and environmental data. This enables advanced driver assistance, intelligent e-mobility, and autonomous driving.
Though mapping data is easier to obtain today via dashcams, drones, and satellites, collecting it remains labor-intensive. Despite extensive existing geographical information systems, maps require regular updates to maintain accuracy.
The biggest challenges are data accuracy, timeliness, and coverage, as the physical world constantly changes. The evolution of navigation and digital mapping is accelerating to address these, driven by five key technology trends.
1. Enriching mapping data with AI
Satellite imagery was a breakthrough for map creation, but mapping software cannot work with satellite photos directly. Visual data must first be codified into comprehensive navigation datasets in a suitable format, such as the Navigation Data Standard. For map owners, keeping maps up to date is cost- and labor-intensive, creating strong use cases for AI in mapping.
AI algorithms improve the speed and precision of digital map-building, updating maps more regularly while mapping new areas faster. They can classify objects in satellite images—buildings, roads, vegetation—to create enriched 2D digital maps as well as multi-layer 3D map models. The result is more precise ETA predictions, detailed fuel or energy usage estimates, and greater point-of-interest information.
AI can also help with generating mapping data. Researchers from MIT and the Qatar Computing Research Institute recently released “RoadTagger,” a neural network that can automatically predict road type (residential or highway) and number of lanes even with visual obstructions. Tested on occluded roads from digital maps of 20 U.S. cities, it predicted road types with 93% accuracy and the number of lanes with 77% accuracy.
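The key idea behind this kind of prediction is using context: where imagery is occluded, attributes can often be inferred from connected road segments. RoadTagger itself is a neural network, but the intuition can be shown with a toy stand-in. The road graph, segment names, and majority-vote rule below are invented for illustration and are not RoadTagger's actual algorithm:

```python
from collections import Counter

def infer_occluded(graph, labels):
    """Fill in road types for occluded segments from visible neighbors.

    graph: segment -> list of adjacent segments
    labels: segment -> road type, or None where imagery is occluded
    """
    inferred = dict(labels)
    for segment, label in labels.items():
        if label is not None:
            continue
        neighbor_labels = [labels[n] for n in graph[segment] if labels[n]]
        if neighbor_labels:
            # Majority vote among labeled neighbors stands in for the
            # learned context aggregation a real model would perform.
            inferred[segment] = Counter(neighbor_labels).most_common(1)[0][0]
    return inferred

graph = {"a": ["b"], "b": ["a", "c"], "c": ["b"]}
labels = {"a": "highway", "b": None, "c": "highway"}
print(infer_occluded(graph, labels))  # → {'a': 'highway', 'b': 'highway', 'c': 'highway'}
```

A tree-covered stretch between two visible highway segments is, with high probability, also highway; the neural approach generalizes this reasoning to many attributes at once.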
However, sensor data collection from connected vehicles is still critically important. OEMs are increasingly relying on their fleets to collect new insights for digital map creation, a process that is becoming easier with advances in machine learning. Recently, HERE Technologies presented its new UniMap, an AI-driven technology for faster sensor data processing and map creation. This new solution can effectively extract map features in 2D and 3D formats and combine them with earlier map versions. The result? New digital maps can become available in 24 hours.
2. NDS.Live: From offline databases to distributed map data systems
The new global standard for map data in the automotive ecosystem, NDS.Live promotes the transition from offline to hybrid/online navigation, minimizing the complexities of supporting different data models, storage formats, interfaces, and protocols with one flexible specification. NDS.Live is a distributed map data system, not a database.
Co-developed by global OEMs and tech leaders (including Intellias), NDS.Live has already been adopted by Daimler, HERE, Denso, Renault, and TomTom. For example, second-generation Mercedes-Benz User Experience systems are powered by NDS.Live. The distributed map data system provides fresh information for the driver assistance system, which gets visualized as augmented reality instructions on the head-up display. NDS.Live significantly improves the navigation experience for electric vehicles and regular connected vehicles while helping OEMs deploy value-added subscriptions for assisted driving and navigation.
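The practical difference from a monolithic offline database is that map data is partitioned into tiles a client requests on demand. The actual NDS.Live tiling scheme (Morton-coded tile ids over packed coordinates) is defined in the NDS specification; the simplified quadtree addressing below is a stand-in to show the principle, not the real encoding:

```python
def tile_id(lon, lat, level):
    """Map WGS84 coordinates to an (x, y) quadtree tile at a given level.

    Higher levels mean smaller tiles and finer detail. A navigation
    client fetches only the tiles along its route, at the detail level
    it needs, instead of shipping one monolithic database update.
    """
    n = 2 ** level  # tiles per axis at this level
    x = int((lon + 180.0) / 360.0 * n)
    y = int((90.0 - lat) / 180.0 * n)
    return x, y

# Example: the tile covering central Berlin at level 10.
print(tile_id(13.4, 52.5, 10))  # → (550, 213)
```

Because each tile is versioned independently, a changed roundabout means re-downloading one small tile rather than the whole region, which is what makes the 24-hour map refresh cycle practical.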
See also: Research paves the way for ADAS in trucking
3. 3D and HD map generation
Three-dimensional maps enable accurate rendering of physical objects in a three-dimensional form. High-definition maps feature detailed information about road features (lane placements, road boundaries) and terrain type (severity of curves, gradient of the road surface). Both are essential for launching advanced ADAS features, ultimately ushering in the era of autonomous driving.
3D maps define how the vehicle moves and help it interpret information from onboard sensors. Since most sensors have a limited range, HD maps assist by providing the navigation system with extra information on road features, terrain, and other traffic-relevant objects.
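One way to picture the role of an HD map is as an "electronic horizon": a queryable model of the road ahead that extends past sensor range. The record fields and lookahead logic below are a hypothetical, simplified sketch, not any real HD-map format:

```python
from dataclasses import dataclass

@dataclass
class RoadSegment:
    segment_id: int
    length_m: float
    lane_count: int
    curvature: float   # signed, 1/m — severity of the curve
    gradient: float    # rise over run — slope of the road surface

def lookahead(route, distance_m):
    """Return the segments within distance_m ahead of the vehicle,
    supplementing what onboard sensors can currently see."""
    out, travelled = [], 0.0
    for seg in route:
        if travelled >= distance_m:
            break
        out.append(seg)
        travelled += seg.length_m
    return out

route = [RoadSegment(1, 120.0, 2, 0.0, 0.01),
         RoadSegment(2, 200.0, 2, 0.004, 0.03),
         RoadSegment(3, 150.0, 3, 0.0, -0.02)]
print([s.segment_id for s in lookahead(route, 250.0)])  # → [1, 2]
```

An ADAS feature such as predictive cruise control would consume this lookahead to slow down for a sharp curve or a steep gradient long before a camera or radar could detect it.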
Collecting and rendering data is a bottleneck of both HD and 3D mapping. With 3D maps, video must be captured in real time from multiple cameras, factoring in interference from vibration, temperature, and hardware issues. The process must be repeated across billions of kilometers of roads worldwide. Rather than tackling this huge task alone, mobility players and OEMs are joining forces:
- HERE and Mobileye partnered to crowdsource HD mapping data collection, with Volkswagen joining later. Mobileye developed a compact, high-performance computer vision system-on-chip called EyeQ. Installed by more than 50 OEMs across 300 vehicle models, the system supplies Mobileye with ample visual data it can render into maps with the help of partners.
- TomTom teamed with Qualcomm Technologies to crowdsource HD mapping insights from its users. Qualcomm provides the underlying cloud-based platform for making and maintaining HD maps from various sources, including swarms of connected vehicles.
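Crowdsourcing shifts the hard problem from collection to aggregation: thousands of vehicles report slightly different positions for the same lane boundary, and the mapping backend must fuse them into one robust estimate. The toy sketch below (invented numbers, simplified to a single boundary point) uses a median, which keeps a single bad GPS fix from skewing the result:

```python
from statistics import median

def aggregate_boundary(observations):
    """Fuse crowdsourced reports of one lane-boundary point.

    observations: list of (x, y) positions reported by different vehicles.
    A per-axis median is robust to occasional outliers, unlike a mean.
    """
    xs = [p[0] for p in observations]
    ys = [p[1] for p in observations]
    return median(xs), median(ys)

reports = [(10.02, 4.99), (10.00, 5.01), (9.98, 5.00),
           (12.50, 5.40)]  # last report is a bad GPS fix
print(aggregate_boundary(reports))  # stays near (10.01, 5.005) despite the outlier
```

Production pipelines use far more sophisticated estimators over full trajectories, but the principle is the same: redundancy across a vehicle swarm substitutes for the accuracy of a single survey-grade mapping van.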
4. Autonomous driving simulations
Autonomous vehicles need extensive testing to ensure safety, including simulations of near-crash events without real-world risk. Hyper-realistic virtual environments are safer and more efficient for training, especially as simulation technology advances.
Researchers have developed an open-source engine to create photorealistic environments, simulating sensors like 2D RGB cameras and 3D Lidar (light detection and ranging) and generating dynamic scenarios. It allows complex driving tasks, such as overtaking, to be simulated.
Waymo uses a similar approach, creating detailed virtual replicas of intersections based on real-world data. Waymo algorithms train through repeated simulations, adjusting conditions like vehicle speed, traffic light timing, and unpredictable elements (e.g., cyclists). This training builds a knowledge base shared across all vehicles.
High-fidelity 3D environments use data from diverse sensors to convey detailed real-world elements, with existing databases enhancing realism. Machine learning further refines ADAS/AD scenarios, closely simulating real-life conditions.
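The value of simulation comes from sweeping parameters that would be dangerous or slow to vary on a real road. A minimal sketch of that scenario-sweep idea, with invented deceleration and distance figures and deliberately simplified physics, looks like this:

```python
from itertools import product

def can_stop(speed_mps, light_changes_at_s, distance_m,
             decel=6.0, reaction_s=1.0):
    """One dilemma-zone scenario: the light 'distance_m' ahead turns red
    at t = light_changes_at_s. Can the vehicle clear it or stop in time?"""
    time_to_line = distance_m / speed_mps
    if time_to_line <= light_changes_at_s:
        return True  # clears the intersection before the red
    # Reaction distance plus braking distance at constant deceleration.
    stopping_dist = speed_mps * reaction_s + speed_mps ** 2 / (2 * decel)
    return stopping_dist <= distance_m

# Sweep vehicle speed and light timing to find the unsafe combinations.
results = {(v, t): can_stop(v, t, distance_m=50.0)
           for v, t in product([10.0, 20.0], [1.0, 4.0])}
print(results)  # only 20 m/s with an early red (t=1.0) fails
```

Real AD training stacks sweep thousands of such parameters per intersection (plus unpredictable agents like cyclists), then feed the failing scenarios back into training, exactly the loop described above for Waymo.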
5. AR in HUD navigation products
New vehicles come with an upgraded HMI design featuring new hardware and software elements that allow for augmented reality navigation. AR in head-up displays can deliver all standard information from static displays (driving speed, ADAS status, fuel or charge levels), alongside dynamic routing instructions including information on traffic signs, speed limits, construction work alerts, and ETAs.
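At its core, overlaying a routing instruction on the real road means projecting a 3D point in the vehicle frame onto 2D display coordinates. Real HUD optics and calibration are far more involved; the pinhole model, focal length, and screen size below are made-up illustration values:

```python
def project_to_hud(x, y, z, focal=800.0, cx=640.0, cy=360.0):
    """Project a point in the driver/camera frame onto a 1280x720 overlay.

    x: meters right, y: meters up, z: meters forward of the viewpoint.
    (cx, cy) is the overlay center; focal is in pixels.
    """
    if z <= 0:
        return None  # behind the viewer, nothing to draw
    u = cx + focal * x / z
    v = cy - focal * y / z  # screen y grows downward
    return u, v

# A turn arrow anchored 20 m ahead, 2 m right, 1.2 m below eye height
# lands right of center and below the horizon line, as expected.
print(project_to_hud(2.0, -1.2, 20.0))  # ≈ (720.0, 408.0)
```

Rendering the arrow at the projected position every frame, as the anchor point moves in the vehicle frame, is what makes the instruction appear glued to the road rather than floating on the glass.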
AR navigation can help drivers make smarter decisions on the road. A recent comparative study found that drivers using AR-augmented HUDs made fewer errors and drove faster on average than those using conventional HUDs. Participants also rated AR HUD instructions as more useful and easier to understand.
The next advance in navigation will be holographic displays, offering AR instructions in 3D. Advances in Lidar technologies already allow for projecting real-time ultra-HD holographic representations of road objects into the driver’s field of view. Such systems can enable shorter obstacle visualization times and reduce driving-related stress, according to Tech Xplore.