The non-reproducibility of the environment, and the resulting difficulty in reconstructing scenarios, is one of the greatest challenges in endurance testing. Unfavourable weather conditions in particular, such as rain, fog and a low sun position, can cause critical sensor effects and subsequently lead an automated system to take incorrect decisions. The safeguarding of safety systems therefore draws on the aforementioned mixed-reality test environment, which enables testing both in real life under reproducible conditions and virtually in simulation environments.
Static and dynamic tests can be performed at variable rain intensities using an indoor rain unit that was developed in-house and validated against measurements taken under natural conditions. Conducting the tests in a test hall ensures stable ambient conditions. In addition, disturbance models make it possible to test algorithms for weather effects purely virtually. Conducting such tests at an early stage can considerably increase a system's robustness.
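To illustrate what such a disturbance model might look like, here is a minimal sketch in Python. It assumes lidar range measurements stored in a NumPy array; the power-law extinction form and all parameter values (alpha, beta, sigma0) are illustrative assumptions, not the facility's actual model:

```python
import numpy as np

def apply_rain_disturbance(ranges, rain_rate_mm_h, rng=None,
                           alpha=0.01, beta=0.6, sigma0=0.02):
    """Perturb lidar range measurements with a simple rain model.

    ranges         -- 1D array of measured distances in metres
    rain_rate_mm_h -- rain intensity in mm/h
    alpha, beta    -- illustrative power-law extinction parameters
    sigma0         -- illustrative base range noise in metres
    """
    if rng is None:
        rng = np.random.default_rng()
    # Empirical power-law extinction coefficient (1/m).
    extinction = alpha * rain_rate_mm_h ** beta
    # Beer-Lambert two-way attenuation: distant returns drop out first.
    detection_prob = np.exp(-2.0 * extinction * ranges)
    detected = rng.random(ranges.shape) < detection_prob
    # Range noise grows with rain intensity.
    sigma = sigma0 * (1.0 + 0.1 * rain_rate_mm_h)
    noisy = ranges + rng.normal(0.0, sigma, size=ranges.shape)
    # Lost returns are reported as NaN (dropouts).
    return np.where(detected, noisy, np.nan)
```

In a purely virtual test, a perception algorithm would consume the perturbed ranges in place of clear-weather data, so its behaviour can be swept across rain intensities without a single wet test run.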
Sensor simulation with a focus on weather conditions
The subproject develops a validation method for sensor models based on measurements of real weather conditions and downstream disturbance modelling. Weather conditions (adverse situations such as rain, fog and snow, but also a low sun position) have a considerable influence on the reliability of a vehicle's environment sensors and are investigated systematically within this subproject. For this purpose, real weather data will be measured in the field over an extended period, a rain simulator and a fog simulator (each 50 m long) will be set up in the laboratory, and the simulators will be parameterized with the measured weather values. In this setting, the environment sensors in the vehicle (radar, lidar, camera) are exposed to different weather conditions under reproducible conditions, measurement data are recorded and transferred into models for the virtual world (simulation). Finally, the models are subjected to a comparative evaluation (validation).
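The subproject does not prescribe a particular validation metric. As one plausible instance of the comparative evaluation step, the sketch below compares real and simulated lidar range distributions with a two-sample Kolmogorov-Smirnov test from SciPy; the function name, the choice of metric and the significance level are illustrative assumptions:

```python
import numpy as np
from scipy.stats import ks_2samp

def ranges_match(real_ranges, sim_ranges, alpha=0.05):
    """Compare real and simulated lidar range measurements recorded
    under the same nominal rain intensity with a two-sample
    Kolmogorov-Smirnov test."""
    real = np.asarray(real_ranges, dtype=float)
    sim = np.asarray(sim_ranges, dtype=float)
    # Discard dropouts (NaN) so only detected returns are compared.
    real, sim = real[~np.isnan(real)], sim[~np.isnan(sim)]
    stat, p_value = ks_2samp(real, sim)
    # High p-value: no evidence that model and reality differ.
    return {"ks_statistic": stat, "p_value": p_value,
            "match": p_value > alpha}
```

A well-parameterized sensor model would produce simulated distributions that this test cannot distinguish from the measured ones across the covered weather intensities.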
The automotive industry is facing one of the most demanding challenges in its history: how to make automated travel safe in all conditions. There have been great advances towards automation, with new vehicles increasingly equipped with advanced driver assistance systems (ADAS). The biggest remaining barrier to full automation is driving safely in poor weather and visibility. The AI-SEE project aims to build a novel, robust sensing system supported by Artificial Intelligence (AI) that will enable automated travel in varied traffic, lighting and weather conditions. It will extend the Operational Design Domain (ODD) of automated vehicles (i.e. the scope of what they can do), taking the technology from SAE Level 3 (environmental detection, human override) to Level 4 (high driving automation), where vehicles drive themselves with no human interaction in most circumstances.
With advanced and autonomous vehicles entering the market, solving problems linked to lighting and weather conditions such as rain, fog and snow is key to ensuring a safe environment for drivers, passengers and pedestrians. Moving from Level 3 to Level 4, however, requires solutions to four key challenges: (i) mass production of powerful computing platforms, (ii) improved sensing capabilities and lower-cost sensors, (iii) the necessary technical standards and (iv) infrastructure. AI-SEE focuses primarily on the second challenge by increasing the environmental and situational awareness of vehicles.
Humans ‘see’ by combining stored memories and sensory input to interpret events and anticipate upcoming scenarios. Today’s automated vehicles cannot yet perform this inferential thinking, nor communicate in real time with their environment. For automated vehicles to drive without human intervention, the information content delivered by current sensors needs to be enhanced significantly. But this creates an increasingly large amount of data transmitted at huge data rates, which, along with all the additional sensors, will quickly exceed the limits of in-vehicle storage space and of vehicle computational and energy resources.
Together, the high number of sensors needed for 360-degree environment perception and situation awareness, and the high cost of the LiDAR (Light Detection and Ranging) used for measuring distances to objects, represent significant barriers to the wider roll-out of automated driving.[1]
Taking technologies to the next level
AI-SEE will address these challenges by combining complex hardware and software development, creating automotive perception systems that go beyond today’s state of the art. Its goal is to deliver reliable, secure and trustworthy sensors and software by implementing self-diagnosis, adaptation and robustness.
The AI-SEE concept is built on four main blocks:
- A 24/365 high-resolution adaptive all-weather sensor suite
- An AI platform for predictive detection of prevailing environmental conditions including signal enhancement and sensor adaptation
- Smart sensor data fusion to create the 24/365 adaptive all-weather robust perception system
- A demonstrator and system validation plan, with testing carried out in simulations and in real-world environments in northern Europe
The project will deliver the first high-resolution adaptive multi-sensor suite, built on an innovative AI perception-processing scheme for low-visibility conditions.
Specifically, AI-SEE will create novel sensor hardware comprising an active polarimetric imager with congruent LiDAR data; a short-wave infrared (SWIR) LiDAR with a novel SPAD receiver architecture; a high-resolution 4D MIMO radar; and a gated SWIR camera. To support the novel sensing system and improve localization performance in poor weather, the project will also take high-definition (HD) dynamic mapping to a new level. In addition, to handle the multi-sensor data fusion, an AI platform will be built to advance early signal enhancement for robust perception.
Importantly, the project will develop sensor-near simulation models for all active sensors to artificially generate synthetic inclement-weather datasets. This is expected to revolutionise simulation by converting good-weather neural-network training datasets into inclement-weather datasets, thereby saving large amounts of money and time in testing and validating sensor performance in inclement weather. Moreover, a large outdoor weather databank for testing, modelling and validation will also be created. Together, these advances will lead to a paradigm shift in signal-enhancement techniques and a competitive advantage for the European automotive industry.
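The project's own simulation models are not published here. As a simple illustration of the general idea of converting good-weather data into inclement-weather data, the sketch below applies the standard Koschmieder atmospheric-scattering model to a clear-weather camera image; it assumes a per-pixel depth map is available, and the extinction coefficient and airlight values are illustrative:

```python
import numpy as np

def add_synthetic_fog(image, depth_m, beta=0.05, airlight=0.9):
    """Turn a clear-weather image into a foggy one via the Koschmieder
    model: I_fog = I * t + A * (1 - t), with t = exp(-beta * depth).

    image    -- float RGB array in [0, 1], shape (H, W, 3)
    depth_m  -- per-pixel scene depth in metres, shape (H, W)
    beta     -- illustrative extinction coefficient (1/m)
    airlight -- illustrative atmospheric light intensity
    """
    transmission = np.exp(-beta * depth_m)[..., None]  # (H, W, 1)
    return image * transmission + airlight * (1.0 - transmission)
```

With such a transform, each clear-weather training image (paired with depth from stereo or lidar) yields arbitrarily many fog variants simply by sweeping beta.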
Partnership beyond traditional automotive networks
The AI-SEE fault-tolerant environment perception system and its sub-systems are highly complex. Bringing them to market calls for a partnership beyond the traditional network of automotive OEMs and Tier 1 suppliers. To tackle the challenges of new hardware, data collection, and AI-supported signal enhancement and simulation, the AI-SEE project includes OEMs; Tier 1, 2 and 3 suppliers; and smaller engineering companies, as well as academic and research institutes. Together, these partners have not only the necessary expertise but also unique testing and development capabilities not found in any one European country alone.
Positioned for rapidly evolving market opportunities
The emergence of automated driving (AD) is transforming the automotive industry, bringing in new players such as online service providers and IT and telecommunication suppliers. However, the market outlook remains unclear: predictions range from sales of two million L3+ vehicles in 2030 to 63 million in the same period.[1] Nonetheless, automotive sensor sales are expected to grow at an average rate of 8%, with 14% growth in sales value, up to 2022. Overall, the sensor market was worth USD 11 billion in 2016 and is expected to reach USD 23 billion by 2022, mainly due to the boom in imaging, radar and LiDAR sensors, which will be worth USD 7.7 billion, USD 6.2 billion and USD 1.4 billion respectively by 2022.[2]
Being fast and focused will be key to success in this rapidly evolving landscape. AI-SEE will contribute by giving Europe a vital lead in cutting-edge technologies for environmental perception. It will also allow European companies to compete in the supply of sensors suited to adverse weather conditions, where US exports are limited by security (defence technology) concerns. Furthermore, it will help Europe maintain its strong position (close to 40%) in the fast-growing LiDAR market, expected to reach USD 2.6 billion by 2030,[3] and to seize opportunities in areas such as automotive night-vision systems and automotive radar. In addition, AI-SEE’s results will support standardisation, which is essential for wide-scale deployment of automated driving systems.
Automotive is one of Europe’s key industries, accounting for 6.8% of EU GDP and 13.3 million jobs. Through its outcomes, AI-SEE will help Europe retain its world-leading strengths in this domain, thereby safeguarding high-value jobs, economic growth and, indirectly, the social well-being of EU citizens.
KEY APPLICATION AREAS
Transport and Smart Mobility
ESSENTIAL CAPABILITIES
Safety, Security and Reliability
PARTNERS
Algolux (Germany) GmbH
Algolux Inc.
ams AG
ANSYS Germany GmbH
AstaZero AB
AVL List GmbH
Basemark Oy
Brightway Vision Ltd.
FIFTY2 Technology GmbH
Ibeo Automotive Systems GmbH
Institut für Halbleitertechnik der Universität Stuttgart
Institut für Lasertechnologien in der Medizin und Meßtechnik an der Universität Ulm
Meluta Oy
Mercedes-Benz AG (Project Lead)
OQmented GmbH
Patria Land Oy
Robert Bosch GmbH
Technische Hochschule Ingolstadt CARISSMA Institute of Automated Driving
UNIKIE Oy
Veoneer Sweden AB
VTT Technical Research Centre of Finland Ltd
COUNTRIES INVOLVED
Austria
Canada
Finland
Germany
Israel
Sweden
PROJECT LEADERS
Name: Dr. Werner Ritter
Company: Mercedes-Benz AG
KEY PROJECT DATES
Start: 01 June 2021
End: 01 April 2024
[1] Prices for individual LiDAR sensors can reach up to €10,000. Garmin (2019). LIDAR-Lite v3HP: a Low-Cost Solution to Autonomous Building-Interior Mapping.
[2] Electronic Specifier (2020). Sensing changes in the automotive sensor market.
[3] KnowMade (2018). LiDAR for Automotive Patent Landscape.
ROADVIEW integrates a complex in-vehicle system-of-systems able to perform advanced environment and traffic recognition and prediction and to determine the appropriate course of action for a CAV in a real-world environment, including harsh weather conditions. The project develops an embedded in-vehicle perception and decision-making system based on enhanced sensing, localization, and improved object/person classification (including vulnerable road users). Its ground-breaking innovations are grounded in a cost-effective multi-sensor setup, sensor noise modelling and filtering, collaborative perception, testing by simulation-assisted methods, and integration and demonstration under different scenarios and weather conditions.
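ROADVIEW's actual noise-filtering methods are not detailed here. As one example of the kind of sensor noise filtering mentioned above, the sketch below implements a simplified dynamic-radius outlier removal for lidar point clouds, in the spirit of Charron et al.'s DROR de-snowing filter; the radius scaling and neighbour threshold are illustrative assumptions:

```python
import numpy as np
from scipy.spatial import cKDTree

def dynamic_radius_outlier_removal(points, base_radius=0.05,
                                   angular_res_rad=0.003,
                                   min_neighbors=3):
    """Drop sparse, isolated lidar returns typical of rain or snow
    clutter. The neighbour-search radius grows with range, because
    the spacing between genuine surface points widens with distance.
    """
    points = np.asarray(points, dtype=float)       # (N, 3) in metres
    ranges = np.linalg.norm(points, axis=1)
    radii = np.maximum(base_radius, 3.0 * ranges * angular_res_rad)
    tree = cKDTree(points)
    keep = np.array([
        # query_ball_point counts the point itself as a neighbour.
        len(tree.query_ball_point(p, r)) >= min_neighbors
        for p, r in zip(points, radii)
    ])
    return points[keep]
```

Scaling the radius with range avoids the main failure mode of a fixed-radius filter, which would strip legitimate but sparsely sampled points at long distances.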
ROADVIEW implements the co-programmed European Partnership on Connected, Cooperative and Automated Mobility (CCAM) by contributing to the development of more powerful, fail-safe, resilient and weather-aware technologies. The consortium combines leading universities in the field, research institutes, high-tech SMEs and strong industry leaders. Beyond their research excellence, the consortium members bring a unique portfolio of testing sites and testing infrastructure, ranging from hardware-testing facilities and rain and wind tunnels to test tracks north of the Arctic Circle.
In this project, CARISSMA leads the development and validation of a weather-aware X-in-the-loop test environment, which combines simulation and reality for testing automated driving systems under extreme weather conditions, including rain, fog and snow. Fifteen further partners make up the ROADVIEW consortium, which is coordinated by Halmstad University in Sweden. More information about the project can be found at roadview-project.eu. The project is funded by the European Union (grant no. 101069576) and supported by UK Research and Innovation and the Swiss State Secretariat for Education, Research and Innovation (SERI).