From a controlled, scenario-based test model to a self-driving car
Whereas humans use their senses to recognize their surroundings, cars rely on their sensors. Future generations of vehicles will increasingly tap into artificial intelligence to interpret sensor data. This will add a new and more demanding dimension to testing vehicles, not only when it comes to obtaining the data required to train systems, but also with respect to scenario-based testing. To meet the technology and process challenges this presents, and to solve problems efficiently, experts at Steinbeis Interagierende Systeme have developed the AR Car, a fully automated test vehicle that is integrated into the development cloud.
Vehicles can find themselves in an infinite number of situations while out and about on the roads. To ensure they not only perceive their surroundings but also “comprehend” them, an increasing number of AI systems are being introduced into vehicle software. These systems need to be trained, and that requires representative data. For example, if you want a vehicle to recognize traffic signs, you first need to train its software by showing it signs from different angles, under different lighting conditions, and in different colors and sizes.
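A minimal sketch of what this kind of data variation can look like in practice, assuming a Python/PyTorch-style training pipeline; the folder path, transform parameters, and image size are illustrative assumptions, not details taken from the project:

```python
# Illustrative sketch: varying sign images by angle, lighting, color, and size
# so a classifier sees representative training data. Paths and parameter
# values are assumptions for demonstration purposes only.
import torch
from torchvision import datasets, transforms

augment = transforms.Compose([
    transforms.RandomRotation(degrees=15),               # different viewing angles
    transforms.ColorJitter(brightness=0.4,               # different lighting
                           contrast=0.4,
                           saturation=0.3),              # different colors
    transforms.RandomResizedCrop(64, scale=(0.6, 1.0)),  # different sizes
    transforms.ToTensor(),
])

# Hypothetical folder of labeled sign images (one subfolder per sign class).
train_data = datasets.ImageFolder("data/signs/train", transform=augment)
train_loader = torch.utils.data.DataLoader(train_data, batch_size=32, shuffle=True)
```

Each pass over such a dataset presents the same signs rotated, re-lit, re-colored, and re-scaled, which is the variation the training described above depends on.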
Moving from data to testing
In turn, to validate vehicle software you have to expose vehicles to a variety of situations and check whether they behave as expected. This means that the development process for vehicles that are expected to move autonomously within their surroundings starts with gathering data from representative scenarios and ends with systematic testing on the target system, i.e. the vehicle. Along the way, work alternates between tasks carried out by humans and tasks run in the cloud, and the data produced at each step has to be processed and evaluated.
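As a rough illustration of scenario-based testing, the sketch below describes one driving situation as data, runs it through a stand-in for the vehicle software, and checks the observed behavior against what is expected. The Scenario fields, the run_scenario stand-in, and the thresholds are all assumptions made for this example, not part of the actual test setup:

```python
# Illustrative sketch of a scenario-based test: a scenario is described as
# data, replayed against the vehicle software (here replaced by a toy
# stand-in), and the observed behavior is checked against expectations.
from dataclasses import dataclass

@dataclass
class Scenario:
    name: str
    speed_limit_kmh: int
    obstacle_distance_m: float

def run_scenario(scenario: Scenario, deceleration_mps2: float = 6.0) -> dict:
    # Toy stand-in for a software-in-the-loop run: the vehicle travels at the
    # speed limit and brakes with constant deceleration once the obstacle is
    # detected at the given distance.
    speed_mps = scenario.speed_limit_kmh / 3.6
    braking_distance_m = speed_mps ** 2 / (2 * deceleration_mps2)
    return {
        "min_distance_m": scenario.obstacle_distance_m - braking_distance_m,
        "max_speed_kmh": scenario.speed_limit_kmh,
    }

def test_vehicle_keeps_safe_distance():
    scenario = Scenario("pedestrian_crossing", speed_limit_kmh=30, obstacle_distance_m=20.0)
    result = run_scenario(scenario)
    assert result["min_distance_m"] >= 2.0                        # expected behavior
    assert result["max_speed_kmh"] <= scenario.speed_limit_kmh
```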
To analyze this way of working and, in particular, to make the individual process steps tangible in a compact form, Steinbeis Interagierende Systeme uses a continuous integration (CI) setup. This includes automated unit testing, a software-in-the-loop solution for vehicles, and a 1:8-scale, fully automatable test vehicle. The CI setup provides the basis for the SensorTwin project launched in July 2021 (see also page 70), which is being funded as part of the Baden-Württemberg AI Innovation Competition. The aim is to find ways to make the results of the cloud-based “test factory” as meaningful and useful as possible for real vehicles in real surroundings.
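One possible way to chain such stages in a CI run is sketched below; the stage names, directory layout, and commands are assumptions for illustration, not the project's actual pipeline:

```python
# Illustrative sketch of chaining the CI stages named above: fast unit tests
# first, then the slower software-in-the-loop (SiL) scenario runs, and only
# then tests involving the physical scale vehicle.
import subprocess
import sys

STAGES = [
    ("unit tests", ["pytest", "tests/unit", "-q"]),
    ("software-in-the-loop", ["pytest", "tests/sil", "-q"]),
    ("scale-vehicle tests", ["pytest", "tests/vehicle", "-q", "-m", "hardware"]),
]

def main() -> int:
    for name, cmd in STAGES:
        print(f"Running stage: {name}")
        if subprocess.run(cmd).returncode != 0:
            print(f"Stage failed: {name}")
            return 1
    return 0

if __name__ == "__main__":
    sys.exit(main())
```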
Sponsored by the Baden-Württemberg Ministry of Economic Affairs, Labor, and Tourism
Contact
Daniel Elsenhans (author)
Systems Engineer
Steinbeis Interagierende Systeme GmbH (Herrenberg)
www.interagierende-systeme.de/sensor-twin