Driver assistance functionalities on the market are becoming increasingly sophisticated, leading to integrated systems that fuse data from multiple sensors (e.g., camera, Photonic Mixer Device, radar) and internal system percepts (e.g., detected objects and their states, the detected road). An important future challenge is to find smart system-design solutions that allow efficient control of such integrated systems. A promising way to achieve this is to draw inspiration from known signal-processing principles of the human brain. This contribution presents a biologically motivated Advanced Driver Assistance System (ADAS) that uses the generic principle of attention as a common front-end for all visual processing. Based on the attention principle, an early task-dependent pre-selection of interesting image regions is performed, which decreases scene complexity. Furthermore, internal information fusion increases system performance (e.g., attention is used to improve object tracking, and road-detection results can improve the attention). Using image streams from a challenging traffic scenario, it is shown how the system builds up and verifies its environment-related expectations by relying on the attention principle. The ADAS is controlled by a central behavior-control module that tunes sub-modules and parameters. This module has a simple structure but still allows various tasks to be performed robustly, since the complexity is distributed over the system in the form of local control loops mimicking aspects of human cognition.
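The task-dependent, attention-based pre-selection of image regions described above can be illustrated with a minimal sketch: bottom-up feature maps are combined under top-down task weights, and only the highest-scoring patches are passed on for further processing. The function name, patch size, feature-map names, and weighting scheme below are illustrative assumptions, not the paper's actual implementation.

```python
import numpy as np

def attention_preselect(feature_maps, task_weights, top_k=3, patch=16):
    """Toy sketch of attention-based region pre-selection (hypothetical API).

    feature_maps: dict of name -> 2D feature/saliency map (all same shape)
    task_weights: dict of name -> top-down weight encoding task dependence
    Returns the top_k (y, x) patch origins with the highest weighted saliency.
    """
    # Top-down modulation: weight each bottom-up map by its task relevance
    combined = sum(task_weights[n] * feature_maps[n] for n in feature_maps)

    # Score non-overlapping patches by their mean combined saliency
    h, w = combined.shape
    scored = []
    for y in range(0, h - patch + 1, patch):
        for x in range(0, w - patch + 1, patch):
            scored.append((combined[y:y + patch, x:x + patch].mean(), (y, x)))
    scored.sort(reverse=True)
    return [pos for _, pos in scored[:top_k]]

# Example: a car-following task might emphasize motion over color contrast
rng = np.random.default_rng(0)
maps = {"motion": rng.random((64, 64)), "color": rng.random((64, 64))}
regions = attention_preselect(maps, {"motion": 0.8, "color": 0.2})
```

Only the returned regions would then be handed to the more expensive sub-modules (e.g., object tracking), which is how such a front-end can decrease scene complexity for the whole system.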