
GM-PHD Filter Based Sensor Data Fusion for Automotive Frontal Perception System



Abstract:

Advanced driver assistance systems and highly automated driving functions require an enhanced frontal perception system. The requirements of a frontal environment perception system cannot be satisfied by any single existing automotive sensor. A commonly used sensor cluster for these functions consists of a mono-vision smart camera and an automotive radar. Sensor fusion is intended to combine the data of these sensors to achieve robust environment perception. Multi-object tracking algorithms provide a suitable software architecture for sensor data fusion. Several multi-object tracking algorithms, such as JPDAF or MHT, offer good tracking performance; however, their computational requirements are significant owing to their combinatorial complexity. The GM-PHD filter is a straightforward algorithm with favorable runtime characteristics that can track an unknown and time-varying number of objects. However, the conventional GM-PHD filter performs poorly in object cardinality estimation. This paper proposes a method that extends the GM-PHD filter with an object birth model that relies on the sensor detections and a robust object extraction module, including Bayesian estimation of objects’ existence probability, to compensate for the drawbacks of the conventional algorithm.
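
For orientation, the sketch below illustrates one GM-PHD predict/update cycle with a detection-driven birth model, the general technique the abstract refers to. It is not the authors' implementation: the constant-velocity motion model, position-only measurement model, and all parameter values (survival probability, detection probability, clutter density, noise covariances, birth weight) are assumptions for illustration.

```python
# Minimal GM-PHD sketch (assumed models/parameters, not the paper's implementation).
# State: [x, y, vx, vy]; measurement: [x, y]. Intensity = list of (weight, mean, cov).
import numpy as np

DT, P_S, P_D, CLUTTER = 0.1, 0.99, 0.9, 1e-4            # assumed parameters
F = np.kron(np.array([[1.0, DT], [0.0, 1.0]]), np.eye(2))  # constant-velocity model
Q = 0.1 * np.eye(4)
H = np.hstack([np.eye(2), np.zeros((2, 2))])               # position measurement
R = 0.5 * np.eye(2)

def predict(components):
    """Survival-weighted prediction of each Gaussian component (w, m, P)."""
    return [(P_S * w, F @ m, F @ P @ F.T + Q) for w, m, P in components]

def birth_from_detections(detections, w_birth=0.05):
    """Detection-driven birth: spawn a low-weight component at each detection
    (in practice, detections from the previous scan would seed the births)."""
    return [(w_birth, np.concatenate([z, np.zeros(2)]), np.diag([1.0, 1.0, 4.0, 4.0]))
            for z in detections]

def update(components, detections):
    """Standard GM-PHD measurement update: missed-detection terms plus one
    Kalman-updated term per (component, detection) pair."""
    updated = [((1 - P_D) * w, m, P) for w, m, P in components]   # missed detections
    for z in detections:
        cand = []
        for w, m, P in components:
            S = H @ P @ H.T + R
            K = P @ H.T @ np.linalg.inv(S)
            nu = z - H @ m
            lik = np.exp(-0.5 * nu @ np.linalg.solve(S, nu)) / \
                  np.sqrt((2 * np.pi) ** 2 * np.linalg.det(S))
            cand.append((P_D * w * lik, m + K @ nu, (np.eye(4) - K @ H) @ P))
        norm = CLUTTER + sum(c[0] for c in cand)
        updated += [(w / norm, m, P) for w, m, P in cand]
    return updated

# One cycle with two detections and an empty prior intensity; the sum of the
# component weights approximates the expected number of objects (cardinality).
detections = [np.array([10.0, 2.0]), np.array([30.0, -1.5])]
intensity = update(predict([]) + birth_from_detections(detections), detections)
print("estimated cardinality:", sum(w for w, _, _ in intensity))
```

A full filter would also prune low-weight components, merge nearby ones, and, as the abstract notes, the proposed method additionally maintains a Bayesian existence probability per extracted object rather than relying on the raw mixture weights alone.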
Published in: IEEE Transactions on Vehicular Technology ( Volume: 71, Issue: 7, July 2022)
Page(s): 7215 - 7229
Date of Publication: 28 April 2022
