
Towards Robust Vehicular Context Sensing


Abstract:

In-vehicle context sensing can detect many aspects of driver behavior and the environment, such as drivers changing lanes, potholes, road grade, and stop signs, and these features can be used to improve driver safety, comfort, and engine efficiency. In general, detecting these features can use onboard sensors on the vehicle (car sensors), sensors built into mobile devices (phone sensors) carried by one or more occupants, or both. Furthermore, traces of sensor readings from different cars, when crowd-sourced, can provide increased spatial coverage as well as disambiguation. In this paper, we explore, by designing novel detection algorithms for the four features listed above, three related questions: How is detection accuracy related to the choice of phone versus car sensors? To what extent, and in what ways, does crowd-sourcing contribute to detection accuracy? How is accuracy affected by phone position? We have collected hundreds of miles of vehicle traces with annotated ground truth, and demonstrated through evaluation that our detection algorithms can achieve high accuracy for each task (e.g., > 90% for lane change determinations) and that crowd-sensing plays an indispensable role in improving detection performance (e.g., improving recall by 35% for lane change determinations on curves). Our results can give car manufacturers insight into how to augment their internal sensing capabilities with phone sensors, and give mobile app developers insight into which car sensors to use to complement mobile device sensing capabilities.
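To make the lane-change detection task concrete, the following is a minimal illustrative sketch, not the paper's algorithm: it assumes a phone gyroscope aligned with the vehicle and looks for the characteristic steer-away/steer-back signature of a lane change, i.e., a yaw-rate pulse in one direction followed by a pulse in the opposite direction. The `threshold` and `window` values are arbitrary tuning assumptions.

```python
def moving_average(signal, window):
    """Smooth a 1-D signal with a simple centered moving average."""
    half = window // 2
    out = []
    for i in range(len(signal)):
        lo, hi = max(0, i - half), min(len(signal), i + half + 1)
        out.append(sum(signal[lo:hi]) / (hi - lo))
    return out

def detect_lane_change(yaw_rate, threshold=0.05, window=5):
    """Return True if the trace contains an opposite-sign pulse pair.

    yaw_rate: yaw-rate samples (rad/s), phone frame aligned with the car.
    threshold: minimum pulse magnitude (rad/s); an illustrative assumption.
    """
    smooth = moving_average(yaw_rate, window)
    # Collapse the trace into the sign sequence of supra-threshold pulses.
    pulses = []
    for v in smooth:
        if abs(v) >= threshold:
            sign = 1 if v > 0 else -1
            if not pulses or pulses[-1] != sign:
                pulses.append(sign)
    # A lane change shows a pulse followed by an opposite pulse;
    # a single sustained turn produces only one sign.
    return any(a == -b for a, b in zip(pulses, pulses[1:]))
```

In this toy form, a synthetic lane-change trace such as `[0.0]*5 + [0.1]*5 + [0.0]*3 + [-0.1]*5 + [0.0]*5` triggers a detection, while a one-sided turn trace does not; the paper's actual detectors, and the role of crowd-sourced traces in disambiguating curves from lane changes, are substantially more involved.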
Published in: IEEE Transactions on Vehicular Technology ( Volume: 67, Issue: 3, March 2018)
Page(s): 1909 - 1922
Date of Publication: 09 November 2017
