
DAnet: Depth-Aware Network for Crowd Counting



Abstract:

Image-based crowd counting is a challenging task due to the large scale variations caused by the varying distance between the camera and each person, especially in congested scenes. To handle this problem, previous methods focus on building complicated models and rely on labeling sophisticated density maps to learn the scale variation implicitly. Data pre-processing is therefore often time-consuming, and these deep models are difficult to train due to the lack of training data. In this paper, we instead propose a novel alternative for crowd counting that handles the scale variation problem by leveraging an auxiliary depth estimation dataset. Using separate crowd and depth datasets, we train a unified network for two tasks, crowd density map estimation and depth estimation, at the same time. By introducing the auxiliary depth estimation task, we show that the scale problem caused by distance can be effectively addressed and the labeling cost reduced. The efficacy of our method is demonstrated in extensive experiments under multiple evaluation criteria.
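The abstract gives no implementation details; as a loose sketch of the recipe it describes, the PyTorch code below trains a single shared encoder with two task-specific heads, alternating optimization steps between batches drawn from a crowd-counting dataset and a separate depth-estimation dataset. Every module, loss, and hyperparameter here is an illustrative assumption, not the authors' actual DAnet architecture.

import torch
import torch.nn as nn

class MultiTaskNet(nn.Module):
    """Shared encoder with two heads: crowd density and depth.

    A hypothetical stand-in for DAnet; the real backbone and heads
    are not described in this excerpt.
    """
    def __init__(self):
        super().__init__()
        # Shared feature extractor (small conv stack as a placeholder backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.ReLU(inplace=True),
            nn.MaxPool2d(2),
            nn.Conv2d(64, 128, 3, padding=1), nn.ReLU(inplace=True),
        )
        # Task-specific 1x1 heads, each predicting a one-channel map.
        self.density_head = nn.Conv2d(128, 1, 1)  # crowd density map
        self.depth_head = nn.Conv2d(128, 1, 1)    # depth map

    def forward(self, x, task):
        feat = self.encoder(x)
        return self.density_head(feat) if task == "density" else self.depth_head(feat)

model = MultiTaskNet()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-4)
criterion = nn.MSELoss()  # pixel-wise regression loss for both maps

def train_step(images, targets, task):
    # Targets are assumed to match the predicted map's resolution.
    optimizer.zero_grad()
    loss = criterion(model(images, task), targets)
    loss.backward()
    optimizer.step()
    return loss.item()

# Alternate one step per dataset each iteration, e.g.:
#   train_step(crowd_images, density_gt, "density")
#   train_step(depth_images, depth_gt, "depth")

Because the two datasets are disjoint, only the shared encoder receives gradients from both tasks; that shared representation is what would let depth supervision inform the density head's handling of scale.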
Date of Conference: 22-25 September 2019
Date Added to IEEE Xplore: 26 August 2019
Conference Location: Taipei, Taiwan

1. INTRODUCTION

In recent years, due to the rapid growth of urban populations, managing public safety has become essential. Accurate crowd counting and density distribution estimation thus play a crucial role in various scenarios, e.g., music festivals, sports events, and street parades. However, this task remains challenging, especially in the face of the scale variation problem. The root of this problem is illustrated in Fig. 1.
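To see why distance drives scale variation, consider the standard pinhole-camera relation (a textbook identity, not taken from this paper): the apparent image height of a person of physical height H at distance Z from a camera with focal length f is

h_{\text{img}} = \frac{f \cdot H}{Z}

so halving the distance doubles the apparent size. People in the foreground and background of the same congested scene can therefore differ in image size by an order of magnitude, which models with a fixed receptive field struggle to handle.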
