Abstract:
Indoor fall monitoring is challenging for community-dwelling older adults due to the need for high accuracy and privacy concerns. Doppler radar is promising, given its low cost and contactless sensing mechanism. However, the line-of-sight restriction limits the application of radar sensing in practice, as the Doppler signature varies when the sensing angle changes, and the signal strength degrades substantially at large aspect angles. Additionally, the similarity of the Doppler signatures among different fall types makes classification challenging. To address these problems, we first present an experimental study to obtain Doppler signals under large and arbitrary aspect angles for diverse types of simulated activities. We then develop a novel, explainable, multi-stream, feature-resonated neural network (eMSFRNet) that achieves fall detection and, in a pioneering study, classification of seven fall types. eMSFRNet is robust to radar sensing angles and subjects, and is the first method that can resonate and enhance feature information from noisy/weak Doppler signatures. Multiple feature extractors, based on ResNet, DenseNet, and VGGNet, extract diverse feature information with various spatial abstractions from a pair of Doppler signals. Resonated fusion then translates the multi-stream features into a single salient feature that is critical to fall detection and classification. eMSFRNet achieved 99.3% accuracy detecting falls and 76.8% accuracy classifying seven fall types. Our work is the first multistatic robust sensing system that overcomes the challenges associated with Doppler signatures under large and arbitrary aspect angles. Our work also demonstrates the potential to accommodate radar monitoring tasks that demand precise and robust sensing.
Published in: IEEE Journal of Biomedical and Health Informatics (Volume: 27, Issue: 4, April 2023)
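
To make the multi-stream idea concrete, below is a minimal PyTorch sketch of a three-stream network with ResNet, DenseNet, and VGG backbones whose features are fused into one representation. The abstract does not specify the resonated-fusion mechanism, so the gated element-wise fusion here is a hypothetical stand-in; the specific backbones (resnet18, densenet121, vgg11), the 224x224 three-channel spectrogram rendering, the averaging of the signal pair, and all layer sizes are illustrative assumptions, not the paper's implementation.

# Sketch of a multi-stream fusion network in the spirit of eMSFRNet.
# The gated fusion is a HYPOTHETICAL placeholder for the paper's
# unspecified resonated-fusion step.
import torch
import torch.nn as nn
from torchvision import models

class MultiStreamFusionNet(nn.Module):
    def __init__(self, num_classes: int = 7, feat_dim: int = 256):
        super().__init__()
        # Three backbone streams, as named in the abstract.
        resnet = models.resnet18(weights=None)
        resnet.fc = nn.Identity()            # yields 512-d features
        densenet = models.densenet121(weights=None)
        densenet.classifier = nn.Identity()  # yields 1024-d features
        vgg = models.vgg11(weights=None)
        vgg.classifier = nn.Identity()       # yields 512*7*7-d features
        self.streams = nn.ModuleList([resnet, densenet, vgg])
        # Project each stream to a common dimension.
        self.proj = nn.ModuleList([
            nn.Linear(512, feat_dim),
            nn.Linear(1024, feat_dim),
            nn.Linear(512 * 7 * 7, feat_dim),
        ])
        # Hypothetical fusion: learn per-stream gates and sum the gated
        # features into a single salient representation.
        self.gate = nn.Linear(3 * feat_dim, 3)
        self.head = nn.Linear(feat_dim, num_classes)

    def forward(self, spec_a: torch.Tensor, spec_b: torch.Tensor) -> torch.Tensor:
        # spec_a, spec_b: a pair of Doppler spectrograms rendered as
        # 3-channel images, each of shape (B, 3, 224, 224). The pair is
        # averaged before the backbones purely to keep the sketch short.
        x = 0.5 * (spec_a + spec_b)
        feats = [p(s(x).flatten(1)) for s, p in zip(self.streams, self.proj)]
        weights = torch.softmax(self.gate(torch.cat(feats, dim=1)), dim=1)
        fused = sum(w.unsqueeze(1) * f for w, f in zip(weights.unbind(1), feats))
        return self.head(fused)

if __name__ == "__main__":
    net = MultiStreamFusionNet()
    a = torch.randn(2, 3, 224, 224)
    b = torch.randn(2, 3, 224, 224)
    print(net(a, b).shape)  # torch.Size([2, 7])

For binary fall detection, the same skeleton would be used with num_classes set to 2; the reported 99.3% (detection) and 76.8% (seven-class) accuracies come from the authors' full eMSFRNet, not from this simplified sketch.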