Abstract:
Biometrics has been increasingly integrated into wearables for enhanced data security in recent years. Meanwhile, the popularity of wearables offers a unique chance to capture novel biometrics via embedded sensors. In this paper, we study new intracorporal biometrics combining the uniqueness of heart motion, bone conduction, and body asymmetry. Specifically, we introduce HeartPrint, a passive yet secure user authentication system exploiting the bone-conducted heart sounds captured by (widely available) dual in-ear microphones (IEMs). To eliminate interference, we devise a novel method combining modified non-negative matrix factorization and adaptive filtering, which extracts clean heart sounds while suppressing both body sounds and audio produced by the earphones. We further explore the uniqueness of IEM-recorded heart sounds in three aspects to extract a novel biometric representation, based on which HeartPrint leverages a convolutional neural model equipped with a continual learning method to achieve accurate authentication under drifting body conditions. Furthermore, user-friendly registration and energy-efficient authentication are facilitated by a data augmentation method using a transformer-based GAN and an authentication interval control method. Extensive experiments with 45 participants confirm that HeartPrint achieves a 1.6% FAR and 1.8% FRR, and effectively copes with major attacks, complicated interference, and hardware diversity, while exhibiting robustness in real-world environments.
Published in: IEEE Transactions on Dependable and Secure Computing (Early Access)