Abstract:
Deep hashing has proven to be efficient and effective for large-scale face retrieval. However, existing hashing methods are designed for normal face images only: they fail to consider that face images may be occluded by masks, hats, glasses, etc., and their retrieval performance degrades sharply on occluded faces. In this work, we propose knowledge distillation hashing (KDH) to handle occluded face images. KDH is a two-stage learning approach with teacher-student model distillation. We first train a teacher hashing network on normal face images, and then the knowledge from the teacher model guides the optimization of the student model, which takes only occluded face images as input. Through knowledge distillation, we build a connection between imperfect face information and the optimal hash codes. Experimental results show that KDH yields significant improvements and better retrieval performance than existing state-of-the-art deep hashing retrieval methods under six different face occlusion situations.
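The two-stage idea in the abstract can be illustrated with a minimal toy sketch (not the paper's actual networks or losses): a frozen "teacher" produces relaxed hash codes from normal face features, and a "student" is trained on occluded versions of the same features to match the teacher's codes via an L2 distillation loss. All dimensions, the linear "networks", and the zeroed-feature occlusion are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup (all names and dimensions are illustrative, not from the paper):
# 16-dim "face features", 8-bit hash codes, linear "hashing networks".
d, k, n = 16, 8, 200
W_teacher = rng.normal(size=(d, k))          # frozen teacher, trained on normal faces

normal = rng.normal(size=(n, d))             # normal face features
occluded = normal.copy()
occluded[:, :6] = 0.0                        # crude stand-in for occlusion: drop some features

# Stage 1 surrogate: the teacher produces target codes from normal images.
teacher_codes = np.tanh(normal @ W_teacher)  # relaxed (continuous) hash codes

# Stage 2: train the student on occluded inputs only, matching the
# teacher's codes with an L2 distillation loss via gradient descent.
W_student = rng.normal(size=(d, k)) * 0.1
lr = 0.05
for _ in range(500):
    out = np.tanh(occluded @ W_student)
    grad_out = 2.0 * (out - teacher_codes) * (1.0 - out ** 2)  # d(loss)/d(pre-activation)
    W_student -= lr * (occluded.T @ grad_out) / n

# Binarize and compare: distillation aligns most student bits with the teacher's,
# even though part of the input information was occluded away.
student_bits = np.sign(np.tanh(occluded @ W_student))
teacher_bits = np.sign(teacher_codes)
agreement = float((student_bits == teacher_bits).mean())
print(agreement)
```

In this sketch the agreement stays well above chance (0.5) but below 1.0, reflecting the abstract's point: the occluded input carries imperfect information, and distillation connects it to codes learned from clean data.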
Published in: IEEE Transactions on Multimedia (Volume: 25)