
Double Graph Regularized Double Dictionary Learning for Image Classification



Abstract:

In this paper, we present a novel double graph regularized double dictionary learning (DGRDDL) method for image classification. The proposed method jointly constructs a number of class-specific sub-dictionaries to capture the most discriminative features (class-specific information) of each class, and a class-shared dictionary to model the common patterns (class-shared information) shared by images from different classes. A novel double graph regularization is proposed to correctly represent and differentiate these two types of information. Specifically, an intra-class similarity graph constraint is imposed on the representation coefficients over the class-specific dictionaries, and an inter-class similarity graph constraint is imposed on the representation coefficients over the class-shared dictionary. In this way, the representations learned by the proposed DGRDDL method can correctly model the local similarity relationships of the class-specific and the class-shared information in images, respectively. Moreover, owing to the differences between the intra-class and inter-class similarity graphs, the two types of information can be appropriately separated and captured by the learned dictionaries. We evaluate the proposed method on six public datasets and compare it against seven benchmark methods. The experimental results demonstrate the effectiveness and superiority of the proposed method over the benchmark dictionary learning methods in image classification.
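The objective structure described in the abstract can be sketched numerically. The following is an illustrative simplification, not the authors' implementation: it evaluates a DGRDDL-style cost with one class-specific dictionary `D_s`, one class-shared dictionary `D_c`, and the two graph-smoothness penalties written as Laplacian trace terms. All variable names, the single-sub-dictionary simplification, and the weights `alpha`/`beta` are assumptions for illustration.

```python
import numpy as np

def graph_laplacian(W):
    """Unnormalized graph Laplacian L = diag(W.sum(1)) - W."""
    return np.diag(W.sum(axis=1)) - W

def dgrddl_objective(X, D_s, A_s, D_c, A_c, W_intra, W_inter,
                     alpha=0.1, beta=0.1):
    """Reconstruction error plus two graph-smoothness penalties.

    Since tr(A L A^T) = 0.5 * sum_ij W_ij * ||a_i - a_j||^2, each
    penalty pulls the coefficients of samples that are similar on its
    graph toward each other: the intra-class graph acts on the
    class-specific coefficients A_s, the inter-class graph on the
    class-shared coefficients A_c (a simplified reading of the paper's
    double graph regularization).
    """
    recon = np.linalg.norm(X - D_s @ A_s - D_c @ A_c, 'fro') ** 2
    L_intra = graph_laplacian(W_intra)   # intra-class similarity graph
    L_inter = graph_laplacian(W_inter)   # inter-class similarity graph
    smooth_s = np.trace(A_s @ L_intra @ A_s.T)  # class-specific term
    smooth_c = np.trace(A_c @ L_inter @ A_c.T)  # class-shared term
    return recon + alpha * smooth_s + beta * smooth_c

# Toy data: 8-dim features, 4 atoms per dictionary, 6 samples.
rng = np.random.default_rng(0)
d, k, n = 8, 4, 6
X = rng.standard_normal((d, n))
D_s = rng.standard_normal((d, k))
D_c = rng.standard_normal((d, k))
A_s = rng.standard_normal((k, n))
A_c = rng.standard_normal((k, n))

# Symmetric 0/1 similarity graphs with zero diagonal; the inter-class
# graph is taken as the complement of the intra-class one here.
U = np.triu((rng.random((n, n)) > 0.5).astype(float), 1)
W_intra = U + U.T
W_inter = 1.0 - W_intra - np.eye(n)

obj = dgrddl_objective(X, D_s, A_s, D_c, A_c, W_intra, W_inter)
print(obj)
```

Because each Laplacian is positive semidefinite, both smoothness terms are nonnegative, so the objective is bounded below by zero; an actual learning procedure would alternate updates of the dictionaries and coefficients to minimize it.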
Published in: IEEE Transactions on Image Processing ( Volume: 29)
Page(s): 7707 - 7721
Date of Publication: 29 June 2020

