Sparse representation-based robust face recognition by graph regularized low-rank sparse representation recovery

Keywords: Discriminative model, Robustness, Standard test image, Neural coding
DOI: 10.1016/j.neucom.2015.02.067 Publication Date: 2015-03-11T01:40:00Z
ABSTRACT
This paper proposes a graph regularized low-rank sparse representation recovery (GLRSRR) method for sparse representation-based robust face recognition, in which both the training and test samples may be corrupted by illumination variations, pose changes, and occlusions. On the one hand, GLRSRR imposes both lowest-rank and sparsest constraints on the representation matrix of the training samples, which makes the recovered clean training samples discriminative while preserving the global structure of the data. Simultaneously, GLRSRR explicitly encodes the local structure of the data and the discriminative information of different classes through a graph regularization term, which further improves the discriminative ability of the recovered clean training samples for sparse representation. As a result, a test sample is compactly represented by more clean training samples from the correct class. On the other hand, since the error matrix obtained by GLRSRR accurately and intuitively characterizes the corruption and occlusion of face images, it can be used as an occlusion dictionary for sparse representation, yielding more accurate representations of corrupted test samples. Experimental results on several benchmark face image databases demonstrate the effectiveness and robustness of GLRSRR.

HIGHLIGHTS
A graph regularized low-rank sparse representation recovery method is proposed.
The recovered clean training samples have stronger discriminative ability.
The obtained errors accurately depict the corruption and occlusion of face images.
A corrupted test sample is encoded by more training samples from the correct class.
Our method improves the performance of sparse representation-based classification.
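The core mechanism the abstract describes, namely separating training data into a clean low-rank component and a sparse error matrix, can be sketched with the two proximal operators that such formulations typically rely on: singular value thresholding for the nuclear (lowest-rank) norm and elementwise soft thresholding for the l1 (sparsest) norm. The sketch below is a robust-PCA-style simplification, not the full GLRSRR objective (the graph regularization term and the occlusion-dictionary step are omitted), and all function names and parameter values are illustrative assumptions.

```python
import numpy as np

def svt(M, tau):
    # Singular value thresholding: proximal operator of tau * nuclear norm.
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    return U @ np.diag(np.maximum(s - tau, 0.0)) @ Vt

def soft(M, tau):
    # Elementwise soft thresholding: proximal operator of tau * l1 norm.
    return np.sign(M) * np.maximum(np.abs(M) - tau, 0.0)

def low_rank_sparse_recover(X, lam=0.1, mu=1.0, n_iter=200):
    # Naive alternating decomposition X ~ A + E, with A low-rank (clean
    # samples) and E sparse (corruption/occlusion errors). Hypothetical
    # simplification of the recovery step; GLRSRR solves a richer model.
    A = np.zeros_like(X)
    E = np.zeros_like(X)
    for _ in range(n_iter):
        A = svt(X - E, 1.0 / mu)    # low-rank (clean) update
        E = soft(X - A, lam / mu)   # sparse-error update
    return A, E
```

In a sparse-representation pipeline, the recovered A would replace the corrupted training dictionary and E would serve as the occlusion dictionary, as the abstract outlines.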