Abstract
A relaxed two-dimensional principal component analysis (R2DPCA) approach is proposed for face recognition. Unlike 2DPCA, 2DPCA-L1, and G2DPCA, the R2DPCA utilizes the label information (if known) of the training samples to calculate a relaxation vector and assigns a weight to each subset of the training data. A new relaxed scatter matrix is defined, and the computed projection axes increase the accuracy of face recognition. The optimal Lp-norms are selected within a reasonable range. Numerical experiments on practical face databases indicate that the R2DPCA has high generalization ability and can achieve a higher recognition rate than state-of-the-art methods.
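The abstract can be read as the following pipeline: derive per-class relaxation weights from the labels, build a weighted (relaxed) image scatter matrix, and take its leading projection axes. The sketch below is a minimal illustration of that reading in Python; the per-class weighting rule (uniform normalization by class size) is an assumption for illustration only, and the paper's Lp-norm maximization is replaced here by an ordinary eigen-decomposition of the L2 scatter.

# Hypothetical sketch of an R2DPCA-style feature extractor, based on the abstract.
# The relaxation weights and the eigen-decomposition step are simplifying assumptions,
# not the formulas of the paper (which maximizes an Lp-norm criterion).
import numpy as np

def r2dpca_sketch(images, labels, k):
    """images: list of (h, w) arrays; labels: class labels; k: number of projection axes."""
    labels = np.asarray(labels)
    mean_img = np.mean(images, axis=0)
    w = mean_img.shape[1]
    scatter = np.zeros((w, w))
    for c in np.unique(labels):
        idx = np.where(labels == c)[0]
        weight = 1.0 / len(idx)               # assumed relaxation weight for class c
        for i in idx:
            d = images[i] - mean_img
            scatter += weight * (d.T @ d)     # relaxed image scatter matrix
    # Projection axes: leading eigenvectors of the relaxed scatter matrix.
    vals, vecs = np.linalg.eigh(scatter)
    return vecs[:, np.argsort(vals)[::-1][:k]]    # (w, k) projection matrix W

# Usage: the feature matrix of a test image X is X @ W, of size (h, k).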
References
Jolliffe, I.: Principal Component Analysis. Springer, New York (2004)
Turk, M., Pentland, A.: Eigenfaces for recognition. J. Cogn. Neurosci. 3(1), 71–86 (1991)
Sirovich, L., Kirby, M.: Low-dimensional procedure for characterization of human faces. J. Opt. Soc. Am. 4, 519–524 (1987)
Kirby, M., Sirovich, L.: Application of the Karhunen-Loeve procedure for the characterization of human faces. IEEE Trans. Pattern Anal. Mach. Intell. 12(1), 103–108 (1990)
Zhao, L., Yang, Y.: Theoretical analysis of illumination in PCA-based vision systems. Pattern Recogn. 32(4), 547–564 (1999)
Pentland, A.: Looking at people: sensing for ubiquitous and wearable computing. IEEE Trans. Pattern Anal. Mach. Intell. 22(1), 107–119 (2000)
Ke, Q., Kanade, T.: Robust L1 norm factorization in the presence of outliers and missing data by alternative convex programming. In: Proceedings IEEE Conference Computer Vision Pattern Recognition, vol. 1, pp. 739–746, San Diego, CA, USA (2005)
Ding, C., Zhou, D., He, X., Zha, H.: R1-PCA: rotational invariant L1-norm principal component analysis for robust subspace factorization. In: Proceedings 23rd International Conference Machine Learning, pp. 281–288, Pittsburgh, PA, USA (2006)
Kwak, N.: Principal component analysis based on L1-norm maximization. IEEE Trans. Pattern Anal. Mach. Intell. 30(9), 1672–1680 (2008)
Zou, H., Hastie, T., Tibshirani, R.: Sparse principal component analysis. J. Comput. Graph. Stat. 15(2), 265–286 (2006)
d’Aspremont, A., El Ghaoui, L., Jordan, M.I., Lanckriet, G.R.: A direct formulation for sparse PCA using semidefinite programming. SIAM Rev. 49(3), 434–448 (2007)
Shen, H., Huang, J.Z.: Sparse principal component analysis via regularized low rank matrix approximation. J. Multivar. Anal. 99(6), 1015–1034 (2008)
Witten, D.M., Tibshirani, R., Hastie, T.: A penalized matrix decomposition, with applications to sparse principal components and canonical correlation analysis. Biostatistics 10(3), 515–534 (2009)
Meng, D., Zhao, Q., Xu, Z.: Improve robustness of sparse PCA by L1-norm maximization. Pattern Recogn. 45(1), 487–497 (2012)
Kwak, N.: Principal component analysis by Lp-norm maximization. IEEE Trans. Cybern. 44(5), 594–609 (2014)
Liang, Z., Xia, S., Zhou, Y., Zhang, L., Li, Y.: Feature extraction based on Lp-norm generalized principal component analysis. Pattern Recogn. Lett. 34(9), 1037–1045 (2013)
Yang, J., Zhang, D., Frangi, A.F., Yang, J.Y.: Two-dimensional PCA: a new approach to appearance-based face representation and recognition. IEEE Trans. Pattern Anal. Mach. Intell. 26(1), 131–137 (2004)
Li, X., Pang, Y., Yuan, Y.: L1-norm-based 2DPCA. IEEE Trans. Syst. Man Cybern. B Cybern. 40(4), 1170–1175 (2010)
Wang, H., Wang, J.: 2DPCA with L1-norm for simultaneously robust and sparse modelling. Neural Netw. 46, 190–198 (2013)
Wang, J.: Generalized 2-D principal component analysis by Lp-Norm for image analysis. IEEE Trans. Cybern. 46(3), 792–803 (2016)
Jia, Z.-G., Ling, S.-T., Zhao, M.-X.: Color two-dimensional principal component analysis for face recognition based on quaternion model. In: Huang, D.-S., Bevilacqua, V., Premaratne, P., Gupta, P. (eds.) ICIC 2017. LNCS, vol. 10361, pp. 177–189. Springer, Cham (2017). https://doi.org/10.1007/978-3-319-63309-1_17
Zhao, M., Jia, Z., Gong, D.: Sample-relaxed two-dimensional color principal component analysis for face recognition and image reconstruction. arXiv preprint arXiv:1803.03837 (2018)
Jia, Z., Wei, M., Ling, S.: A new structure-preserving method for quaternion Hermitian eigenvalue problems. J. Comput. Appl. Math. 239, 12–24 (2013)
Ma, R., Jia, Z., Bai, Z.: A structure-preserving Jacobi algorithm for quaternion Hermitian eigenvalue problems. Comput. Math. Appl. 75(3), 809–820 (2018)
Jia, Z., Ma, R., Zhao, M.: A new structure-preserving method for recognition of color face images. Comput. Sci. Artif. Intell. 427–432 (2017)
Jia, Z., Wei, M., Zhao, M., Chen, Y.: A new real structure-preserving quaternion QR algorithm. J. Comput. Appl. Math. 343, 26–48 (2018)
Jia, Z., Cheng, X., Zhao, M.: A new method for roots of monic quaternionic quadratic polynomial. Comput. Math. Appl. 58(9), 1852–1858 (2009)
Jia, Z., Wang, Q., Wei, M.: Procrustes problems for (P, Q, η)-reflexive matrices. J. Comput. Appl. Math. 233(11), 3041–3045 (2010)
Zhao, M., Jia, Z.: Structured least-squares problems and inverse eigenvalue problems for (P, Q)-reflexive matrices. Appl. Math. Comput. 235, 87–93 (2014)
Jia, Z., Ng, M.K., Song, G.: Lanczos method for large-scale quaternion singular value decomposition. Numer. Algorithms (2018). https://doi.org/10.1007/s11075-018-0621-0
Jia, Z., Ng, M.K., Song, G.: Robust quaternion matrix completion with applications to image inpainting. Numer. Linear Algebra Appl. (2019). https://doi.org/10.1002/nla.2245. http://www.math.hkbu.edu.hk/~mng/quaternion.html
Jia, Z., Ng, M.K., Wang, W.: Color image restoration by saturation-value (SV) total variation. SIAM J. Imaging Sci. (2019). http://www.math.hkbu.edu.hk/~mng/publications.html
Mackey, L.: Deflation methods for sparse PCA. In: Proceedings Advances in Neural Information Processing Systems 21, pp. 1017–1024, Whistler, BC, Canada (2008)
Ye, J.: Characterization of a family of algorithms for generalized discriminant analysis on undersampled problems. J. Mach. Learn. Res. 6, 483–502 (2005)
Liang, Z.Z., Li, Y.F., Shi, P.F.: A note on two-dimensional linear discriminant analysis. Pattern Recogn. Lett. 29, 2122–2128 (2008)
Chang, Q., Jia, Z.: New fast algorithms for a modified TV-Stokes model. Sci. Sinica Math. 44(12), 1323–1336 (2014). (in Chinese)
Jia, Z., Wei, M.: A new TV-Stokes model for image deblurring and denoising with fast algorithms. J. Sci. Comput. 72(2), 522–541 (2017)
Acknowledgments
This paper is supported in part by the National Natural Science Foundation of China under Grant 11771188.
Copyright information
© 2019 Springer Nature Switzerland AG
About this paper
Cite this paper
Chen, X., Jia, Z., Cai, Y., Zhao, M. (2019). Relaxed 2-D Principal Component Analysis by Lp Norm for Face Recognition. In: Huang, D.-S., Bevilacqua, V., Premaratne, P. (eds.) Intelligent Computing Theories and Application. ICIC 2019. Lecture Notes in Computer Science, vol. 11643. Springer, Cham. https://doi.org/10.1007/978-3-030-26763-6_19
Publisher Name: Springer, Cham
Print ISBN: 978-3-030-26762-9
Online ISBN: 978-3-030-26763-6
eBook Packages: Computer Science, Computer Science (R0)