Abstract
We propose a new method for discriminant analysis, called High Dimensional Discriminant Analysis (HDDA). Our approach is based on the assumption that high-dimensional data live in class-specific subspaces of low dimensionality. We therefore propose a new parameterization of the Gaussian model for classifying high-dimensional data. This parameterization takes into account the specific subspace and the intrinsic dimension of each class in order to limit the number of parameters to estimate. HDDA is applied to the recognition of object parts in real images, and its performance is compared with that of classical methods.
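To make the idea concrete, the following is a minimal NumPy sketch of one possible class-specific subspace Gaussian classifier in the spirit of the abstract; it is not necessarily the paper's exact parameterization. Each class covariance is modelled with its own principal subspace and intrinsic dimension, while directions orthogonal to that subspace share a single pooled noise variance. The explained-variance rule used to pick the intrinsic dimension and the class name SubspaceGaussianClassifier are illustrative assumptions, not elements of the paper.

# Sketch of a class-specific subspace Gaussian classifier (assumed
# simplification): each class gets its own low-dimensional subspace and
# intrinsic dimension; remaining directions share one noise variance.
import numpy as np


class SubspaceGaussianClassifier:
    def __init__(self, var_threshold=0.95):
        # var_threshold: fraction of within-class variance used to choose
        # the intrinsic dimension of each class (illustrative rule only).
        self.var_threshold = var_threshold
        self.classes_ = None
        self.params_ = {}

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        n_features = X.shape[1]
        for c in self.classes_:
            Xc = X[y == c]
            mu = Xc.mean(axis=0)
            # Eigendecomposition of the empirical class covariance.
            eigval, eigvec = np.linalg.eigh(np.cov(Xc, rowvar=False))
            order = np.argsort(eigval)[::-1]
            eigval, eigvec = eigval[order], eigvec[:, order]
            # Intrinsic dimension: smallest d explaining var_threshold of
            # the class variance (a scree test or BIC could be used instead).
            ratio = np.cumsum(eigval) / eigval.sum()
            d = min(int(np.searchsorted(ratio, self.var_threshold)) + 1,
                    n_features - 1)
            # Signal variances inside the class subspace, one pooled
            # noise variance outside it.
            self.params_[c] = dict(mu=mu, Q=eigvec[:, :d], a=eigval[:d],
                                   b=eigval[d:].mean(), d=d,
                                   prior=len(Xc) / len(X))
        return self

    def _score(self, X, c):
        p = self.params_[c]
        n_features = X.shape[1]
        diff = X - p["mu"]
        proj = diff @ p["Q"]  # coordinates in the class subspace
        # Mahalanobis distance split into subspace and complement parts.
        maha_in = np.sum(proj ** 2 / p["a"], axis=1)
        maha_out = (np.sum(diff ** 2, axis=1) - np.sum(proj ** 2, axis=1)) / p["b"]
        logdet = np.sum(np.log(p["a"])) + (n_features - p["d"]) * np.log(p["b"])
        return -0.5 * (maha_in + maha_out + logdet) + np.log(p["prior"])

    def predict(self, X):
        scores = np.column_stack([self._score(X, c) for c in self.classes_])
        return self.classes_[np.argmax(scores, axis=1)]

Given training data X, y and test points X_new, SubspaceGaussianClassifier().fit(X, y).predict(X_new) assigns each point to the class with the highest posterior score under this model; the point of such a parameterization is that only a few variances, a small subspace basis and one noise term are estimated per class instead of a full covariance matrix.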
Copyright information
© 2006 Springer-Verlag Berlin Heidelberg
About this paper
Cite this paper
Bouveyron, C., Girard, S., Schmid, C. (2006). Class-Specific Subspace Discriminant Analysis for High-Dimensional Data. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_9
DOI: https://doi.org/10.1007/11752790_9
Publisher Name: Springer, Berlin, Heidelberg
Print ISBN: 978-3-540-34137-6
Online ISBN: 978-3-540-34138-3
eBook Packages: Computer Science (R0)