Class-Specific Subspace Discriminant Analysis for High-Dimensional Data | SpringerLink

Class-Specific Subspace Discriminant Analysis for High-Dimensional Data

  • Conference paper
Subspace, Latent Structure and Feature Selection (SLSFS 2005)

Abstract

We propose a new method for discriminant analysis, called High Dimensional Discriminant Analysis (HDDA). Our approach is based on the assumption that high-dimensional data live in different subspaces of low dimensionality. We therefore propose a new parameterization of the Gaussian model to classify high-dimensional data. This parameterization takes into account the specific subspace and the intrinsic dimension of each class, thereby limiting the number of parameters to estimate. HDDA is applied to recognizing object parts in real images, and its performance is compared to that of classical methods.
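The modeling idea in the abstract — each class as a Gaussian whose variance is concentrated in a class-specific low-dimensional subspace, with a single pooled noise variance for the remaining directions — can be illustrated with a minimal sketch. This is not the authors' implementation: the class name, the per-class PCA, and the fixed intrinsic dimension `d` (which HDDA would instead estimate, e.g. via a scree test) are all assumptions made for illustration.

```python
import numpy as np

class SubspaceGaussianClassifier:
    """Toy sketch of a class-specific subspace Gaussian model: each class k
    gets its own mean, an orthonormal basis Q_k of its d-dimensional subspace,
    per-direction signal variances a_k, and one pooled noise variance b_k
    for the discarded directions (an assumed simplification, not HDDA itself)."""

    def __init__(self, d=2):
        self.d = d  # intrinsic dimension, fixed here for simplicity

    def fit(self, X, y):
        self.classes_ = np.unique(y)
        self.params_ = {}
        for k in self.classes_:
            Xk = X[y == k]
            mu = Xk.mean(axis=0)
            evals, evecs = np.linalg.eigh(np.cov(Xk, rowvar=False))
            order = np.argsort(evals)[::-1]           # descending eigenvalues
            evals, evecs = evals[order], evecs[:, order]
            a = evals[:self.d]                        # subspace variances
            b = max(evals[self.d:].mean(), 1e-9)      # pooled noise variance
            prior = len(Xk) / len(X)
            self.params_[k] = (mu, evecs[:, :self.d], a, b, prior)
        return self

    def _log_score(self, X, k):
        mu, Q, a, b, prior = self.params_[k]
        Xc = X - mu
        proj = Xc @ Q                     # coordinates inside the class subspace
        resid = Xc - proj @ Q.T           # residual orthogonal to the subspace
        p = X.shape[1]
        quad = (proj**2 / a).sum(axis=1) + (resid**2).sum(axis=1) / b
        logdet = np.log(a).sum() + (p - self.d) * np.log(b)
        return -0.5 * (quad + logdet) + np.log(prior)

    def predict(self, X):
        scores = np.stack([self._log_score(X, k) for k in self.classes_], axis=1)
        return self.classes_[np.argmax(scores, axis=1)]
```

The parameter saving is the point of the abstract: a full Gaussian needs O(p²) covariance parameters per class, whereas this sketch needs only d eigenvectors, d signal variances, and one noise variance per class.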




Copyright information

© 2006 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Bouveyron, C., Girard, S., Schmid, C. (2006). Class-Specific Subspace Discriminant Analysis for High-Dimensional Data. In: Saunders, C., Grobelnik, M., Gunn, S., Shawe-Taylor, J. (eds) Subspace, Latent Structure and Feature Selection. SLSFS 2005. Lecture Notes in Computer Science, vol 3940. Springer, Berlin, Heidelberg. https://doi.org/10.1007/11752790_9


  • DOI: https://doi.org/10.1007/11752790_9

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-540-34137-6

  • Online ISBN: 978-3-540-34138-3

