
Adaptive Neighborhood Select Based on Local Linearity for Nonlinear Dimensionality Reduction

  • Conference paper
Advances in Computation and Intelligence (ISICA 2009)

Part of the book series: Lecture Notes in Computer Science ((LNTCS,volume 5821))


Abstract

Neighborhood selection plays an important role in manifold learning algorithms. This paper proposes an Adaptive Neighborhood Selection algorithm based on Local Linearity (ANSLL). Given the manifold dimensionality d as prior knowledge, ANSLL constructs neighborhoods according to two principles: (1) data points in the same neighborhood should approximately lie on a d-dimensional linear subspace, and (2) each neighborhood should be as large as possible. ANSLL uses PCA to measure the linearity of a finite set of data points. Moreover, we present an improved method of constructing the neighborhood graph, which improves the accuracy of geodesic distance estimation for isometric embedding. Experiments on both synthetic and real data sets show that ANSLL adaptively constructs neighborhoods according to the local curvature of the data manifold and thereby improves the performance of most manifold learning algorithms, such as ISOMAP and LLE.
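The two principles stated in the abstract suggest a simple greedy procedure: grow each point's neighborhood in order of distance for as long as PCA confirms the points still lie near a d-dimensional linear subspace. The sketch below is an illustrative reconstruction under that reading, not the paper's exact method; the variance-ratio `threshold`, the minimum size `k_min`, and the function names are assumptions.

```python
import numpy as np

def local_linearity(points, d):
    """Fraction of total variance captured by the top-d principal components.

    Values near 1 mean the points lie close to a d-dimensional linear
    subspace, which is the kind of PCA-based linearity test the abstract
    describes.
    """
    X = points - points.mean(axis=0)
    # Singular values of the centered data give the PCA variances.
    s = np.linalg.svd(X, compute_uv=False)
    var = s ** 2
    return var[:d].sum() / var.sum()

def adaptive_neighborhood(data, i, d, k_min=4, threshold=0.95):
    """Grow point i's neighborhood while it stays approximately d-linear."""
    # Candidate neighbors sorted by Euclidean distance to point i.
    order = np.argsort(np.linalg.norm(data - data[i], axis=1))
    k = k_min
    while k < len(data):
        # Stop growing as soon as the candidate set bends away from
        # a d-dimensional subspace.
        if local_linearity(data[order[: k + 1]], d) < threshold:
            break
        k += 1
    return order[1:k]  # nearest neighbors, excluding i itself
```

On a flat region of the manifold the linearity ratio stays near 1 and the neighborhood keeps growing; in a high-curvature region the ratio drops quickly and the neighborhood stays small, matching the adaptive behavior the abstract claims.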


References

  1. Jolliffe, I.T.: Principal Component Analysis. Springer, Heidelberg (1989)

  2. Cox, T., Cox, M.: Multidimensional Scaling. Chapman and Hall, Boca Raton (1994)

  3. Tenenbaum, J.B., Silva, V.d., Langford, J.C.: A Global Geometric Framework for Nonlinear Dimensionality Reduction. Science 290, 2319–2323 (2000)

  4. Saul, L.K., Roweis, S.T.: Think globally, fit locally: unsupervised learning of low dimensional manifolds. Journal of Machine Learning Research 4, 119–155 (2003)

  5. Roweis, S.T., Saul, L.K.: Nonlinear Dimensionality Reduction by Locally Linear Embedding. Science 290, 2323–2326 (2000)

  6. Belkin, M., Niyogi, P.: Laplacian eigenmaps for dimensionality reduction and data representation. Neural Comput. 15, 1373–1396 (2003)

  7. Zhang, Z., Zha, H.: Principal Manifolds and Nonlinear Dimension Reduction via Local Tangent Space Alignment. SIAM J. Scientific Computing 26, 313–338 (2005)

  8. Balasubramanian, M., Schwartz, E.L., Tenenbaum, J.B., Silva, V.d., Langford, J.C.: The Isomap Algorithm and Topological Stability. Science 295 (2002)

  9. Yang, L.: Building k-edge-connected neighborhood graph for distance-based data projection. Pattern Recognit. Lett. 26, 2015–2021 (2005)

  10. Yang, L.: Building k-connected neighborhood graphs for isometric data embedding. IEEE Transactions on Pattern Analysis and Machine Intelligence 28, 827–831 (2006)

  11. Yang, L.: Building connected neighborhood graphs for isometric data embedding. In: Proceedings of the eleventh ACM SIGKDD international conference on Knowledge discovery in data mining. ACM, New York (2005)

  12. Yang, L.: Building Connected Neighborhood Graphs for Locally Linear Embedding. In: ICPR 2006: 18th International Conference on Pattern Recognition, vol. 4, pp. 194–197 (2006)

  13. Samko, O., Marshall, A.D., Rosin, P.L.: Selection of the optimal parameter value for the Isomap algorithm. Pattern Recognit. Lett. 27, 968–979 (2006)

  14. Xia, T., Li, J., Zhang, Y., Tang, S.: A More Topologically Stable Locally Linear Embedding Algorithm Based on R*-Tree. Advances in Knowledge Discovery and Data Mining, 803–812 (2008)

  15. Shao, C., Huang, H., Zhao, L.: A More Topologically Stable ISOMAP Algorithm. Journal of Software 18, 869–877 (2007)

  16. Shao, C., Huang, H., Wan, C.: Selection of the Suitable Neighborhood Size for the ISOMAP Algorithm. In: IJCNN 2007. International Joint Conference on Neural Networks, pp. 300–305 (2007)

  17. Wen, G., Jiang, L., Wen, J.: Using locally estimated geodesic distance to optimize neighborhood graph for isometric data embedding. Pattern Recognit. 41, 2226–2236 (2008)

  18. Lin, T., Zha, H.: Riemannian Manifold Learning. IEEE Trans. Pattern Anal. Mach. Intell. 30, 796–809 (2008)

  19. Yan, S., Tang, X.: Largest-eigenvalue-theory for incremental principal component analysis. In: IEEE International Conference on Image Processing, 2005, vol. 1 (2005)

  20. Zhong, C., Miao, D.: A comment on "Using locally estimated geodesic distance to optimize neighborhood graph for isometric data embedding". Pattern Recognit. 42, 1012–1013 (2009)

  21. Wen, G., Jiang, L., Wen, J.: Authors' response to "A comment on 'Using locally estimated geodesic distance to optimize neighborhood graph for isometric data embedding'". Pattern Recognit. 42, 1014 (2009)



Copyright information

© 2009 Springer-Verlag Berlin Heidelberg

About this paper

Cite this paper

Zhan, Y., Yin, J., Liu, X., Zhang, G. (2009). Adaptive Neighborhood Select Based on Local Linearity for Nonlinear Dimensionality Reduction. In: Cai, Z., Li, Z., Kang, Z., Liu, Y. (eds) Advances in Computation and Intelligence. ISICA 2009. Lecture Notes in Computer Science, vol 5821. Springer, Berlin, Heidelberg. https://doi.org/10.1007/978-3-642-04843-2_36

  • DOI: https://doi.org/10.1007/978-3-642-04843-2_36

  • Publisher Name: Springer, Berlin, Heidelberg

  • Print ISBN: 978-3-642-04842-5

  • Online ISBN: 978-3-642-04843-2
