
Towards kernelizing the classifier for hyperbolic data

  • Research Article
  • Published in: Frontiers of Computer Science

Abstract

Data hierarchy, as a hidden property of data structure, exists in a wide range of machine learning applications. A common practice for classifying such hierarchical data is first to encode the data in Euclidean space and then train a Euclidean classifier. However, this paradigm suffers a performance drop because embedding hierarchical data in Euclidean space distorts it. To alleviate this issue, hyperbolic geometry has been investigated as an alternative space for encoding hierarchical data, owing to its superior ability to capture hierarchical structures. Nevertheless, existing hyperbolic methods cannot exploit the full potential of hyperbolic geometry, since they define the hyperbolic operations in the tangent plane, which again distorts the data embeddings. In this paper, we develop two novel kernel formulations in hyperbolic space, one positive definite (PD) and the other indefinite, to solve classification tasks in hyperbolic space. The PD kernel is defined by mapping the hyperbolic data into the Drury-Arveson (DA) space, a special reproducing kernel Hilbert space (RKHS). To further increase the discriminative power of the classifier, an indefinite kernel is then defined in Kreĭn spaces. Specifically, we design a 2-layer nested indefinite kernel that first maps hyperbolic data into the DA spaces and then maps the DA spaces into the Kreĭn spaces. Extensive experiments on real-world datasets demonstrate the superiority of the proposed kernels.
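The overall pipeline described above can be illustrated with a short, hedged example. The sketch below is not the authors' implementation: it assumes the PD kernel takes the classical Drury-Arveson form k(x, y) = 1 / (1 − ⟨x, y⟩) on points of the open unit (Poincaré) ball, and plugs the resulting Gram matrix into a standard SVM via scikit-learn's precomputed-kernel interface; the function names and toy data are hypothetical.

    # Hedged sketch: a Drury-Arveson-style PD kernel on Poincare-ball points,
    # used with an SVM through a precomputed Gram matrix. The paper's exact
    # kernel may differ; this only illustrates the classification pipeline.
    import numpy as np
    from sklearn.svm import SVC

    def da_gram(X, Y):
        # k(x, y) = 1 / (1 - <x, y>); finite because all points lie in the open unit ball.
        return 1.0 / (1.0 - X @ Y.T)

    # Toy embeddings inside the Poincare ball (every row has Euclidean norm < 1).
    rng = np.random.default_rng(0)
    X_train = rng.uniform(-0.4, 0.4, size=(100, 2))
    y_train = (X_train.sum(axis=1) > 0).astype(int)
    X_test = rng.uniform(-0.4, 0.4, size=(20, 2))

    clf = SVC(kernel="precomputed", C=1.0)
    clf.fit(da_gram(X_train, X_train), y_train)     # n_train x n_train Gram matrix
    y_pred = clf.predict(da_gram(X_test, X_train))  # n_test x n_train Gram matrix

For the indefinite, Kreĭn-space variant, a common practical route (again, not necessarily the one used in the paper) is to eigendecompose the training Gram matrix and flip or clip its negative eigenvalues before handing it to a standard SVM solver.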



Acknowledgements

This work was supported by the National Natural Science Foundation of China (Grant No. 62076062) and the Fundamental Research Funds for the Central Universities (2242021k30056). It was also supported by the Collaborative Innovation Center of Wireless Communications Technology.

Author information


Corresponding author

Correspondence to Hui Xue.

Additional information

Meimei Yang is currently pursuing the PhD degree with the School of Computer Science and Engineering, Southeast University, China. She received the BS degree from Nanjing University of Information Science & Technology, China in 2014. Her research interests include pattern recognition and machine learning.

Qiao Liu received the MSc degree in software engineering from Southeast University, China in 2022. He received the BS degree in computer science from Central South University, China in 2019. His current research interests include machine learning and pattern recognition.

Xinkai Sun received the BSc degree in Computer Science from Southeast University, China. In 2021, he received the MSc degree in software engineering from Southeast University, China. His research interests include pattern recognition and machine learning.

Na Shi is currently working at State Grid Zaozhuang Power Supply Company, China. She received the BS degree in Computer Science & Technology from Hohai University, China in 2017. In 2020, she received the MSc degree in software engineering from Southeast University, China. Her research interests include pattern recognition and machine learning.

Hui Xue received the BSc degree in Mathematics from Nanjing Normal University, China in 2002. In 2005, she received the MSc degree in Mathematics from Nanjing University of Aeronautics & Astronautics (NUAA), China, and she received the PhD degree in Computer Application Technology from NUAA in 2008. Since 2009, she has been a Professor with the School of Computer Science and Engineering, Southeast University, China. Her research interests include pattern recognition and machine learning.



About this article


Cite this article

Yang, M., Liu, Q., Sun, X. et al. Towards kernelizing the classifier for hyperbolic data. Front. Comput. Sci. 18, 181301 (2024). https://doi.org/10.1007/s11704-022-2457-y


  • Received:

  • Accepted:

  • Published:

  • DOI: https://doi.org/10.1007/s11704-022-2457-y
