Nearly Tight Spectral Sparsification of Directed Hypergraphs

Authors: Kazusato Oko, Shinsaku Sakaue, and Shin-ichi Tanigawa



File
  • LIPIcs.ICALP.2023.94.pdf (0.78 MB, 19 pages)

Author Details

Kazusato Oko
  • Department of Mathematical Informatics, The University of Tokyo, Japan
  • Center for Advanced Intelligence Project, RIKEN, Tokyo, Japan
Shinsaku Sakaue
  • Department of Mathematical Informatics, The University of Tokyo, Japan
Shin-ichi Tanigawa
  • Department of Mathematical Informatics, The University of Tokyo, Japan

Cite As

Kazusato Oko, Shinsaku Sakaue, and Shin-ichi Tanigawa. Nearly Tight Spectral Sparsification of Directed Hypergraphs. In 50th International Colloquium on Automata, Languages, and Programming (ICALP 2023). Leibniz International Proceedings in Informatics (LIPIcs), Volume 261, pp. 94:1-94:19, Schloss Dagstuhl – Leibniz-Zentrum für Informatik (2023) https://doi.org/10.4230/LIPIcs.ICALP.2023.94

Abstract

Spectral hypergraph sparsification, an attempt to extend well-known spectral graph sparsification to hypergraphs, has been extensively studied over the past few years. For undirected hypergraphs, Kapralov, Krauthgamer, Tardos, and Yoshida (2022) have proved the existence of an ε-spectral sparsifier of the optimal O^*(n) size, where n is the number of vertices and O^* suppresses the ε^{-1} and log n factors. For directed hypergraphs, however, the optimal sparsifier size was not known. Our main contribution is the first algorithm that constructs an O^*(n²)-size ε-spectral sparsifier for a weighted directed hypergraph. Our result is optimal up to the ε^{-1} and log n factors since there is a lower bound of Ω(n²) even for directed graphs. We also show the first non-trivial lower bound of Ω(n²/ε) for general directed hypergraphs. The basic idea of our algorithm is borrowed from the spanner-based sparsification for ordinary graphs by Koutis and Xu (2016). Their iterative sampling approach is indeed useful for designing sparsification algorithms in various circumstances. To demonstrate this, we also present a similar iterative sampling algorithm for undirected hypergraphs that attains one of the best size bounds, enjoys parallel implementation, and can be transformed to be fault-tolerant.
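To make the abstract's objects concrete, here is an illustrative Python sketch (not the paper's algorithm) of the directed hypergraph quadratic form Q_H(x) = Σ_e w_e · max_{(u,v) ∈ tail(e)×head(e)} max(x_u − x_v, 0)², an empirical check of the ε-spectral-sparsifier condition, and a bare-bones Koutis–Xu-style iterative sampling loop. All function names are hypothetical, and the `skeleton_fn` placeholder stands in for the spanner construction the real algorithm would use.

```python
import numpy as np

# A directed hyperedge is a triple (tail, head, weight). Its energy at x is
#   w * max over (u in tail, v in head) of max(x[u] - x[v], 0)^2,
# following the directed hypergraph Laplacian of Chan et al. (2019).
def energy(x, edges):
    total = 0.0
    for tail, head, w in edges:
        best = 0.0
        for u in tail:
            for v in head:
                best = max(best, max(x[u] - x[v], 0.0) ** 2)
        total += w * best
    return total

def is_eps_spectral_sparsifier(edges, sparse_edges, n, eps, trials=200, seed=0):
    # Empirically test (1 - eps) Q_H(x) <= Q_H'(x) <= (1 + eps) Q_H(x)
    # on random vectors. A true sparsifier must satisfy this for ALL x in R^n,
    # so passing random trials is evidence, not a certificate.
    rng = np.random.default_rng(seed)
    for _ in range(trials):
        x = rng.standard_normal(n)
        q, qs = energy(x, edges), energy(x, sparse_edges)
        if not ((1 - eps) * q <= qs <= (1 + eps) * q):
            return False
    return True

def iterative_sample(edges, rounds, skeleton_fn, rng):
    # One Koutis-Xu-style halving pass per round (sketch): keep a "skeleton"
    # of the current hypergraph intact, sample each remaining hyperedge with
    # probability 1/4, and quadruple the weight of the survivors so the
    # energy stays unbiased in expectation.
    for _ in range(rounds):
        keep = skeleton_fn(edges)
        rest = [e for e in edges if e not in keep]
        sampled = [(t, h, 4 * w) for (t, h, w) in rest if rng.random() < 0.25]
        edges = keep + sampled
    return edges
```

With hyperedges of singleton tails and heads, `energy` reduces to the usual directed-graph quadratic form, and iterating `iterative_sample` shrinks the edge set geometrically while the skeleton preserves connectivity-type structure.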

Subject Classification

ACM Subject Classification
  • Theory of computation → Sparsification and spanners
Keywords
  • Spectral sparsification
  • (Directed) hypergraphs
  • Iterative sampling

References

  1. Noga Alon and Joel H. Spencer. The Probabilistic Method. John Wiley & Sons, 2016. Google Scholar
  2. Alexandr Andoni, Jiecao Chen, Robert Krauthgamer, Bo Qin, David P. Woodruff, and Qin Zhang. On sketching quadratic forms. In Proceedings of the 2016 ACM Conference on Innovations in Theoretical Computer Science, pages 311-319, 2016. Google Scholar
  3. Nikhil Bansal, Ola Svensson, and Luca Trevisan. New notions and constructions of sparsification for graphs and hypergraphs. In Proceedings of the 2019 IEEE 60th Annual Symposium on Foundations of Computer Science, pages 910-928. IEEE, 2019. Google Scholar
  4. András A. Benczúr and David R. Karger. Approximating s-t minimum cuts in Õ(n²) time. In Proceedings of the 28th Annual ACM Symposium on Theory of Computing, pages 47-55. ACM, 1996. Google Scholar
  5. Manuel Blum, Robert W. Floyd, Vaughan R. Pratt, Ronald L. Rivest, and Robert Endre Tarjan. Time bounds for selection. Journal of Computer and System Sciences, 7(4):448-461, 1973. Google Scholar
  6. Charles Carlson, Alexandra Kolla, Nikhil Srivastava, and Luca Trevisan. Optimal lower bounds for sketching graph cuts. In Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms, pages 2565-2569. SIAM, 2019. Google Scholar
  7. T.-H. Hubert Chan, Zhihao Gavin Tang, Xiaowei Wu, and Chenzi Zhang. Diffusion operator and spectral analysis for directed hypergraph Laplacian. Theoretical Computer Science, 784:46-64, 2019. Google Scholar
  8. Yu Chen, Sanjeev Khanna, and Ansh Nagda. Near-linear size hypergraph cut sparsifiers. In Proceedings of the 2020 IEEE 61st Annual Symposium on Foundations of Computer Science, pages 61-72. IEEE, 2020. Google Scholar
  9. Michael B. Cohen, Jonathan Kelner, John Peebles, Richard Peng, Anup B. Rao, Aaron Sidford, and Adrian Vladu. Almost-linear-time algorithms for Markov chains and new spectral primitives for directed graphs. In Proceedings of the 49th Annual ACM SIGACT Symposium on Theory of Computing, pages 410-419. ACM, 2017. Google Scholar
  10. Satoru Fujishige and Sachin B. Patkar. Realization of set functions as cut functions of graphs and hypergraphs. Discrete Mathematics, 226(1-3):199-210, 2001. Google Scholar
  11. Giorgio Gallo, Giustino Longo, Stefano Pallottino, and Sang Nguyen. Directed hypergraphs and applications. Discrete Applied Mathematics, 42(2-3):177-201, 1993. Google Scholar
  12. Matthias Hein, Simon Setzer, Leonardo Jost, and Syama Sundar Rangapuram. The total variation on hypergraphs - learning on hypergraphs revisited. In Advances in Neural Information Processing Systems, volume 26. Curran Associates, Inc., 2013. Google Scholar
  13. Arun Jambulapati, Yang P. Liu, and Aaron Sidford. Chaining, group leverage score overestimates, and fast spectral hypergraph sparsification. arXiv:2209.10539, 2022. Google Scholar
  14. Mohammad Ali Javidian, Zhiyu Wang, Linyuan Lu, and Marco Valtorta. On a hypergraph probabilistic graphical model. Annals of Mathematics and Artificial Intelligence, 88(9):1003-1033, 2020. Google Scholar
  15. Michael Kapralov, Robert Krauthgamer, Jakab Tardos, and Yuichi Yoshida. Towards tight bounds for spectral sparsification of hypergraphs. In Proceedings of the 53rd Annual ACM SIGACT Symposium on Theory of Computing, pages 598-611. ACM, 2021. Google Scholar
  16. Michael Kapralov, Robert Krauthgamer, Jakab Tardos, and Yuichi Yoshida. Spectral hypergraph sparsifiers of nearly linear size. In Proceedings of the 2021 IEEE 62nd Annual Symposium on Foundations of Computer Science, pages 1159-1170, 2022. Google Scholar
  17. Ioannis Koutis and Shen Chen Xu. Simple parallel and distributed algorithms for spectral graph sparsification. ACM Transactions on Parallel Computing, 3(2):1-14, 2016. Google Scholar
  18. James R. Lee. Spectral hypergraph sparsification via chaining. arXiv:2209.04539, 2022. Google Scholar
  19. Pan Li, Niao He, and Olgica Milenkovic. Quadratic decomposable submodular function minimization: Theory and practice. Journal of Machine Learning Research, 21(106):1-49, 2020. Google Scholar
  20. Kazusato Oko, Shinsaku Sakaue, and Shin-ichi Tanigawa. Nearly tight spectral sparsification of directed hypergraphs by a simple iterative sampling algorithm. arXiv:2204.02537, 2022. Google Scholar
  21. Akbar Rafiey and Yuichi Yoshida. Sparsification of decomposable submodular functions. In Proceedings of the 36th AAAI Conference on Artificial Intelligence, volume 36, pages 10336-10344, 2022. Google Scholar
  22. Tasuku Soma and Yuichi Yoshida. Spectral sparsification of hypergraphs. In Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms, pages 2570-2581. SIAM, 2019. Google Scholar
  23. Daniel A. Spielman. Spectral and Algebraic Graph Theory, 2019. (visited on 02/03/2022). URL: http://cs-www.cs.yale.edu/homes/spielman/sagt/sagt.pdf.
  24. Daniel A. Spielman and Shang-Hua Teng. Spectral sparsification of graphs. SIAM Journal on Computing, 40(4):981-1025, 2011. Google Scholar
  25. Yuuki Takai, Atsushi Miyauchi, Masahiro Ikeda, and Yuichi Yoshida. Hypergraph clustering based on PageRank. In Proceedings of the 26th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, pages 1970-1978. ACM, 2020. Google Scholar
  26. Shang-Hua Teng. Scalable algorithms for data and network analysis. Foundations and Trends in Theoretical Computer Science, 12(1-2):1-274, 2016. URL: https://doi.org/10.1561/0400000051.
  27. Nisheeth K. Vishnoi. Lx = b. Foundations and Trends in Theoretical Computer Science, 8(1-2):1-141, 2013. URL: https://doi.org/10.1561/0400000054.
  28. Naganand Yadati, Madhav Nimishakavi, Prateek Yadav, Vikram Nitin, Anand Louis, and Partha Talukdar. HyperGCN: A new method of training graph convolutional networks on hypergraphs. In Advances in Neural Information Processing Systems, volume 32. Curran Associates, Inc., 2019. Google Scholar
  29. Naganand Yadati, Vikram Nitin, Madhav Nimishakavi, Prateek Yadav, Anand Louis, and Partha Talukdar. NHP: Neural hypergraph link prediction. In Proceedings of the 29th ACM International Conference on Information & Knowledge Management, pages 1705-1714. ACM, 2020. Google Scholar
  30. Yuichi Yoshida. Cheeger inequalities for submodular transformations. In Proceedings of the 30th Annual ACM-SIAM Symposium on Discrete Algorithms, pages 2582-2601. SIAM, 2019. Google Scholar
  31. Chenzi Zhang, Shuguang Hu, Zhihao Gavin Tang, and T.-H. Hubert Chan. Re-revisiting learning on hypergraphs: Confidence interval, subgradient method, and extension to multiclass. IEEE Transactions on Knowledge and Data Engineering, 32(3):506-518, 2020. Google Scholar
  32. Chunjiang Zhu, Sabine Storandt, Kam-Yiu Lam, Song Han, and Jinbo Bi. Improved dynamic graph learning through fault-tolerant sparsification. In Proceedings of the 36th International Conference on Machine Learning, pages 7624-7633. PMLR, 2019. Google Scholar