Hyperbolic node embedding for temporal networks

Data Mining and Knowledge Discovery
Abstract

Generating general-purpose vector representations of networks allows us to analyze them without the need for extensive feature engineering. Recent works have shown that the hyperbolic space can naturally represent the structure of networks, and that embedding networks into hyperbolic space is extremely efficient, especially in low dimensions. However, the existing hyperbolic embedding methods apply only to static networks and cannot capture the dynamic evolution of the nodes and edges of a temporal network. In this paper, we present an unsupervised framework that uses temporal random walks to obtain training samples carrying both temporal and structural information, and learns hyperbolic embeddings of continuous-time dynamic networks from them. We also show how the framework extends to attributed and heterogeneous information networks. Through experiments on five publicly available real-world temporal datasets, we show the efficacy of our model in embedding temporal networks in low-dimensional hyperbolic space compared to several other unsupervised baselines. Our model obtains state-of-the-art performance in low dimensions, outperforming all baselines, and has competitive performance in higher dimensions, outperforming the baselines on three of the five datasets. Our results show that embedding temporal networks in hyperbolic space is extremely effective when low dimensionality is required.
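The framework's code is not reproduced on this page. As a rough Python sketch of the temporal random walks the abstract describes (function and variable names are ours, and the paper's exact sampling strategy may differ), a walk may only move along edges whose timestamps do not decrease, so every sampled context respects time order:

```python
import random
from collections import defaultdict

def temporal_random_walk(edges, start, walk_len, rng=None):
    """Sample one temporal walk: consecutive edges must have
    non-decreasing timestamps, so the walk respects time order."""
    rng = rng or random.Random(0)
    adj = defaultdict(list)  # node -> list of (neighbor, timestamp)
    for u, v, t in edges:
        adj[u].append((v, t))
    walk, cur, cur_t = [start], start, float("-inf")
    for _ in range(walk_len - 1):
        # valid next hops: edges that do not travel back in time
        nxt = [(v, t) for v, t in adj[cur] if t >= cur_t]
        if not nxt:
            break
        cur, cur_t = rng.choice(nxt)
        walk.append(cur)
    return walk

# toy usage: edges are (source, target, timestamp) triples
edges = [(0, 1, 1.0), (1, 3, 0.5), (1, 2, 2.0), (2, 3, 3.0)]
print(temporal_random_walk(edges, start=0, walk_len=4))  # e.g. [0, 1, 2, 3]
```

The walks produced this way serve as training samples in place of the static random walks used by methods such as DeepWalk or node2vec.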


Notes

  1. http://konect.uni-koblenz.de/networks/ca-cit-HepPh.

  2. https://snap.stanford.edu/data/CollegeMsg.html.

  3. https://snap.stanford.edu/data/email-Eu-core-temporal.html.

  4. https://github.com/urielsinger/Datasets.

  5. http://dblp.uni-trier.de/xml.

  6. https://github.com/uoguelph-mlrg/LDG.


Author information

Correspondence to Soroush Vosoughi.

Additional information

Responsible editor: Evangelos Papalexakis.


Appendices

Appendix A: Detailed experimental results

In Table 6 below, we show the detailed results of the experiments shown in Fig. 2. Recall that these are the temporal link prediction experiments in which the Euclidean baselines threshold on inner products. As before, all results are averaged over 10 runs, and we report standard deviations. For each setting (i.e., each dataset and dimension), the statistical significance of the difference between our model (Temp-Hyper) and each baseline was assessed with a t-test; statistically significant differences (i.e., \(p < 0.05\)) between a baseline and our model are marked with a *.
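For concreteness, here is a small sketch (ours, not the authors' code) of the two scoring rules being compared. Temp-Hyper scores a candidate edge by the hyperboloid distance \(d(x,y) = \operatorname{arccosh}(-\langle x,y\rangle_L)\), where \(\langle x,y\rangle_L = -x_0 y_0 + \sum_{i\ge 1} x_i y_i\) is the Lorentzian inner product and embeddings lie on the hyperboloid \(\langle x,x\rangle_L = -1\); the Euclidean baselines score by the ordinary inner product. Negating the distance is our convention so that, under both rules, a higher score means a more likely edge when computing AUC:

```python
import numpy as np

def lorentz_inner(x, y):
    # Lorentzian (Minkowski) inner product: -x0*y0 + sum_{i>=1} xi*yi
    return -x[0] * y[0] + np.dot(x[1:], y[1:])

def hyperboloid_distance(x, y):
    # d(x, y) = arccosh(-<x, y>_L); clamp the argument at 1.0 to
    # absorb floating-point error for nearly identical points
    return np.arccosh(max(1.0, -lorentz_inner(x, y)))

def score_hyper(x, y):
    # smaller distance => more likely edge, so negate for AUC
    return -hyperboloid_distance(x, y)

def score_euclidean(x, y):
    # Euclidean baselines score candidate edges by the inner product
    return float(np.dot(x, y))
```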

Table 6 AUC performance of our model (Temp-Hyper, which thresholds on hyperboloid distances) vs Euclidean baselines (which threshold on Euclidean inner product) for the temporal link prediction task (2 to 20 dimensions). * corresponds to statistically significant differences between a baseline and our method (\(p < 0.05\) using a t-test). Our model significantly outperforms the other models in all dimensions from 2 to 20 (and also at 128 dimensions; see Table 7). All results are averaged over 10 runs. The standard deviations of the results are also shown
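The significance markers in Tables 6–8 can be reproduced with a standard two-sample t-test over the per-run AUC scores. A minimal sketch with made-up per-run numbers for illustration (the real values are in the tables):

```python
from scipy.stats import ttest_ind

def significantly_different(auc_ours, auc_baseline, alpha=0.05):
    """Two-sample t-test over per-run AUC scores (10 runs each);
    True when the difference is significant at level alpha."""
    _, p_value = ttest_ind(auc_ours, auc_baseline)
    return p_value < alpha

# hypothetical per-run AUCs, for illustration only
ours = [0.91, 0.90, 0.92, 0.91, 0.93, 0.90, 0.92, 0.91, 0.90, 0.92]
base = [0.85, 0.86, 0.84, 0.87, 0.85, 0.86, 0.84, 0.85, 0.86, 0.85]
print(significantly_different(ours, base))  # True for these toy numbers
```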

Appendix B: Higher dimensions

As discussed in the paper, the advantage of our proposed hyperbolic representation learning method for temporal networks is its efficiency with respect to dimensionality: the representations it learns in low dimensions are comparable to those learned in higher dimensions, and they significantly outperform the low-dimensional representations learned by Euclidean methods. The purpose of this section is to explore whether the performance boost our method shows in lower dimensions diminishes severely in higher dimensions. Tables 7 and 8 below show the results of our temporal link prediction experiments at 128 dimensions for our method and all baselines. Specifically, Table 7 shows the results for the first experiment, where we rely solely on the learned representations (to showcase the advantage of our hyperbolic approach over the Euclidean baselines), and Table 8 shows the results for the second experiment, where we use edge feature engineering and logistic regression.

As Table 7 shows, in the first experiment our model still significantly (\(p < 0.05\)) outperforms all Euclidean baselines by relatively large margins. Moreover, our model is among the more stable ones, with relatively low standard deviations across datasets.

Table 8 highlights the limitations of our model in higher dimensions. Though it is still the best model on some of the datasets and comparable on the rest, the advantage our model enjoys in lower dimensions does diminish at 128 dimensions, suggesting that it is not ideal when large dimensionality is available. However, in cases where low dimensionality is important, our model is the best candidate, and even at larger dimensions its performance is not far off from the top-performing models.

Table 7 AUC performance of our model (Temp-Hyper, which thresholds on hyperboloid distances) vs Euclidean baselines (which threshold on Euclidean inner product) for the temporal link prediction task in 128 dimensions. * corresponds to statistically significant differences between a baseline and our method (\(p < 0.05\) using a t-test). The standard deviations of the results are also shown
Table 8 AUC performance of our model (Temp-Hyper, which thresholds on hyperboloid distances) vs all the baselines with best settings for the temporal link prediction task in 128 dimensions. * corresponds to statistically significant differences between a baseline and our method (\(p < 0.05\) using a t-test). All results are averaged over 10 runs. The standard deviations of the results are also shown


Cite this article

Wang, L., Huang, C., Ma, W. et al. Hyperbolic node embedding for temporal networks. Data Min Knowl Disc 35, 1906–1940 (2021). https://doi.org/10.1007/s10618-021-00774-4
