Abstract
Graph Neural Networks (GNNs) have achieved notable success in various applications over graph data. However, recent research has revealed that real-world graphs often contain noise, and GNNs are susceptible to it. To address this issue, several Graph Structure Learning (GSL) models have been introduced. While GSL models enhance robustness against edge noise through edge reconstruction, they suffer from a significant limitation: a heavy reliance on node features, which amplifies their susceptibility to noise within those features. Recognizing this vulnerability, we present DEGNN, a novel GNN model designed to mitigate noise in both edges and node features. The core idea of DEGNN is to employ two separate experts: an edge expert and a node feature expert. These experts use self-supervised learning techniques to produce modified edges and node features, upon which DEGNN then addresses downstream tasks, ensuring robustness against noise in both the edges and the node features of real-world graphs. Notably, the modification process can be trained end-to-end, empowering DEGNN to adjust dynamically and achieve optimal edge and node representations for specific tasks. Comprehensive experiments demonstrate DEGNN's efficacy in managing noise, both in original real-world graphs and in graphs with synthetic noise.
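The dual-expert idea described above can be illustrated with a minimal sketch. The following is not the paper's implementation; it is a hypothetical toy pipeline in which an `edge_expert` prunes implausible edges by feature similarity, a `feature_expert` smooths noisy node features over the graph, and a single GCN-style layer consumes both modified representations. All function names, the blending weights, and the pruning criterion are illustrative assumptions.

```python
import numpy as np

def normalize_adj(a):
    # Symmetric normalization D^{-1/2} (A + I) D^{-1/2}, as in GCN.
    a = a + np.eye(a.shape[0])
    d_inv_sqrt = 1.0 / np.sqrt(a.sum(axis=1))
    return a * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]

def edge_expert(x, a, keep=0.5):
    # Toy edge expert: score existing edges by cosine similarity of the
    # endpoint features and keep only the highest-scoring fraction.
    xn = x / (np.linalg.norm(x, axis=1, keepdims=True) + 1e-8)
    scores = (xn @ xn.T) * a              # restrict scores to existing edges
    thresh = np.quantile(scores[a > 0], 1 - keep)
    return (scores >= thresh).astype(float) * a

def feature_expert(x, a):
    # Toy feature expert: denoise by blending each node's features with
    # its normalized neighborhood average (simple smoothing).
    return 0.5 * x + 0.5 * (normalize_adj(a) @ x)

def degnn_forward(x, a, w):
    # Combine both experts' outputs, then apply one GCN-style layer (ReLU).
    a_mod = edge_expert(x, a)
    x_mod = feature_expert(x, a)
    return np.maximum(normalize_adj(a_mod) @ x_mod @ w, 0.0)

rng = np.random.default_rng(0)
x = rng.normal(size=(6, 4))               # 6 nodes, 4 noisy features
a = (rng.random((6, 6)) < 0.4).astype(float)
a = np.triu(a, 1); a = a + a.T            # symmetric adjacency, no self-loops
w = rng.normal(size=(4, 2))
h = degnn_forward(x, a, w)
print(h.shape)                            # (6, 2)
```

In the actual model the two experts are trained jointly with the downstream task, so the pruning and smoothing decisions are learned end-to-end rather than fixed heuristics as in this sketch.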
Notes
- 1.
Code is available at: https://github.com/TaiHasegawa/DEGNN.
Acknowledgements
This work is partly supported by JSPS Grant-in-Aid for Scientific Research (grant number 23H03451, 21K12042) and the New Energy and Industrial Technology Development Organization (Grant Number JPNP20017).
Copyright information
© 2024 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.
Cite this paper
Hasegawa, T., Yun, S., Liu, X., Phua, Y.J., Murata, T. (2024). DEGNN: Dual Experts Graph Neural Network Handling both Edge and Node Feature Noise. In: Yang, DN., Xie, X., Tseng, V.S., Pei, J., Huang, JW., Lin, J.CW. (eds) Advances in Knowledge Discovery and Data Mining. PAKDD 2024. Lecture Notes in Computer Science(), vol 14646. Springer, Singapore. https://doi.org/10.1007/978-981-97-2253-2_30
Publisher Name: Springer, Singapore
Print ISBN: 978-981-97-2252-5
Online ISBN: 978-981-97-2253-2