{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2025,2,21]],"date-time":"2025-02-21T07:19:36Z","timestamp":1740122376089,"version":"3.37.3"},"reference-count":39,"publisher":"Springer Science and Business Media LLC","issue":"1","license":[{"start":{"date-parts":[[2022,11,9]],"date-time":"2022-11-09T00:00:00Z","timestamp":1667952000000},"content-version":"tdm","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"},{"start":{"date-parts":[[2022,11,9]],"date-time":"2022-11-09T00:00:00Z","timestamp":1667952000000},"content-version":"vor","delay-in-days":0,"URL":"https:\/\/creativecommons.org\/licenses\/by\/4.0"}],"funder":[{"name":"USYD-Data61 Collaborative Research Project"},{"DOI":"10.13039\/501100000923","name":"Australian Research Council","doi-asserted-by":"publisher","award":["DP210101347"],"id":[{"id":"10.13039\/501100000923","id-type":"DOI","asserted-by":"publisher"}]},{"DOI":"10.13039\/501100004543","name":"China Scholarship Council","doi-asserted-by":"crossref","award":["201806070131"],"id":[{"id":"10.13039\/501100004543","id-type":"DOI","asserted-by":"crossref"}]}],"content-domain":{"domain":["link.springer.com"],"crossmark-restriction":false},"short-container-title":["Data Min Knowl Disc"],"published-print":{"date-parts":[[2023,1]]},"abstract":"Abstract<\/jats:title>Graph neural networks (GNNs) have achieved state-of-the-art results for semi-supervised node classification on graphs. Nevertheless, the challenge of how to effectively learn GNNs with very few labels is still under-explored. As one of the prevalent semi-supervised methods, pseudo-labeling has been proposed to explicitly address the label scarcity problem. It is the process of augmenting the training set with pseudo-labeled unlabeled nodes to retrain a model in a self-training cycle. However, the existing pseudo-labeling approaches often suffer from two major drawbacks. First, these methods conservatively expand the label set by selecting only high-confidence unlabeled nodes without assessing their informativeness. Second, these methods incorporate pseudo-labels to the same loss function with genuine labels, ignoring their distinct contributions to the classification task. In this paper, we propose a novel informative pseudo-labeling framework (InfoGNN) to facilitate learning of GNNs with very few labels. Our key idea is to pseudo-label the most informative nodes that can maximally represent the local neighborhoods via mutual information maximization. To mitigate the potential label noise and class-imbalance problem arising from pseudo-labeling, we also carefully devise a generalized cross entropy with a class-balanced regularization to incorporate pseudo-labels into model retraining. 
Extensive experiments on six real-world graph datasets validate that our proposed approach significantly outperforms state-of-the-art baselines and competitive self-supervised methods on graphs.<\/jats:p>","DOI":"10.1007\/s10618-022-00879-4","type":"journal-article","created":{"date-parts":[[2022,11,9]],"date-time":"2022-11-09T13:02:58Z","timestamp":1667998978000},"page":"228-254","update-policy":"https:\/\/doi.org\/10.1007\/springer_crossmark_policy","source":"Crossref","is-referenced-by-count":14,"title":["Informative pseudo-labeling for graph neural networks with few labels"],"prefix":"10.1007","volume":"37","author":[{"given":"Yayong","family":"Li","sequence":"first","affiliation":[]},{"ORCID":"https:\/\/orcid.org\/0000-0002-2063-8437","authenticated-orcid":false,"given":"Jie","family":"Yin","sequence":"additional","affiliation":[]},{"given":"Ling","family":"Chen","sequence":"additional","affiliation":[]}],"member":"297","published-online":{"date-parts":[[2022,11,9]]},"reference":[{"unstructured":"Belghazi MI, Baratin A, Rajeshwar S, Ozair S, Bengio Y, Courville A, Hjelm D (2018) Mutual information neural estimation. In: International conference on machine learning, pp 531\u2013540. PMLR","key":"879_CR1"},{"unstructured":"Bojchevski A, G\u00fcnnemann S (2018) Deep gaussian embedding of graphs: unsupervised inductive learning via ranking. In: International conference on learning representations","key":"879_CR2"},{"issue":"2","key":"879_CR3","doi-asserted-by":"crossref","first-page":"211","DOI":"10.1111\/j.2517-6161.1964.tb00553.x","volume":"26","author":"GE Box","year":"1964","unstructured":"Box GE, Cox DR (1964) An analysis of transformations. J R Stat Soc: Ser B (Methodol) 26(2):211\u2013243","journal-title":"J R Stat Soc: Ser B (Methodol)"},{"doi-asserted-by":"crossref","unstructured":"Ding K, Wang J, Li J, Shu K, Liu C, Liu H (2020) Graph prototypical networks for few-shot learning on attributed networks. In: Proceedings of the 29th ACM international conference on information and knowledge management, pp 295\u2013304","key":"879_CR4","DOI":"10.1145\/3340531.3411922"},{"unstructured":"Goldberger J, Ben-Reuven E (2016) Training deep neural-networks using a noise adaptation layer. In: International conference on learning representations","key":"879_CR5"},{"unstructured":"Hamilton W, Ying Z, Leskovec J (2017) Inductive representation learning on large graphs. In: Advances in neural information processing systems, pp 1024\u20131034","key":"879_CR6"},{"unstructured":"Han B, Yao Q, Yu X, Niu G, Xu M, Hu W, Tsang I, Sugiyama M (2018) Co-teaching: Robust training of deep neural networks with extremely noisy labels. In: Advances in neural information processing systems, pp 8527\u20138537","key":"879_CR7"},{"unstructured":"Hassani K, Khasahmadi AH (2020) Contrastive multi-view representation learning on graphs. In: International conference on machine learning, pp 4116\u20134126. PMLR","key":"879_CR8"},{"unstructured":"Hjelm RD, Fedorov A, Lavoie-Marchildon S, Grewal K, Bachman P, Trischler A, Bengio Y (2019) Learning deep representations by mutual information estimation and maximization. In: International conference on learning representations","key":"879_CR9"},{"unstructured":"Huang K, Zitnik M (2020) Graph meta learning via local subgraphs. In: Advances in neural information processing systems, pp 5862\u20135874","key":"879_CR10"},{"unstructured":"Kim D, Oh A (2021) How to find your friendly neighborhood: graph attention design with self-supervision. 
In: International conference on learning representations","key":"879_CR11"},{"unstructured":"Kipf TN, Welling M (2017) Semi-supervised classification with graph convolutional networks. In: International conference on learning representations","key":"879_CR12"},{"unstructured":"Lee DH, et\u00a0al (2013) Pseudo-label: The simple and efficient semi-supervised learning method for deep neural networks. In: Workshop on challenges in representation learning. International conference on machine learning","key":"879_CR13"},{"doi-asserted-by":"crossref","unstructured":"Li J, Wong Y, Zhao Q, Kankanhalli MS (2019) Learning to learn from noisy labeled data. In: Proceedings of the IEEE\/CVF conference on computer vision and pattern recognition, pp 5051\u20135059","key":"879_CR14","DOI":"10.1109\/CVPR.2019.00519"},{"doi-asserted-by":"crossref","unstructured":"Li Q, Han Z, Wu XM (2018) Deeper insights into graph convolutional networks for semi-supervised learning. In: Proceedings of the AAAI conference on artificial intelligence, pp 3538\u20133545","key":"879_CR15","DOI":"10.1609\/aaai.v32i1.11604"},{"doi-asserted-by":"crossref","unstructured":"Liu Y, Jin M, Pan S, Zhou C, Zheng Y, Xia F, Yu P (2022) Graph self-supervised learning: a survey. IEEE Trans Knowl Data Eng","key":"879_CR16","DOI":"10.1109\/TKDE.2022.3172903"},{"unstructured":"Mernyei P, Cangea C (2020) Wiki-cs: A wikipedia-based benchmark for graph neural networks. arXiv preprint arXiv:2007.02901","key":"879_CR17"},{"unstructured":"Nowozin S, Cseke B, Tomioka R (2016) f-gan: Training generative neural samplers using variational divergence minimization. In: Advances in neural information processing systems, pp 271\u2013279","key":"879_CR18"},{"doi-asserted-by":"crossref","unstructured":"Peng Z, Huang W, Lu M, Zheng Q, Rong Y, Xu T, Huang J (2020) Graph representation learning via graphical mutual information maximization. In: The web conference, pp 259\u2013270","key":"879_CR19","DOI":"10.1145\/3366423.3380112"},{"doi-asserted-by":"crossref","unstructured":"Qiu J, Chen Q, Dong Y, Zhang J, Yang H, Ding M, Wang K, Tang J (2020) Gcc: Graph contrastive coding for graph neural network pre-training. In: Proceedings of the 26th ACM SIGKDD international conference on knowledge discovery and data mining, pp 1150\u20131160","key":"879_CR20","DOI":"10.1145\/3394486.3403168"},{"unstructured":"Qu M, Bengio Y, Tang J (2019) Gmnn: Graph markov neural networks. In: International conference on machine learning, pp 5241\u20135250. PMLR","key":"879_CR21"},{"unstructured":"Ren M, Zeng W, Yang B, Urtasun R (2018) Learning to reweight examples for robust deep learning. In: International conference on machine learning, pp 4334\u20134343","key":"879_CR22"},{"doi-asserted-by":"crossref","unstructured":"Rosenberg C, Hebert M, Schneiderman H (2005) Semi-supervised self-training of object detection models. In: 2005 Seventh IEEE workshops on applications of computer vision 1, pp 29\u201336","key":"879_CR23","DOI":"10.1109\/ACVMOT.2005.107"},{"unstructured":"Shchur O, Mumme M, Bojchevski A, G\u00fcnnemann S (2018) Pitfalls of graph neural network evaluation. In: Relational representation learning workshop, advances in neural information processing systems","key":"879_CR24"},{"unstructured":"Snell J, Swersky K, Zemel R (2017) Prototypical networks for few-shot learning. 
In: Advances in neural information processing systems, pp 4077\u20134087","key":"879_CR25"},{"unstructured":"Sukhbaatar S, Bruna J, Paluri M, Bourdev L, Fergus R (2015) Training convolutional networks with noisy labels. In: International conference on learning representations workshop","key":"879_CR26"},{"doi-asserted-by":"crossref","unstructured":"Sun K, Lin Z, Zhu Z (2020) Multi-stage self-supervised learning for graph convolutional networks on graphs with few labeled nodes. In: Proceedings of the AAAI conference on artificial intelligence, pp 5892\u20135899","key":"879_CR27","DOI":"10.1609\/aaai.v34i04.6048"},{"unstructured":"Van\u00a0Rooyen B, Menon A, Williamson RC (2015) Learning with symmetric label noise: the importance of being unhinged. In: Advances in neural information processing systems, pp 10\u201318","key":"879_CR28"},{"unstructured":"Veli\u010dkovi\u0107 P, Cucurull G, Casanova A, Romero A, Lio P, Bengio Y (2018) Graph attention networks. In: International conference on learning representations","key":"879_CR29"},{"unstructured":"Velickovic P, Fedus W, Hamilton WL, Li\u00f2 P, Bengio Y, Hjelm RD (2019) Deep graph infomax. In: International conference on learning representations","key":"879_CR30"},{"doi-asserted-by":"crossref","unstructured":"Verma V, Qu M, Lamb A, Bengio Y, Kannala J, Tang J (2020) Graphmix: Regularized training of graph neural networks for semi-supervised learning. In: Proceedings of the AAAI conference on artificial intelligence, pp 10024\u201310032","key":"879_CR31","DOI":"10.1609\/aaai.v35i11.17203"},{"unstructured":"Wu F, Souza A, Zhang T, Fifty C, Yu T, Weinberger K (2019) Simplifying graph convolutional networks. In: International conference on machine learning, pp 6861\u20136871. PMLR","key":"879_CR32"},{"unstructured":"Xiao T, Xia T, Yang Y, Huang C, Wang X (2015) Learning from massive noisy labeled data for image classification. In: Proceedings of the IEEE conference on computer vision and pattern recognition, pp 2691\u20132699","key":"879_CR33"},{"unstructured":"You Y, Chen T, Sui Y, Chen T, Wang Z, Shen Y (2020) Graph contrastive learning with augmentations. In: Advances in neural information processing systems, pp 5812\u20135823","key":"879_CR34"},{"unstructured":"You Y, Chen T, Wang Z, Shen Y (2020) When does self-supervision help graph convolutional networks? In: International conference on machine learning, pp 10871\u201310880. PMLR","key":"879_CR35"},{"key":"879_CR36","doi-asserted-by":"publisher","first-page":"837","DOI":"10.1016\/j.future.2020.10.016","volume":"115","author":"K Zhan","year":"2021","unstructured":"Zhan K, Niu C (2021) Mutual teaching for graph convolutional networks. Futur Gener Comput Syst 115:837\u2013843","journal-title":"Futur Gener Comput Syst"},{"unstructured":"Zhang Z, Sabuncu M (2018) Generalized cross entropy loss for training deep neural networks with noisy labels. In: Advances in neural information processing systems, pp 8778\u20138788","key":"879_CR37"},{"unstructured":"Zhou Z, Zhang S, Huang Z (2019) Dynamic self-training framework for graph convolutional networks. arXiv preprint arXiv:1910.02684","key":"879_CR38"},{"unstructured":"Zhu Y, Xu Y, Yu F, Liu Q, Wu S, Wang L (2020) Deep graph contrastive representation learning. 
arXiv preprint arXiv:2006.04131","key":"879_CR39"}],"container-title":["Data Mining and Knowledge Discovery"],"original-title":[],"language":"en","link":[{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10618-022-00879-4.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/article\/10.1007\/s10618-022-00879-4\/fulltext.html","content-type":"text\/html","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/link.springer.com\/content\/pdf\/10.1007\/s10618-022-00879-4.pdf","content-type":"application\/pdf","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2024,10,7]],"date-time":"2024-10-07T20:17:28Z","timestamp":1728332248000},"score":1,"resource":{"primary":{"URL":"https:\/\/link.springer.com\/10.1007\/s10618-022-00879-4"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2022,11,9]]},"references-count":39,"journal-issue":{"issue":"1","published-print":{"date-parts":[[2023,1]]}},"alternative-id":["879"],"URL":"https:\/\/doi.org\/10.1007\/s10618-022-00879-4","relation":{},"ISSN":["1384-5810","1573-756X"],"issn-type":[{"type":"print","value":"1384-5810"},{"type":"electronic","value":"1573-756X"}],"subject":[],"published":{"date-parts":[[2022,11,9]]},"assertion":[{"value":"15 December 2021","order":1,"name":"received","label":"Received","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"11 September 2022","order":2,"name":"accepted","label":"Accepted","group":{"name":"ArticleHistory","label":"Article History"}},{"value":"9 November 2022","order":3,"name":"first_online","label":"First Online","group":{"name":"ArticleHistory","label":"Article History"}}]}}
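The abstract in this record states that pseudo-labels are incorporated through a generalized cross entropy loss (Zhang and Sabuncu 2018, reference 879_CR37). The snippet below is a minimal sketch of that loss in isolation, assuming a PyTorch setting; the function name, the default q value, and the omission of InfoGNN's class-balanced regularization and mutual-information-based node selection are illustrative assumptions, not the authors' implementation.

import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, pseudo_labels, q=0.7):
    # Generalized cross entropy of Zhang and Sabuncu (2018): L_q = (1 - p_y^q) / q.
    # As q -> 0 it approaches standard cross entropy; q = 1 gives the mean absolute
    # error, which is more robust to noisy (pseudo-)labels. q = 0.7 is the default
    # from that paper, not necessarily the value used by InfoGNN.
    probs = F.softmax(logits, dim=-1)                              # (N, C) class probabilities
    p_y = probs.gather(1, pseudo_labels.unsqueeze(1)).squeeze(1)   # probability of the assigned label
    return ((1.0 - p_y.clamp_min(1e-7) ** q) / q).mean()           # average per-node GCE loss

# Illustrative usage on random data: 5 pseudo-labeled nodes, 3 classes.
loss = generalized_cross_entropy(torch.randn(5, 3), torch.tensor([0, 2, 1, 1, 0]))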