{"status":"ok","message-type":"work","message-version":"1.0.0","message":{"indexed":{"date-parts":[[2024,9,6]],"date-time":"2024-09-06T20:54:09Z","timestamp":1725656049522},"reference-count":0,"publisher":"Association for the Advancement of Artificial Intelligence (AAAI)","issue":"10","content-domain":{"domain":[],"crossmark-restriction":false},"short-container-title":["AAAI"],"abstract":"<jats:p>Recently, graph neural networks for semi-supervised classification have been widely studied. However, existing methods only use the information of limited neighbors and do not deal with the inter-class connections in graphs. In this paper, we propose Adaptive aggregation with Class-Attentive Diffusion (AdaCAD), a new aggregation scheme that adaptively aggregates nodes probably of the same class among K-hop neighbors. To this end, we first propose a novel stochastic process, called Class-Attentive Diffusion (CAD), that strengthens attention to intra-class nodes and attenuates attention to inter-class nodes. In contrast to the existing diffusion methods with a transition matrix determined solely by the graph structure, CAD considers both the node features and the graph structure with the design of our class-attentive transition matrix that utilizes a classifier. Then, we further propose an adaptive update scheme that leverages different reflection ratios of the diffusion result for each node depending on the local class-context. As the main advantage, AdaCAD alleviates the problem of undesired mixing of inter-class features caused by discrepancies between node labels and the graph topology. Built on AdaCAD, we construct a simple model called Class-Attentive Diffusion Network (CAD-Net). Extensive experiments on seven benchmark datasets consistently demonstrate the efficacy of the proposed method and our CAD-Net significantly outperforms the state-of-the-art methods. Code is available at https:\/\/github.com\/ljin0429\/CAD-Net.<\/jats:p>","DOI":"10.1609\/aaai.v35i10.17043","type":"journal-article","created":{"date-parts":[[2022,9,8]],"date-time":"2022-09-08T19:22:29Z","timestamp":1662664949000},"page":"8601-8609","source":"Crossref","is-referenced-by-count":7,"title":["Class-Attentive Diffusion Network for Semi-Supervised Classification"],"prefix":"10.1609","volume":"35","author":[{"given":"Jongin","family":"Lim","sequence":"first","affiliation":[]},{"given":"Daeho","family":"Um","sequence":"additional","affiliation":[]},{"given":"Hyung Jin","family":"Chang","sequence":"additional","affiliation":[]},{"given":"Dae Ung","family":"Jo","sequence":"additional","affiliation":[]},{"given":"Jin Young","family":"Choi","sequence":"additional","affiliation":[]}],"member":"9382","published-online":{"date-parts":[[2021,5,18]]},"container-title":["Proceedings of the AAAI Conference on Artificial Intelligence"],"original-title":[],"link":[{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/17043\/16850","content-type":"application\/pdf","content-version":"vor","intended-application":"text-mining"},{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/download\/17043\/16850","content-type":"unspecified","content-version":"vor","intended-application":"similarity-checking"}],"deposited":{"date-parts":[[2022,9,8]],"date-time":"2022-09-08T19:22:31Z","timestamp":1662664951000},"score":1,"resource":{"primary":{"URL":"https:\/\/ojs.aaai.org\/index.php\/AAAI\/article\/view\/17043"}},"subtitle":[],"short-title":[],"issued":{"date-parts":[[2021,5,18]]},"references-count":0,"journal-issue":{"issue":"10","published-online":{"date-parts":[[2021,5,28]]}},"URL":"https:\/\/doi.org\/10.1609\/aaai.v35i10.17043","relation":{},"ISSN":["2374-3468","2159-5399"],"issn-type":[{"value":"2374-3468","type":"electronic"},{"value":"2159-5399","type":"print"}],"subject":[],"published":{"date-parts":[[2021,5,18]]}}}