Abstract
Graph convolutional networks (GCNs) are powerful models for graph-structured data learning tasks. However, most existing GCNs face two major challenges when dealing with heterogeneous graphs: (1) predefined meta-paths are required to capture the semantic relations between nodes of different types, which may not exploit all of the useful information in the graph; (2) performance degradation and semantic confusion may occur as the network depth grows, which limits their ability to capture long-range dependencies. To meet these challenges, we propose Dense-HGCN, an end-to-end densely connected heterogeneous graph convolutional network for learning node representations. Dense-HGCN computes attention weights between different nodes and incorporates the information of previous layers into each layer's aggregation process via a specific fuse function. Moreover, Dense-HGCN leverages multi-scale information for node classification and other downstream tasks. Experimental results on real-world datasets demonstrate the superior representational power of Dense-HGCN compared with several state-of-the-art methods.
The work described in this paper was supported partially by the National Natural Science Foundation of China (12271111), Special Support Plan for High Level Talents of Guangdong Province (2019TQ05X571), Foundation of Guangdong Educational Committee (2019KZDZX1023), Project of Guangdong Province Innovative Team (2020WCXTD011).
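To make the architecture described in the abstract concrete, the following is a minimal sketch of one densely connected heterogeneous GCN layer. It is an illustration of the general idea only, not the authors' released code: the class name DenseHGCNLayer, the type-specific linear projections, the pairwise attention scoring, and the simple mean-based fusion of previous layers' outputs are all assumptions, and the paper's exact attention and fuse functions may differ.

```python
# Sketch (assumed, PyTorch): type-aware projection + attention aggregation
# + dense connections to all previous layers' outputs.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DenseHGCNLayer(nn.Module):
    """One layer: project each node by its type, aggregate neighbours with
    attention weights, then fuse with the outputs of all previous layers."""

    def __init__(self, in_dim: int, hid_dim: int, num_node_types: int):
        super().__init__()
        # One projection per node type handles heterogeneous feature spaces.
        self.type_proj = nn.ModuleList(
            [nn.Linear(in_dim, hid_dim) for _ in range(num_node_types)]
        )
        # Scores a concatenated (target, neighbour) pair of embeddings.
        self.attn = nn.Linear(2 * hid_dim, 1)
        # Fuses the current aggregation (plus earlier layers) into the output.
        self.fuse = nn.Linear(hid_dim, hid_dim)

    def forward(self, h, node_type, adj, prev_outputs):
        # h: (N, in_dim) features; node_type: (N,) type ids;
        # adj: (N, N) dense 0/1 adjacency, assumed to include self-loops;
        # prev_outputs: list of (N, hid_dim) tensors from earlier layers.
        z = torch.stack(
            [self.type_proj[t](h[i]) for i, t in enumerate(node_type.tolist())]
        )
        # Pairwise attention logits, masked to existing edges.
        src = z.unsqueeze(1).expand(-1, z.size(0), -1)   # (N, N, d) targets
        dst = z.unsqueeze(0).expand(z.size(0), -1, -1)   # (N, N, d) neighbours
        logits = self.attn(torch.cat([src, dst], dim=-1)).squeeze(-1)
        logits = logits.masked_fill(adj == 0, float("-inf"))
        alpha = torch.softmax(logits, dim=-1)
        agg = alpha @ z                                   # attention-weighted aggregation
        # Dense connectivity: fold in every previous layer's output.
        if prev_outputs:
            agg = agg + torch.stack(prev_outputs, dim=0).mean(dim=0)
        return F.elu(self.fuse(agg))
```

A multi-scale readout in the spirit of the abstract could then concatenate the outputs of all layers before the final classifier, rather than using only the last layer's representation.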