Volume 31 Issue 5
Sep.  2022
LIN Jingjing, YE Zhonglin, ZHAO Haixing, et al., “DeepHGNN: A Novel Deep Hypergraph Neural Network,” Chinese Journal of Electronics, vol. 31, no. 5, pp. 958-968, 2022, doi: 10.1049/cje.2021.00.108

DeepHGNN: A Novel Deep Hypergraph Neural Network

doi: 10.1049/cje.2021.00.108
Funds:  This work was supported by the National Key R&D Program of China (2020YFC1523300), the Youth Program of Natural Science Foundation of Qinghai Province (2021-ZJ-946Q), and the Middle-Youth Program of Natural Science Foundation of Qinghai Normal University (2020QZR007).
More Information
  • Author Bio:

    LIN Jingjing was born in 1986. She received the M.S. degree from Chengdu University of Technology in 2015. She is currently pursuing the Ph.D. degree at the School of Computer Science, Qinghai Normal University. Her research interests include graph neural networks and hypergraph neural networks. (Email: ljj_mail@126.com)

    YE Zhonglin was born in 1989. He received the B.S. degree from Sichuan University in 2012, the M.S. degree from Southwest Jiaotong University in 2016, and the Ph.D. degree from the School of Computer Science, Shaanxi Normal University, in 2019. His research interests include graph neural networks, knowledge extraction, and network representation learning. (Email: zhonglin_ye@foxmail.com)

    ZHAO Haixing (corresponding author) was born in 1969. He received the Doctor of Engineering degree from the School of Computer Science, Northwestern Polytechnical University, in 2004, and the Doctor of Science degree from the University of Twente, the Netherlands. He is a Professor at Qinghai Normal University and a part-time Professor at Shaanxi Normal University. He is a director of the Changjiang Scholars and Innovative Research Team in University program, and a council member of the Operations Research Society and the Combinatorics and Graph Theory Society in China. His research interests include complex networks, graph neural networks, machine translation, hypergraph theory, and network reliability. (Email: h.x.zhao@163.com)

    was born in 1996. He received the B.S. degree in information and computing science from Dezhou College. He is currently a graduate student at the School of Mathematics and Statistics, Qinghai Normal University. His research interests include graph neural networks and hypergraph coloring entropy. (Email: lusheng_fang@126.com)

  • Received Date: 2021-03-29
  • Accepted Date: 2021-06-25
  • Available Online: 2021-11-02
  • Publish Date: 2022-09-05
  • With the development of deep learning, graph neural networks (GNNs) have yielded substantial results in various application fields. GNNs mainly model pair-wise connections in graph-structured data, but in many real-world networks the relations between objects are complex and go beyond pair-wise. A hypergraph is a flexible modeling tool for describing such intricate, higher-order correlations, and researchers have therefore been exploring how to develop hypergraph-based neural network models. Existing hypergraph neural networks perform well on tasks such as node classification, yet they remain shallow networks because of over-smoothing, over-fitting, and vanishing gradients. To tackle these issues, we present a novel deep hypergraph neural network (DeepHGNN). We design DeepHGNN using hyperedge sampling together with residual connections and identity mapping, the latter two borrowed from graph convolutional networks. We evaluate DeepHGNN on two visual object datasets. The experiments show the positive effects of DeepHGNN: it performs better on visual object classification tasks.
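  • As a rough illustration of the ideas in the abstract, the sketch below combines a standard hypergraph propagation operator (incidence matrix normalized by vertex and hyperedge degrees, as in HGNN) with a GCNII-style layer update that adds an initial residual connection and an identity mapping. All function and parameter names here (hypergraph_propagation, deep_hgnn_layer, alpha, beta) are our own hypothetical choices; this is not the paper's exact implementation, and the hyperedge-sampling component of DeepHGNN is not shown.

```python
import numpy as np

def hypergraph_propagation(H, edge_w=None):
    """Normalized hypergraph propagation operator
    Theta = Dv^{-1/2} H W De^{-1} H^T Dv^{-1/2} (HGNN-style)."""
    n, m = H.shape
    w = np.ones(m) if edge_w is None else edge_w
    dv = H @ w                    # vertex degrees
    de = H.sum(axis=0)            # hyperedge degrees
    Dv_inv_sqrt = np.diag(1.0 / np.sqrt(dv))
    De_inv = np.diag(1.0 / de)
    return Dv_inv_sqrt @ H @ np.diag(w) @ De_inv @ H.T @ Dv_inv_sqrt

def deep_hgnn_layer(X, X0, Theta, W, alpha=0.1, beta=0.5):
    """One GCNII-style layer on the hypergraph operator:
    an initial residual (alpha) back to the input features X0
    plus an identity mapping (beta) on the weight matrix."""
    support = (1 - alpha) * (Theta @ X) + alpha * X0
    d = W.shape[0]
    out = support @ ((1 - beta) * np.eye(d) + beta * W)
    return np.maximum(out, 0.0)   # ReLU

# Toy hypergraph: 4 nodes, 2 hyperedges.
H = np.array([[1, 0],
              [1, 1],
              [0, 1],
              [1, 1]], dtype=float)
Theta = hypergraph_propagation(H)
X0 = np.eye(4)                    # initial node features
W = 0.1 * np.eye(4)               # layer weight (hypothetical init)
X = deep_hgnn_layer(X0, X0, Theta, W)
print(X.shape)  # (4, 4)
```

    Because each layer mixes back a fraction of the initial features and keeps most of the transformation close to the identity, stacking many such layers avoids the over-smoothing that limits plain hypergraph convolutions to two or three layers.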

    Figures(5)  / Tables(4)
