Citation: Binhao HU, Jianpeng ZHANG, and Hongchang CHEN, “Knowledge Graph Completion Method of Combining Structural Information with Semantic Information,” Chinese Journal of Electronics, vol. x, no. x, pp. 1–9, xxxx, doi: 10.23919/cje.2022.00.299

Knowledge Graph Completion Method of Combining Structural Information with Semantic Information

doi: 10.23919/cje.2022.00.299
More Information
  • Author Bios:

    Binhao HU was born in 1996. He received the B.E. degree in Electronic Engineering from Sichuan University and is an M.S. candidate at Zhengzhou University. His research interests include graph representation, knowledge graphs, and natural language processing. (Email: hu15181620732@163.com)

    Jianpeng ZHANG received the Ph.D. degree from Eindhoven University of Technology in 2018. He is an Assistant Professor at the National Digital Switching System Engineering & Technological R&D Center (NDSC), China. His research interests include data mining, big data analytics, and social network analysis. (Email: zjp@ndsc.com.cn)

    Hongchang CHEN is currently a Professor at the National Digital Switching System Engineering and Technological Research Center. His research interests include digital society governance and data mining. (Email: chenhongchang@ndsc.com.cn)

  • Corresponding author: Email: zjp@ndsc.com.cn
  • Received Date: 2022-09-02
  • Accepted Date: 2023-12-19
  • Available Online: 2024-03-06
  • With the development of knowledge graphs, a series of applications built on them have emerged. Because knowledge graphs are incomplete, the effectiveness of these downstream applications is limited by the quality of the underlying graphs. To improve that quality, translation-based graph embedding methods such as TransE learn structural information by representing triples as low-dimensional dense vectors. However, they generalize poorly to unseen entities, i.e., entities that are not observed during training but appear during testing. Other methods exploit the powerful representational ability of pre-trained language models to learn entity descriptions and contextual representations of triples. Although such methods are robust to incompleteness, they must score every candidate entity for each triple during inference. We combine the two approaches: semantic information makes the model robust to unseen entities, while structural information reduces inference overhead and prevents combinatorial explosion. We use a pre-trained language model to encode triples and learn the semantic information within them, use a hyperbolic-space distance model to learn structural information, and then integrate the two types of information. We evaluate our model with link prediction experiments on standard datasets, where it outperforms state-of-the-art methods on two benchmarks.
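    To make the two components concrete, below is a minimal PyTorch sketch (not the authors' released code) of the scoring idea the abstract describes: a hyperbolic (Poincaré-ball) translation distance supplies the structural score, and a linear head over a pre-trained language model's [CLS] vector supplies the semantic score. The class name SemanticStructuralScorer, the fixed mixing weight alpha, and the toy dimensions are illustrative assumptions, not details taken from the paper.

    ```python
    # Sketch: combine a hyperbolic structural score with a PLM semantic score.
    import torch
    import torch.nn as nn

    def mobius_add(x, y, c=1.0):
        """Mobius addition on the Poincare ball with curvature -c."""
        xy = (x * y).sum(dim=-1, keepdim=True)
        x2 = (x * x).sum(dim=-1, keepdim=True)
        y2 = (y * y).sum(dim=-1, keepdim=True)
        num = (1 + 2 * c * xy + c * y2) * x + (1 - c * x2) * y
        den = 1 + 2 * c * xy + c ** 2 * x2 * y2
        return num / den.clamp_min(1e-15)

    def poincare_distance(x, y, c=1.0):
        """Geodesic distance between two points on the Poincare ball."""
        sqrt_c = c ** 0.5
        diff_norm = mobius_add(-x, y, c).norm(dim=-1).clamp(max=1 - 1e-5)
        return (2.0 / sqrt_c) * torch.atanh(sqrt_c * diff_norm)

    class SemanticStructuralScorer(nn.Module):
        def __init__(self, num_entities, num_relations, dim=32,
                     text_dim=768, alpha=0.5):
            super().__init__()
            # Structural part: entity points on the ball, one translation
            # vector per relation (a hyperbolic analogue of TransE).
            self.ent = nn.Embedding(num_entities, dim)
            self.rel = nn.Embedding(num_relations, dim)
            nn.init.uniform_(self.ent.weight, -1e-3, 1e-3)  # stay inside ball
            nn.init.uniform_(self.rel.weight, -1e-3, 1e-3)
            # Semantic part: map the PLM [CLS] vector of the verbalized
            # triple to a scalar plausibility (the PLM itself is external).
            self.sem_head = nn.Linear(text_dim, 1)
            self.alpha = alpha  # assumed fixed mixing weight

        def forward(self, h, r, t, cls_vec):
            # Structural score: translate head by relation, then measure
            # hyperbolic distance to tail (smaller distance = more plausible).
            moved = mobius_add(self.ent(h), self.rel(r))
            struct_score = -poincare_distance(moved, self.ent(t))
            # Semantic score from the language-model encoding of the triple.
            sem_score = self.sem_head(cls_vec).squeeze(-1)
            return self.alpha * struct_score + (1 - self.alpha) * sem_score

    # Toy usage; in practice cls_vec would come from e.g. BERT run over
    # "head [SEP] relation [SEP] tail" plus the entity descriptions.
    model = SemanticStructuralScorer(num_entities=100, num_relations=10)
    h, r, t = torch.tensor([3]), torch.tensor([1]), torch.tensor([7])
    cls_vec = torch.randn(1, 768)
    print(model(h, r, t, cls_vec))  # one plausibility score per triple
    ```

    In this arrangement, ranking candidates by the cheap structural distance can prune the candidate set before the expensive language-model pass, which is the inference-cost saving the abstract alludes to.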
  • [1]
    S. X. Ji, S. R. Pan, E. Cambria, et al., “A survey on knowledge graphs: Representation, acquisition, and applications,” IEEE Transactions on Neural Networks and Learning Systems, vol. 33, no. 2, pp. 494–514, 2021. doi: 10.1109/TNNLS.2021.3070843
    [2]
    S. Auer, C. Bizer, G. Kobilarov, et al., “DBpedia: A nucleus for A web of open data,” in Proceedings of the 6th International Semantic Web Conference on the Semantic Web, Busan, Korea, pp. 722–735, 2007.
    [3]
    T. P. Tanon, G. Weikum, and F. Suchanek, “YAGO 4: A reason-able knowledge base,” in Proceedings of the 17th International Conference on the Semantic Web, Heraklion, Crete, pp. 583–596, 2020.
    [4]
    A. Bordes, N. Usunier, A. Garcia-Durán, et al., “Translating embeddings for modeling multi-relational data,” in Proceedings of the 26th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, pp. 2787–2795, 2013.
    [5]
    J. W. Zhang, H. P. Zhang, C. Y. Xia, et al., “Graph-Bert: Only attention is needed for learning graph representations,” arXiv preprint, arXiv: 2001.05140, 2020.
    [6]
    L. Yao, C. S. Mao, and Y. Luo, “KG-BERT: BERT for knowledge graph completion,” arXiv preprint, arXiv: 1909.03193, 2019.
    [7]
    Z. Q. Sun, Z. H. Deng, J. Y. Nie, et al., “RotatE: Knowledge graph embedding by relational rotation in complex space,” in Proceedings of the 7th International Conference on Learning Representations, New Orleans, LA, USA, 2019.
    [8]
    I. Balažević, C. Allen, and T. Hospedales, “Multi-relational Poincaré graph embeddings,” in Proceedings of the 33rd International Conference on Neural Information Processing Systems, Vancouver, BC, Canada, article no. 401, 2019.
    [9]
    B. S. Yang, W. T. Yi, X. D. He, et al., “Embedding entities and relations for learning and inference in knowledge bases,” in Proceedings of the 3rd International Conference on Learning Representations, San Diego, CA, USA.
    [10]
    T. Trouillon, J. Welbl, S. Riedel, et al., “Complex embeddings for simple link prediction,” in Proceedings of the 33rd International Conference on International Conference on Machine Learning, New York, NY, USA, pp. 2071–2080, 2016.
    [11]
    M. M. Bronstein, J. Bruna, Y. Lecun, et al., “Geometric deep learning: Going beyond Euclidean data,” IEEE Signal Processing Magazine, vol. 34, no. 4, pp. 18–42, 2017. doi: 10.1109/MSP.2017.2693418
    [12]
    B. Wang, T. Shen, G. D. Long, et al., “Structure-augmented text representation learning for efficient knowledge graph completion,” in Proceedings of Web Conference 2021, Ljubljana, Slovenia, pp. 1737–1748, 2021.
    [13]
    M. Nickel, V. Tresp, and H. P. Kriegel, “A three-way model for collective learning on multi-relational data,” in Proceedings of the 28th International Conference on International Conference on Machine Learning, Bellevue, WA, USA, pp. 809–816, 2011.
    [14]
    S. Z. He, K. Liu, G. L. Ji, et al., “Learning to represent knowledge graphs with Gaussian embedding,” in Proceedings of the 24th ACM International on Conference on Information and Knowledge Management, Melbourne, Australia, pp. 623–632, 2015.
    [15]
    R. Socher, D. Q. Chen, C. D. Manning, et al., “Reasoning with neural tensor networks for knowledge base completion,” in Proceedings of the 26th International Conference on Neural Information Processing Systems, Lake Tahoe, NV, USA, pp. 926–934, 2013.
    [16]
    Z. Wang, J. W. Zhang, J. L. Feng, et al., “Knowledge graph and text jointly embedding,” in Proceedings of the 2014 Conference on Empirical Methods in Natural Language Processing (EMNLP), Doha, Qatar, pp. 1591–1601, 2014.
    [17]
    H. Xiao, M. L. Huang, L. Meng, et al., “SSP: Semantic space projection for knowledge graph embedding with text descriptions,” in Proceedings of the Thirty-First AAAI Conference on Artificial Intelligence, San Francisco, CA, USA, pp. 3104–3110, 2017.
    [18]
    Z. G. Wang, J. Z. Li, Z. Y. Liu, et al., “SSP: Semantic space projection for knowledge graph embedding with text descriptions,” in Proceedings of International Joint Conference on Artificial Intelligent (IJCAI), pp. 4–17, 2016.
    [19]
    J. Devlin, M. W. Chang, K. Lee, et al., “BERT: Pre-training of deep bidirectional transformers for language understanding,” in Proceedings of 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1, Minneapolis, MN, USA, pp. 4171–4186, 2018.
    [20]
    A. Vaswani, N. Shazeer, N. Parmar, et al., “Attention is all you need,” in Proceedings of the 31st International Conference on Neural Information Processing Systems, Long Beach, CA, USA, pp. 6000–6010, 2017.
    [21]
    I. Chami, A. Wolf, D. C. Juan, et al., “Low-dimensional hyperbolic knowledge graph embeddings,” in Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics, Online, pp. 6901–6914, 2020.
    [22]
    A. B. Adcock, B. D. Sullivan, and M. W. Mahoney, “Tree-like structure in large social and information networks,” in Proceedings of the IEEE 13th International Conference on Data Mining, Dallas, TX, USA, pp. 1–10, 2013.
    [23]
    O. E. Ganea, G. Bécigneul, and T. Hofmann, “Hyperbolic neural networks,” in Proceedings of the 32nd International Conference on Neural Information Processing Systems, Montréal, Canada, pp. 5350–5360, 2018.
    [24]
    K. Toutanova, D. Q Chen, P. Pantel, et al., “Representing text for joint embedding of text and knowledge bases,” in Proceedings of 2015 Conference on Empirical Methods in Natural Language Processing, Lisbon, Portugal, pp. 1499–1509, 2015.
    [25]
    T. Dettmers, P. Minervini, P. Stenetorp, et al., “Convolutional 2D knowledge graph embeddings,” in Proceedings of the Thirty-Second AAAI Conference on Artificial Intelligence, New Orleans, LA, USA, pp. 1811–1818, 2018.
    [26]
    C. Fellbaum and G. Miller, WordNet: An Electronic Lexical Database. MIT Press, Cambridge, MA, USA, 1998.
    [27]
    D. Nathani, J. Chauhan, C. Sharma, et al., “learning attention-based embeddings for relation prediction in knowledge graphs,” in Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics, Florence, Italy, pp. 4710–4723, 2019.
