
A Novel Approach to Analyzing Defects: Enhancing Knowledge Graph Embedding Models for Main Electrical Equipment

  • Conference paper
Advanced Intelligent Computing Technology and Applications (ICIC 2023)

Part of the book series: Lecture Notes in Computer Science (LNAI, volume 14090)


Abstract

Defects in main electrical equipment threaten the safety of electric power grids and place significant risk and pressure on dispatching operations. We treat defect analysis for main electrical equipment as a link prediction task on a knowledge graph. We find that using pre-trained language models such as BERT to extract node features and supply the initial embeddings can significantly improve the effectiveness of knowledge graph embedding models (KGEMs). However, this does not always help and can even degrade performance. To address this, we propose a transfer learning method that fine-tunes the pre-trained model on a small domain-specific electric power corpus. We further apply principal component analysis (PCA) to reduce the dimensionality of the extracted features, lowering the computational cost of the KGEMs. Experimental results show that our model effectively improves link prediction performance in analyzing defects in main electrical equipment.



Acknowledgment

This work is supported by the Major Program of Xiamen (3502Z20231006), the National Natural Science Foundation of China (62176227, U2066213), and the Fundamental Research Funds for the Central Universities (20720210047).

Author information


Corresponding author

Correspondence to Zhihong Zhang.



Copyright information

© 2023 The Author(s), under exclusive license to Springer Nature Singapore Pte Ltd.

About this paper


Cite this paper

Chen, Y. et al. (2023). A Novel Approach to Analyzing Defects: Enhancing Knowledge Graph Embedding Models for Main Electrical Equipment. In: Huang, DS., Premaratne, P., Jin, B., Qu, B., Jo, KH., Hussain, A. (eds) Advanced Intelligent Computing Technology and Applications. ICIC 2023. Lecture Notes in Computer Science(), vol 14090. Springer, Singapore. https://doi.org/10.1007/978-981-99-4761-4_60


  • DOI: https://doi.org/10.1007/978-981-99-4761-4_60

  • Publisher Name: Springer, Singapore

  • Print ISBN: 978-981-99-4760-7

  • Online ISBN: 978-981-99-4761-4

  • eBook Packages: Computer Science, Computer Science (R0)
