Abstract:
Knowledge representation learning (KRL) is an important research topic in artificial intelligence and natural language processing. It efficiently computes the semantics of entities and relations in a low-dimensional space and mitigates the problem of data sparsity, which significantly improves the performance of knowledge acquisition, fusion, and reasoning. Organized around five perspectives — distance-based models, semantic matching models, bilinear models, neural network models, and models with additional information — this paper first introduces the overall framework and the design of specific models, and then describes the evaluation tasks, metrics, and benchmark datasets used for each model. On this basis, it summarizes how KRL is applied to various downstream tasks.
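The distance-based family named in the abstract can be illustrated with TransE-style scoring, where a relation is modeled as a translation in the embedding space (h + r ≈ t for a true triple). The sketch below is a minimal illustration, not the paper's implementation; the function name and toy vectors are assumptions:

```python
import math

def transe_score(h, r, t):
    """TransE-style plausibility score for a triple (h, r, t).

    h, r, t are embedding vectors (equal-length sequences of floats).
    The score is the L2 distance between the translated head (h + r)
    and the tail t; lower scores mean a more plausible triple.
    """
    return math.sqrt(sum((hi + ri - ti) ** 2 for hi, ri, ti in zip(h, r, t)))

# Toy example: the tail is exactly the head translated by the relation,
# so the distance (and hence the score) is zero.
h = [0.1, 0.2, 0.3]
r = [0.4, 0.0, -0.1]
t = [0.5, 0.2, 0.2]
print(transe_score(h, r, t))
```

In practice such models are trained with a margin-based ranking loss that pushes scores of corrupted triples (with a replaced head or tail) above those of observed triples.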
Date of Conference: 27-30 July 2020
Date Added to IEEE Xplore: 21 August 2020