I. Introduction
In the past decade, Knowledge Graphs (KGs) have motivated many knowledge-driven applications, such as question answering and data integration. DBpedia [1], Freebase [2] and YAGO3 [3] are well-known examples of KGs. They store knowledge as triples, each consisting of two entities and the relation between them (for example, in the triple < Tokyo, capitalOf, Japan >, Tokyo and Japan are entities and capitalOf is their relation). Representation learning for modeling KGs is an important task and the backbone of downstream tasks such as link prediction and entity classification. KG representation learning, also known as KG embedding, aims to encode KG structures into low-dimensional embeddings. These quantified embeddings can capture global patterns (also called structure-based information) and make it possible to compute the plausibility of a given triple. Such open KGs and the learned embeddings can benefit collaborative computing applications.
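To make the idea of scoring a triple's plausibility concrete, the following sketch embeds entities and relations as low-dimensional vectors and scores a triple with a TransE-style translation distance. This is a minimal illustration, not the model proposed here; the entity names, dimensionality, and random initialization are all hypothetical.

```python
import numpy as np

# Toy embeddings (hypothetical): each entity and relation is a
# low-dimensional vector, randomly initialized for illustration.
rng = np.random.default_rng(0)
dim = 8
entities = {name: rng.normal(size=dim) for name in ["Tokyo", "Japan", "Paris"]}
relations = {name: rng.normal(size=dim) for name in ["capitalOf"]}

def score(head, rel, tail):
    """TransE-style score ||h + r - t||: lower means more plausible.
    In a trained model, embeddings are learned so that true triples
    such as < Tokyo, capitalOf, Japan > receive low scores."""
    return float(np.linalg.norm(entities[head] + relations[rel] - entities[tail]))

s = score("Tokyo", "capitalOf", "Japan")
```

With trained (rather than random) embeddings, comparing such scores across candidate triples is what enables link prediction.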
Knowledge graph with numeric literals. Literals are linked to entities through attributes (for example, an attribute such as population links an entity to a numeric literal). These attributes not only differ from one another but also carry semantic meaning of their own.
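The distinction above can be sketched as two kinds of triples: relational triples connect two entities, while attributive triples attach a numeric literal to an entity through an attribute. The concrete entity names, attributes, and values below are hypothetical, for illustration only.

```python
# Relational triples: (head entity, relation, tail entity).
relational_triples = [
    ("Tokyo", "capitalOf", "Japan"),
]

# Attributive triples: (entity, attribute, numeric literal).
# Attribute names and values here are illustrative placeholders.
attributive_triples = [
    ("Tokyo", "population", 13_960_000),
    ("Tokyo", "latitude", 35.68),
]

def literals_of(entity):
    """Collect the numeric literals attached to an entity via its attributes."""
    return {attr: value for e, attr, value in attributive_triples if e == entity}

print(literals_of("Tokyo"))
# → {'population': 13960000, 'latitude': 35.68}
```

Because each attribute (population, latitude, ...) has its own semantics and value range, numeric literals cannot be treated like entity-to-entity relations and call for dedicated handling in KG embedding.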