Triple Crossover: Alibaba’s CrossE Improves Knowledge Graph Embedding

This article is part of the Academic Alibaba series and is taken from the WSDM 2019 paper entitled “Interaction Embeddings for Prediction and Explanation in Knowledge Graphs” by Wen Zhang, Bibek Paudel, Wei Zhang, Abraham Bernstein, and Huajun Chen. The full paper can be read here.

Behind many common information retrieval tasks, such as everyday web search, a tool for organizing information called a knowledge graph is constantly working to make a broad range of known entities discoverable. Composed of semantic triples, each linking two entities through a defined relation, these graphs provide a single destination for locating data and its associated meaning using search terms based on natural human language.

One major focus for knowledge graph developers is improving the ability of these graphs to autonomously infer new triples, making them more complete. An example would be inferring the unknown entity in the following triple:

[David] — [is the father of] — [?]

To do so, systems must work to understand other established entities and connections that can shed light on the problem, such as triples describing David’s marital relations or the parent-child relationships of any of his spouses. Unfortunately, previous models have relied heavily on external information like text corpora. To truly empower a question answering or recommender system, for instance, what is needed is a knowledge graph that can learn from existing triples without additional help.
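To make the task concrete, the toy sketch below ranks candidate tail entities for a query like the one above using nothing but learned embeddings and a scoring function. It uses a TransE-style distance score as a stand-in (a standard baseline, not CrossE itself), and all entities, relations, and embedding values are made up for illustration, so the resulting ranking is meaningless; only the mechanics matter.

```python
import numpy as np

# Toy knowledge graph: each fact is a (head, relation, tail) triple.
triples = [
    ("David", "hasWife", "Mary"),
    ("Mary", "hasChild", "Tom"),
]

# Hypothetical 4-dimensional embeddings "learned" from existing triples only.
rng = np.random.default_rng(0)
entities = {e: rng.normal(size=4) for e in ["David", "Mary", "Tom", "Anna"]}
relations = {r: rng.normal(size=4) for r in ["hasWife", "hasChild", "isFatherOf"]}

def transe_score(h, r, t):
    """TransE-style plausibility score: higher (less negative) is more plausible."""
    return -np.linalg.norm(entities[h] + relations[r] - entities[t])

# Rank every candidate tail entity for the query (David, isFatherOf, ?).
ranking = sorted(entities, key=lambda t: transe_score("David", "isFatherOf", t),
                 reverse=True)
print(ranking)  # best-scoring candidate first
```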

Now, by focusing on the underexplored area of crossover interactions, researchers at Alibaba, Zhejiang University, and the University of Zurich have developed a knowledge graph embedding (KGE) model called CrossE that achieves this by learning multiple triple-specific embeddings for each graph entity and relation. As well as achieving state-of-the-art triple prediction, the model proved able to provide reliable explanations for its choices in tests with complex datasets.

Understanding Crossover Interactions

In knowledge graph triples, crossover interactions are effects passing between entities and relations: relations influence which information about an entity is relevant, while entities influence how a relation is inferred. Importantly, these effects are bi-directional, impacting both entities and relations at the same time, a fact which previous work has generally overlooked.

A hypothetical knowledge graph, where nodes and edges represent entities and relations

As an example, take the dashed line in the image above, which represents the prediction task “[X] — [isFatherOf] — [?]”. Note that the nature of the relation “isFatherOf” (i.e. a family connection) affects the relevance of triples about entity X during prediction: triples shown in red connect relevant entities, while those shown in blue connect irrelevant entities. Meanwhile, the path that links entity X to entity M affects the way in which the relation “isFatherOf” is predicted. Thus, the existing path “[X] — [hasWife] — [Z]; [Z] — [hasChild] — [M]” is selected for prediction instead of the non-existent path “[M] — [fatherIs] — [X]”.

Earlier models have attempted to capture the information these effects describe using general embeddings or multiple separately learned embeddings. To improve on these, CrossE’s researchers developed a novel KGE model that not only learns one general embedding for each entity and relation but also generates multiple triple-specific interaction embeddings through a relation-specific interaction matrix.
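As a rough illustration of that mechanism, the NumPy sketch below shows how a relation-specific interaction vector (one row of an interaction matrix C) can turn the general embeddings of a head entity and a relation into triple-specific interaction embeddings before scoring a candidate tail. The element-wise composition, dimensions, and variable names here are only a loose approximation of the paper’s formulation, not a faithful implementation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

d = 4                              # embedding dimension (illustrative)
rng = np.random.default_rng(0)

h = rng.normal(size=d)             # general embedding of the head entity
r = rng.normal(size=d)             # general embedding of the relation
t = rng.normal(size=d)             # general embedding of a candidate tail
c_r = rng.normal(size=d)           # relation-specific row of the interaction matrix C
b = np.zeros(d)                    # bias term (assumed, for illustration)

# Crossover interactions: the relation reshapes the head embedding,
# and the interacted head in turn reshapes the relation embedding.
h_interaction = c_r * h            # triple-specific head embedding
r_interaction = h_interaction * r  # triple-specific relation embedding

# Combine the interaction embeddings and score the candidate tail.
q = np.tanh(h_interaction + r_interaction + b)
score = sigmoid(q @ t)
print(score)
```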

Overview of CrossE; the interaction embeddings in the unshaded boxes are derived through crossover interactions from general embeddings E and R and interaction matrix C

Additionally, the model uses an embedding-based path-searching algorithm to provide explanations for the KGE’s predictions, building on previous work to improve the balance between accuracy and understandability of results in areas such as recommender systems.
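To give a flavor of such explanations, the simplified sketch below enumerates two-hop relation paths in a toy graph that connect the head of a predicted triple to its predicted tail. The paper’s algorithm searches for and ranks this kind of support using embedding similarity, which this plain enumeration does not attempt; all names are illustrative.

```python
from collections import defaultdict

# Toy graph stored as adjacency lists: head -> list of (relation, tail).
edges = defaultdict(list)
for h, r, t in [("X", "hasWife", "Z"), ("Z", "hasChild", "M"), ("X", "worksAt", "W")]:
    edges[h].append((r, t))

def two_hop_explanations(head, tail):
    """Return relation paths of length two linking head to tail."""
    paths = []
    for r1, mid in edges[head]:
        for r2, end in edges[mid]:
            if end == tail:
                paths.append((r1, r2))
    return paths

# Candidate explanation paths for the prediction (X, isFatherOf, M).
print(two_hop_explanations("X", "M"))  # e.g. [('hasWife', 'hasChild')]
```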

Measuring Predictions: Experimental Results

To evaluate CrossE, the researchers used the popular WN18, FB15k, and FB15k-237 benchmark datasets for a link prediction task, followed by a test of the model’s ability to generate explanations for predicted triples.

Compared with seven baseline models, CrossE achieved results similar to the baselines on the WN18 data in some metrics while outperforming them in others. On the more challenging FB15k and FB15k-237 data, however, CrossE demonstrated state-of-the-art performance, indicating that accounting for crossover interactions improves its ability to encode diverse relations in knowledge graphs. At the same time, it outperformed the baselines at providing supported explanations for its predictions.
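For context, link prediction on these benchmarks is conventionally reported with mean reciprocal rank (MRR) and Hits@K, computed from the rank each model assigns to the correct entity of every test triple. The sketch below shows only the basic calculation and does not reproduce the filtered-ranking protocol typically used; the ranks are invented.

```python
def mrr_and_hits(ranks, k=10):
    """Compute mean reciprocal rank and Hits@k from 1-based ranks of correct entities."""
    mrr = sum(1.0 / r for r in ranks) / len(ranks)
    hits = sum(1 for r in ranks if r <= k) / len(ranks)
    return mrr, hits

# Illustrative ranks only; real numbers come from scoring every candidate entity.
print(mrr_and_hits([1, 3, 12, 2], k=10))  # (0.479..., 0.75)
```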

The full paper can be read here.

Alibaba Tech

First-hand and in-depth information about Alibaba’s latest technology → Facebook: “Alibaba Tech”. Twitter: “AlibabaTech”.
