arXiv Analytics

arXiv:2008.12813 [cs.CL]

HittER: Hierarchical Transformers for Knowledge Graph Embeddings

Sanxing Chen, Xiaodong Liu, Jianfeng Gao, Jian Jiao, Ruofei Zhang, Yangfeng Ji

Published 2020-08-28, Version 1

This paper examines the challenging problem of learning representations of entities and relations in a complex multi-relational knowledge graph. We propose HittER, a hierarchical Transformer model that jointly learns entity-relation composition and relational contextualization based on a source entity's neighborhood. The proposed model consists of two Transformer blocks: the bottom block extracts features from each entity-relation pair in the local neighborhood of the source entity, and the top block aggregates the relational information from the outputs of the bottom block. We further design a masked entity prediction task to balance information from the relational context against information from the source entity itself. Evaluated on the task of link prediction, our approach achieves new state-of-the-art results on two standard benchmark datasets, FB15K-237 and WN18RR.
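The two-level design described in the abstract can be illustrated with a minimal NumPy sketch. This is not the paper's implementation: the embeddings, dimensions, pooling choices, and shared weight matrices below are all illustrative assumptions; it only shows the shape of the hierarchy, where a bottom block encodes each (entity, relation) pair and a top block attends across the resulting neighborhood features.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def self_attention(X, W_q, W_k, W_v):
    # single-head scaled dot-product self-attention over rows of X
    Q, K, V = X @ W_q, X @ W_k, X @ W_v
    d = Q.shape[-1]
    return softmax(Q @ K.T / np.sqrt(d)) @ V

rng = np.random.default_rng(0)
d = 8            # toy embedding dimension (illustrative)
n_neighbors = 3  # neighbors of the source entity (illustrative)

# hypothetical toy embeddings for (entity, relation) pairs in the neighborhood
pairs = [(rng.normal(size=d), rng.normal(size=d)) for _ in range(n_neighbors)]

# a single shared weight set for both blocks, purely to keep the sketch short
W_q, W_k, W_v = (rng.normal(size=(d, d)) for _ in range(3))

# bottom block: encode each entity-relation pair into one feature vector
bottom_out = np.stack([
    self_attention(np.stack([e, r]), W_q, W_k, W_v).mean(axis=0)
    for e, r in pairs
])

# top block: aggregate relational context across the whole neighborhood
top_out = self_attention(bottom_out, W_q, W_k, W_v)

# pooled contextual representation of the source entity's neighborhood
context = top_out.mean(axis=0)
print(context.shape)  # (8,)
```

In the actual model each block is a full Transformer stack with its own parameters, and a masked entity prediction objective controls how much the final representation relies on this relational context versus the source entity itself.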

Related articles: Most relevant | Search more
arXiv:1909.08402 [cs.CL] (Published 2019-09-18)
Enriching BERT with Knowledge Graph Embeddings for Document Classification
arXiv:2009.12517 [cs.CL] (Published 2020-09-26)
QuatRE: Relation-Aware Quaternions for Knowledge Graph Embeddings
arXiv:2401.07977 [cs.CL] (Published 2024-01-15, updated 2024-09-27)
Towards Efficient Methods in Medical Question Answering using Knowledge Graph Embeddings