arXiv:2004.08371 [cs.CL]

Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings

Lea Dieudonat, Kelvin Han, Phyllicia Leavitt, Esteban Marquer

Published 2020-04-17 (Version 1)

"Classical" word embeddings, such as Word2Vec, have been shown to capture the semantics of words based on their distributional properties. However, their ability to represent the different meanings a word may have is limited. Such approaches also do not explicitly encode relations between the entities denoted by words. Embeddings of knowledge bases (KB) capture the explicit relations between entities denoted by words, but cannot directly capture the syntagmatic properties of these words. To our knowledge, recent research has focused on representation learning that augments the strengths of one with the other. In this work, we begin exploring another approach, using contextual and KB embeddings jointly at the same level, and propose two tasks -- an entity typing and a relation typing task -- that evaluate the performance of contextual and KB embeddings. We also evaluate a model that concatenates contextual and KB embeddings on these two tasks, and obtain conclusive results on the first task. We hope our work may serve as a basis for models and datasets developed in the direction of this approach.
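As a concrete illustration of the concatenation approach described in the abstract, here is a minimal sketch in PyTorch: a contextual word embedding and a KB entity embedding are joined at the same level and fed to a linear entity-type classifier. The dimensions, the type inventory, and the random vectors standing in for real contextual (e.g. BERT) and KB (e.g. TransE) embeddings are assumptions for illustration, not the paper's actual setup.

```python
# Minimal sketch of entity typing over concatenated contextual + KB
# embeddings. All dimensions and the type inventory are assumed.
import torch
import torch.nn as nn

CTX_DIM = 768    # contextual embedding size (e.g. BERT-base; assumed)
KB_DIM = 200     # KB embedding size (e.g. TransE; assumed)
NUM_TYPES = 5    # hypothetical entity-type inventory

class ConcatEntityTyper(nn.Module):
    """Classify an entity's type from the concatenation of its
    contextual word embedding and its KB entity embedding."""
    def __init__(self):
        super().__init__()
        self.classifier = nn.Linear(CTX_DIM + KB_DIM, NUM_TYPES)

    def forward(self, ctx_emb, kb_emb):
        # Use both embedding types jointly, at the same level.
        joint = torch.cat([ctx_emb, kb_emb], dim=-1)
        return self.classifier(joint)

# Placeholder batch: in practice ctx_emb would come from a contextual
# encoder and kb_emb from a trained knowledge-graph embedding model.
ctx_emb = torch.randn(4, CTX_DIM)
kb_emb = torch.randn(4, KB_DIM)
logits = ConcatEntityTyper()(ctx_emb, kb_emb)
print(logits.shape)  # torch.Size([4, 5])
```

The relation typing task could be handled analogously, classifying a pair of such joint representations rather than a single one.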

Comments: pre-publication, 16 pages, 4 figures
Categories: cs.CL, cs.LG
Related articles:
arXiv:2008.12813 [cs.CL] (Published 2020-08-28)
HittER: Hierarchical Transformers for Knowledge Graph Embeddings
arXiv:2411.12074 [cs.CL] (Published 2024-11-18)
Mitigating Gender Bias in Contextual Word Embeddings
arXiv:1902.09492 [cs.CL] (Published 2019-02-25)
Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing