arXiv Analytics

arXiv:1902.09492 [cs.CL]

Cross-Lingual Alignment of Contextual Word Embeddings, with Applications to Zero-shot Dependency Parsing

Tal Schuster, Ori Ram, Regina Barzilay, Amir Globerson

Published 2019-02-25 (Version 1)

We introduce a novel method for multilingual transfer that utilizes deep contextual embeddings, pretrained in an unsupervised fashion. While contextual embeddings have been shown to yield richer representations of meaning than their static counterparts, aligning them poses a challenge due to their dynamic nature. To this end, we construct context-independent variants of the original monolingual spaces and use their mapping to derive an alignment for the context-dependent spaces. The resulting mapping readily supports processing of a target language, improving transfer through context-aware embeddings. Our experimental results demonstrate the effectiveness of this approach for zero-shot and few-shot learning of dependency parsing. Specifically, our method consistently outperforms the previous state of the art on 6 target languages, yielding an improvement of 6.8 LAS points on average.
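A minimal sketch of the anchor-based alignment idea the abstract describes, assuming that averaging each word type's contextual vectors yields a context-independent "anchor" and that a small seed dictionary of word pairs is available. The helper names and toy data below are hypothetical, and the closed-form orthogonal Procrustes solution is a standard choice for fitting the anchor mapping, not a claim about the authors' exact implementation:

```python
import numpy as np

def compute_anchors(contextual_embeddings):
    """Average each word type's contextual vectors into one static anchor,
    giving a context-independent variant of the contextual space."""
    return {w: np.mean(vecs, axis=0) for w, vecs in contextual_embeddings.items()}

def learn_anchor_alignment(src_anchors, tgt_anchors, dictionary):
    """Fit an orthogonal map W minimizing ||W a_src - a_tgt|| over the
    dictionary pairs (orthogonal Procrustes, solved in closed form via SVD)."""
    A = np.stack([src_anchors[s] for s, t in dictionary])  # (n_pairs, d)
    B = np.stack([tgt_anchors[t] for s, t in dictionary])  # (n_pairs, d)
    U, _, Vt = np.linalg.svd(B.T @ A)                      # maximize tr(W^T B^T A)
    return U @ Vt                                          # (d, d) orthogonal map

# Toy data: word -> array of contextual vectors from different sentences.
rng = np.random.default_rng(0)
d = 8
src_ctx = {"dog": rng.normal(size=(3, d)), "cat": rng.normal(size=(2, d))}
tgt_ctx = {"chien": rng.normal(size=(4, d)), "chat": rng.normal(size=(2, d))}

W = learn_anchor_alignment(
    compute_anchors(src_ctx),
    compute_anchors(tgt_ctx),
    [("dog", "chien"), ("cat", "chat")],
)

# The same W learned on anchors is then applied to full contextual vectors,
# so each token embedding is mapped into the other language's space:
aligned_token = W @ src_ctx["dog"][0]
```

Which direction to map is a design choice; for zero-shot parsing one would align the target language's contextual embeddings into the space the parser was trained on, so the trained model can consume them unchanged.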

Related articles:
arXiv:2411.12074 [cs.CL] (Published 2024-11-18)
Mitigating Gender Bias in Contextual Word Embeddings
arXiv:2004.08371 [cs.CL] (Published 2020-04-17)
Exploring the Combination of Contextual Word Embeddings and Knowledge Graph Embeddings
arXiv:2406.13229 [cs.CL] (Published 2024-06-19)
Probing the Emergence of Cross-lingual Alignment during LLM Training