
arXiv:1310.5042 [cs.LG]

Distributional semantics beyond words: Supervised learning of analogy and paraphrase

Peter D. Turney

Published 2013-10-18 (Version 1)

There have been several efforts to extend distributional semantics beyond individual words, to measure the similarity of word pairs, phrases, and sentences (briefly, tuples; ordered sets of words, contiguous or noncontiguous). One way to extend beyond words is to compare two tuples using a function that combines pairwise similarities between the component words in the tuples. A strength of this approach is that it works with both relational similarity (analogy) and compositional similarity (paraphrase). However, past work required hand-coding the combination function for different tasks. The main contribution of this paper is that combination functions are generated by supervised learning. We achieve state-of-the-art results in measuring relational similarity between word pairs (SAT analogies and SemEval 2012 Task 2) and measuring compositional similarity between noun-modifier phrases and unigrams (multiple-choice paraphrase questions).
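The core idea in the abstract can be sketched in a few lines: build a feature vector of pairwise similarities between the component words of two tuples, then learn the weights of the combination function from labelled examples instead of hand-coding them. The sketch below is an illustration only, not the paper's method: the toy word vectors are invented, and ordinary least squares stands in for the supervised learner.

```python
import numpy as np

# Toy word vectors (hypothetical; in practice these would come from a
# distributional model such as a word-context frequency matrix).
VECS = {
    "dog":    np.array([0.9, 0.1, 0.3]),
    "puppy":  np.array([0.8, 0.2, 0.3]),
    "cat":    np.array([0.1, 0.9, 0.2]),
    "kitten": np.array([0.2, 0.8, 0.3]),
    "car":    np.array([0.5, 0.5, 0.9]),
    "wheel":  np.array([0.4, 0.6, 0.8]),
}

def cosine(u, v):
    """Cosine similarity between two word vectors."""
    return float(u @ v / (np.linalg.norm(u) * np.linalg.norm(v)))

def pairwise_features(tuple_a, tuple_b):
    """Feature vector: every pairwise similarity between the
    component words of the two tuples."""
    return np.array([cosine(VECS[a], VECS[b])
                     for a in tuple_a for b in tuple_b])

# Instead of hand-coding the combination function, fit its weights
# from labelled tuple pairs (least squares as a stand-in learner).
X = np.array([
    pairwise_features(("dog", "puppy"), ("cat", "kitten")),  # analogous pair
    pairwise_features(("dog", "puppy"), ("car", "wheel")),   # non-analogous
])
y = np.array([1.0, 0.0])
weights, *_ = np.linalg.lstsq(X, y, rcond=None)

def combined_similarity(tuple_a, tuple_b):
    """Learned combination of the pairwise similarities."""
    return float(pairwise_features(tuple_a, tuple_b) @ weights)
```

The same machinery covers both tasks in the paper: for relational similarity the tuples are word pairs (e.g. `("mason", "stone")` vs. `("carpenter", "wood")`), while for compositional similarity one tuple is a noun-modifier phrase and the other a unigram.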

Journal: Transactions of the Association for Computational Linguistics (TACL), 1 (2013), 353-366
Categories: cs.LG, cs.AI, cs.CL, cs.IR
Subjects: H.3.1, I.2.6, I.2.7
Related articles:
arXiv:2002.03555 [cs.LG] (Published 2020-02-10): Supervised Learning: No Loss No Cry
arXiv:cs/0508053 [cs.LG] (Published 2005-08-10): Measuring Semantic Similarity by Latent Relational Analysis
arXiv:1610.02413 [cs.LG] (Published 2016-10-07): Equality of Opportunity in Supervised Learning