arXiv Analytics

arXiv:1608.02076 [cs.CL]

Bi-directional Attention with Agreement for Dependency Parsing

Hao Cheng, Hao Fang, Xiaodong He, Jianfeng Gao, Li Deng

Published 2016-08-06 (Version 1)

We develop a novel bi-directional attention model for dependency parsing, which learns to agree on headword predictions from the forward and backward parsing directions. The parsing procedure for each direction is formulated as sequentially querying the memory component that stores continuous headword embeddings. The proposed parser makes use of soft headword embeddings, allowing the model to implicitly capture high-order parsing history without dramatically increasing the computational complexity. We conduct experiments on English, Chinese, and 12 other languages from the CoNLL 2006 shared task, showing that the proposed model achieves state-of-the-art unlabeled attachment scores on 7 languages.
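The abstract's core mechanism — querying a memory of continuous headword embeddings from both parsing directions and combining the two head distributions — can be sketched as follows. This is a hypothetical illustration under assumed details (dot-product attention scoring, product-of-experts agreement), not the authors' implementation; the names `head_distribution` and `soft_head` are invented for clarity.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over attention scores
    e = np.exp(x - x.max())
    return e / e.sum()

def head_distribution(query, memory):
    """Attention over candidate headword embeddings (assumed dot-product scoring)."""
    return softmax(memory @ query)

rng = np.random.default_rng(0)
d, n = 4, 5                       # embedding size, number of head candidates
memory = rng.normal(size=(n, d))  # memory of continuous headword embeddings
q_fwd = rng.normal(size=d)        # forward-direction query for one token
q_bwd = rng.normal(size=d)        # backward-direction query for the same token

p_fwd = head_distribution(q_fwd, memory)  # forward head distribution
p_bwd = head_distribution(q_bwd, memory)  # backward head distribution

# Agreement: combine the two directions' predictions (product of experts,
# one plausible way to make the directions agree on the headword)
p_agree = p_fwd * p_bwd
p_agree /= p_agree.sum()

# Soft headword embedding: expectation of head embeddings under the
# distribution, a differentiable stand-in for a hard head choice
soft_head = p_agree @ memory
```

Feeding `soft_head` (rather than a single argmax head) into subsequent parsing steps is what lets a model of this kind carry high-order parsing history implicitly without enumerating discrete head combinations.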

Related articles:
arXiv:1606.01280 [cs.CL] (Published 2016-06-03)
Dependency Parsing as Head Selection
arXiv:1805.01087 [cs.CL] (Published 2018-05-03)
Stack-Pointer Networks for Dependency Parsing
arXiv:1805.05202 [cs.CL] (Published 2018-05-14)
A Dynamic Oracle for Linear-Time 2-Planar Dependency Parsing