Global Neural CCG Parsing with Optimality Guarantees
EMNLP 2016
Abstract
We introduce the first global recursive neural parsing model with optimality
guarantees during decoding. To support global features, we give up dynamic
programs and instead search directly in the space of all possible subtrees.
Although this space is exponentially large in the sentence length, we show it
is possible to learn an efficient A* parser. We augment existing parsing
models, which have informative bounds on the outside score, with a global model
that has loose bounds but only needs to model non-local phenomena. The global
model is trained with a new objective that encourages the parser to explore only a
tiny fraction of the search space. The approach is applied to CCG parsing,
improving state-of-the-art accuracy by 0.4 F1. The parser finds the optimal
parse for 99.9% of held-out sentences, exploring on average only 190 subtrees.
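To make the decoding strategy concrete, below is a minimal Python sketch of A* search over the space of subtrees, in the spirit of the abstract's description. The scoring functions word_score, combine_score, and outside_bound are placeholders, not the paper's actual neural models; the sketch only relies on outside_bound never under-estimating the true outside score, so the first full-sentence subtree popped from the agenda is optimal.

```python
import heapq
import itertools


def astar_parse(words, word_score, combine_score, outside_bound):
    """A* search over subtrees: repeatedly pop the subtree maximizing
    (inside score) + (admissible upper bound on its outside score)."""
    n = len(words)
    counter = itertools.count()   # tie-breaker so the heap never compares trees
    agenda = []                   # min-heap over negated priorities
    finished = []                 # popped subtrees: (start, end, inside, tree)

    # Seed the agenda with single-word subtrees.
    for i, w in enumerate(words):
        inside = word_score(i)
        priority = inside + outside_bound(i, i + 1, n)
        heapq.heappush(agenda, (-priority, next(counter), i, i + 1, inside, w))

    while agenda:
        _, _, start, end, inside, tree = heapq.heappop(agenda)
        if start == 0 and end == n:
            return tree, inside   # first full-sentence pop is optimal
        finished.append((start, end, inside, tree))

        # Combine with adjacent subtrees that were already popped.
        for s2, e2, inside2, tree2 in finished:
            if e2 == start:                       # neighbour on the left
                s, e, new_tree = s2, end, (tree2, tree)
            elif s2 == end:                       # neighbour on the right
                s, e, new_tree = start, e2, (tree, tree2)
            else:
                continue
            new_inside = inside + inside2 + combine_score(new_tree)
            priority = new_inside + outside_bound(s, e, n)
            heapq.heappush(agenda,
                           (-priority, next(counter), s, e, new_inside, new_tree))

    return None, float("-inf")


# Toy usage with placeholder scores (a trivially admissible outside bound of 0):
words = ["neural", "ccg", "parsing"]
tree, score = astar_parse(
    words,
    word_score=lambda i: 0.0,
    combine_score=lambda t: -1.0,
    outside_bound=lambda s, e, n: 0.0,
)
```

In this sketch the number of subtrees popped before the first full parse plays the role of the exploration count reported above; tighter outside bounds make that count smaller.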