Stanford CS224n Natural Language Processing, Course 5

Course 5 - Dependency Parsing

Syntactic Structure: Constituency and Dependency

1. Two views of linguistic structure: Constituency = phrase structure grammar = context-free grammars (CFGs)

Phrase structure organizes words into nested constituents

Dependency Grammar and Treebanks

Dependency structure shows which words depend on which other words.

Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies.

The arrows are commonly typed with the name of grammatical relations (subject, prepositional object, apposition, etc.).

Usually, dependencies form a tree (connected, acyclic, single-headed).
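These tree constraints are easy to check in code. Below is a small illustrative sketch (the function name and example sentence are mine, not from the lecture): a parse is stored as one head index per word, with 0 denoting the ROOT pseudo-token, so the single-head property holds by construction, and connectedness plus acyclicity reduce to "every word's head chain reaches ROOT".

```python
# A dependency parse as a list of head indices: heads[i] is the head of
# word i+1, and 0 denotes the ROOT pseudo-token. One head per word means
# the single-head constraint holds by construction.

def is_valid_tree(heads):
    """Check the connected + acyclic constraints of a dependency tree."""
    n = len(heads)
    for i in range(n):
        # Walk up from each word; a valid tree reaches ROOT (0) in at
        # most n steps, otherwise there must be a cycle.
        node, steps = i + 1, 0
        while node != 0:
            node = heads[node - 1]
            steps += 1
            if steps > n:
                return False  # cycle detected
    return True

# "She saw the video lecture": She->saw, saw->ROOT, the->lecture,
# video->lecture, lecture->saw
print(is_valid_tree([2, 0, 5, 5, 2]))  # True
print(is_valid_tree([2, 3, 1]))        # False: 1 -> 2 -> 3 -> 1 is a cycle
```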

Transition-based dependency parsing

A simple form of greedy discriminative dependency parser.

MaltParser (Nivre and Hall, 2005)

Each action is predicted by a discriminative classifier (e.g., a softmax classifier) over each legal move.

  • Max of 3 untyped choices; max of |R| × 2 + 1 when typed
  • Features: top-of-stack word and POS; first word in the buffer and its POS

It provides very fast linear-time parsing, with fractionally lower accuracy.
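The transition mechanics behind such parsers can be sketched in a few lines. The following is an illustrative arc-standard system (function names and the example sentence are mine); in a real parser the action sequence comes from the trained classifier, whereas here a hand-chosen sequence just demonstrates how Shift, Left-Arc, and Right-Arc build arcs:

```python
# Minimal arc-standard transition system. Words are numbered from 1;
# index 0 is the ROOT pseudo-token.

def parse(n_words, actions):
    stack, buffer, arcs = [0], list(range(1, n_words + 1)), []
    for act in actions:
        if act == "SHIFT":
            stack.append(buffer.pop(0))       # move next word onto the stack
        elif act == "LEFT-ARC":
            dep = stack.pop(-2)               # second-from-top depends on top
            arcs.append((stack[-1], dep))
        elif act == "RIGHT-ARC":
            dep = stack.pop()                 # top depends on second-from-top
            arcs.append((stack[-1], dep))
    return arcs  # list of (head, dependent) pairs

# "I ate fish": I <- ate -> fish, and ate attaches to ROOT
actions = ["SHIFT", "SHIFT", "LEFT-ARC", "SHIFT", "RIGHT-ARC", "RIGHT-ARC"]
print(parse(3, actions))  # [(2, 1), (2, 3), (0, 2)]
```

Note that the parser is greedy: each action is committed immediately, which is what makes the overall runtime linear in sentence length.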

Evaluation of Dependency Parsing: (labeled) dependency accuracy


Neural dependency parsing

Why train a neural dependency parser?

Problem #1: the indicator features are sparse

Problem #2: the features are incomplete

Problem #3: feature computation is expensive (95%+ of parsing time is consumed by feature computation)

Chen and Manning (2014)
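The Chen and Manning approach replaces sparse indicator features with dense embeddings fed through a small feed-forward network that scores transitions. A minimal sketch with illustrative (much smaller than real) sizes, using randomly initialized weights in place of trained ones:

```python
import numpy as np

# Illustrative shapes only: the real model embeds ~48 tokens (words, POS
# tags, arc labels) and is trained; here weights are random.
rng = np.random.default_rng(0)
vocab, d, n_feats, hidden, n_actions = 100, 50, 6, 200, 3

E = rng.normal(size=(vocab, d))              # embedding matrix
W1 = rng.normal(size=(hidden, n_feats * d))  # input -> hidden
b1 = np.zeros(hidden)
W2 = rng.normal(size=(n_actions, hidden))    # hidden -> transition scores

def predict(feature_ids):
    x = E[feature_ids].reshape(-1)   # look up and concatenate embeddings
    h = (W1 @ x + b1) ** 3           # cube activation, as in the paper
    logits = W2 @ h
    return int(np.argmax(logits))    # greedy choice of next transition

# e.g. ids of top-of-stack / front-of-buffer words and POS tags
print(predict([5, 12, 7, 3, 99, 42]))
```

Because every feature is a dense vector lookup plus a couple of matrix multiplies, the expensive symbolic feature computation of older parsers disappears.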

English parsing to Stanford Dependencies:

  • Unlabeled attachment score (UAS) = correct head
  • Labeled attachment score (LAS) = correct head and label
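Both metrics are per-word accuracies and are simple to compute. A short sketch (the function name and toy parses are mine): each word is a (head, label) pair, UAS counts correct heads, LAS counts correct head-and-label pairs.

```python
# gold / pred: one (head_index, relation_label) pair per word.

def attachment_scores(gold, pred):
    uas = sum(g[0] == p[0] for g, p in zip(gold, pred)) / len(gold)
    las = sum(g == p for g, p in zip(gold, pred)) / len(gold)
    return uas, las

gold = [(2, "nsubj"), (0, "root"), (2, "obj")]
pred = [(2, "nsubj"), (0, "root"), (2, "iobj")]  # last label is wrong
uas, las = attachment_scores(gold, pred)
print(uas, las)  # 1.0 0.666... (all heads right, one label wrong)
```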

Google: bigger, deeper networks + beam search + CRF-style inference

Graph-based neural dependency parsing

Graph-based neural dependency parsers achieve over 95% attachment accuracy.
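In contrast to transition-based parsing, a graph-based parser scores every candidate (head, dependent) pair and then decodes a tree over the whole score matrix. A minimal decoding sketch with a random score matrix (names and sizes are illustrative): greedily pick the best head per word, noting that a real parser runs a maximum-spanning-tree algorithm to guarantee a well-formed tree.

```python
import numpy as np

# scores[h, d] = model score for head h governing word d+1; row 0 is ROOT.
rng = np.random.default_rng(1)
n = 4                                  # 4 words plus ROOT at index 0
scores = rng.normal(size=(n + 1, n))

# Greedy decoding: best-scoring head for each word independently.
# (An MST decoder would enforce the tree constraint instead.)
heads = scores.argmax(axis=0)
print(heads)  # one head index per word, each in 0..n
```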