dependency grammar



Dependency grammars are a kind of formal grammar based on the notion of dependency graph, rather than on constituency trees. The first mathematical definitions are due to Hays (1960) and Gaifman (1965).


A dependency grammar is a tuple $G = (V, X, R, s)$ where $V$ and $X$ are finite sets of terminal and non-terminal symbols with start symbol $s \in X$, and $R \subseteq \mathcal{R}_1 \cup \mathcal{R}_2 \cup \mathcal{R}_3$ is a finite set of rules, each of one of the following shapes:

  1. $\mathcal{R}_1 = X^\star \times X \times X^\star$. The interpretation of a rule $r = (u, x, v) \in \mathcal{R}_1$ is that a word of type $x \in X$ can govern words of type $u \in X^\star$ on its left and $v \in X^\star$ on its right. We say that $u$ and $v$ are the left and right dependencies of the head of type $x$.
  2. $\mathcal{R}_2 = V \times X$. The interpretation of a rule $(w, x) \in \mathcal{R}_2$ is that the word $w \in V$ can be given type $x \in X$. These correspond to the dictionary entries in a categorial grammar.
  3. $\mathcal{R}_3 = X \times \{ s \}$. The interpretation of a rule $r = (x, s) \in \mathcal{R}_3$ is that a word of type $x \in X$ can be the head of a grammatical sentence.
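To make the three rule shapes concrete, here is a sketch in Python of a toy grammar for the sentence "the dog barks"; the type names D, N and Vb are hypothetical, not taken from the text:

```python
# Toy dependency grammar (illustrative; symbol names are hypothetical).
V = {"the", "dog", "barks"}    # terminal symbols (words)
X = {"D", "N", "Vb", "s"}      # non-terminal symbols (types)
s = "s"                        # start symbol

# R1: (left dependencies, head type, right dependencies)
R1 = {
    ((), "D", ()),             # a determiner governs nothing
    (("D",), "N", ()),         # a noun may govern a determiner on its left
    (("N",), "Vb", ()),        # a verb may govern a noun on its left
}
# R2: dictionary entries assigning types to words
R2 = {("the", "D"), ("dog", "N"), ("barks", "Vb")}
# R3: rules (x, s) saying a word of type x may head a sentence
R3 = {("Vb", s)}
```

The tuple encoding is one of several reasonable choices; any representation that distinguishes the left and right dependency strings would do.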

The language $L(G) \subseteq V^\star$ of a dependency grammar $G$ is defined as follows. A string $w_1 \dots w_n \in V^\star$ is grammatical if there are types $x_1, \dots, x_n$ and a binary relation $d \subseteq [n] \times [n]$ for $[n] = \{ 1, \dots, n \}$ with reflexive transitive closure $d^\star$ such that:

  1. $(w_i, x_i) \in \mathcal{R}_2$ for all $i \in [n]$;
  2. there is exactly one $h \in [n]$ with no governor, i.e. no $j$ with $(j, h) \in d$, and it satisfies $(x_h, s) \in \mathcal{R}_3$; every other word has exactly one governor, and $(h, i) \in d^\star$ for all $i \in [n]$, so that $d$ forms a rooted tree;
  3. for every $i \in [n]$ with left dependents $j_1 < \dots < j_k < i$ and right dependents $i < j_{k+1} < \dots < j_m$, we have $(x_{j_1} \dots x_{j_k}, x_i, x_{j_{k+1}} \dots x_{j_m}) \in \mathcal{R}_1$;
  4. $d$ is projective: if $(i, j) \in d$ then $(i, k) \in d^\star$ for every $k$ between $i$ and $j$.
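Under the Gaifman-style reading of this definition (each word typed by a dictionary rule, $d$ a projective rooted tree whose root may head a sentence, and each word's dependents matching an $\mathcal{R}_1$ rule), a checker for a proposed witness can be sketched as follows; the function name and encodings are assumptions for illustration, not from the text:

```python
from collections import deque

def is_witness(words, types, d, R1, R2, R3):
    """Check that `types` and `d` witness grammaticality of `words`.

    Positions are 0-based; d is a set of (governor, dependent) pairs;
    R3 is encoded as the set of types x such that (x, s) is a rule."""
    n = len(words)
    # 1. each word is assigned one of its dictionary types
    if any((w, x) not in R2 for w, x in zip(words, types)):
        return False
    # 2. d is a rooted tree: a unique governor-less word whose type may
    #    head a sentence, every other word has exactly one governor
    governors = {j: [i for (i, k) in d if k == j] for j in range(n)}
    roots = [j for j in range(n) if not governors[j]]
    if len(roots) != 1 or types[roots[0]] not in R3:
        return False
    if any(len(governors[j]) != 1 for j in range(n) if j != roots[0]):
        return False
    # reflexive transitive closure d* as reachability sets
    reach = {i: {i} for i in range(n)}
    for i in range(n):
        queue = deque([i])
        while queue:
            j = queue.popleft()
            for (a, b) in d:
                if a == j and b not in reach[i]:
                    reach[i].add(b)
                    queue.append(b)
    if reach[roots[0]] != set(range(n)):
        return False
    # 3. the dependents of each word must match one of its R1 rules
    for i in range(n):
        deps = sorted(j for (g, j) in d if g == i)
        left = tuple(types[j] for j in deps if j < i)
        right = tuple(types[j] for j in deps if j > i)
        if (left, types[i], right) not in R1:
            return False
    # 4. projectivity: a governor dominates every word lying between
    #    itself and each of its dependents
    for (i, j) in d:
        for k in range(min(i, j) + 1, max(i, j)):
            if k not in reach[i]:
                return False
    return True
```

For the toy grammar of "the dog barks", the witness assigns types D, N, Vb and lets the verb govern the noun, which governs the determiner.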

Drawing conventions

The conventions for depicting such a dependency relation vary; dependencies may be drawn, for instance, as arcs between the words of the string.

As string diagrams

We can reformulate the original definition in terms of the free rigid monoidal category generated by the symbols $V + X$ as objects and the rules $R$ as arrows. The convention where dependencies are represented as arcs is then precisely a string diagram with the terminal symbols as domain and the start symbol $s$ as codomain.

Expressive power

Dependency grammars have the same expressive power as context-free grammars and pregroup grammars.

We can translate a dependency rule $r = (u, x, v) \in R$ into the set of production rules $\{ x \to u w v \mid (w, x) \in R \}$ for a context-free grammar. Dependency grammars are precisely the context-free grammars with exactly one terminal symbol on the right-hand side of each production rule.
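This translation can be sketched in a few lines of Python; the function name and the tuple encodings of rules are assumptions for illustration:

```python
def dependency_to_cfg(R1, R2):
    """Turn dependency rules into CFG productions x -> u w v.

    R1: triples (u, x, v) of type tuples; R2: pairs (w, x) of word and
    type.  A production is returned as (lhs, rhs), where rhs is a tuple
    of types with exactly one terminal in the middle."""
    return {
        (x, u + (w,) + v)
        for (u, x, v) in R1
        for (w, y) in R2
        if y == x
    }

# Example: a noun that governs a determiner on its left
productions = dependency_to_cfg(
    {(("D",), "N", ())},
    {("dog", "N"), ("cat", "N")},
)
# productions == {("N", ("D", "dog")), ("N", ("D", "cat"))}
```

Note that every production produced this way has exactly one terminal on its right-hand side, matching the characterisation above.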

We can also translate a dependency rule $r = (u, x, v) \in R$ into the set of dictionary entries $\{ w \to u^r x v^l \mid (w, x) \in R \}$ for a pregroup grammar.
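A sketch of this pregroup translation, using the convention that the adjoint of a string reverses its letters, i.e. $(u_1 \dots u_k)^r = u_k^r \dots u_1^r$; the function name and string encodings are assumptions for illustration:

```python
def dependency_to_pregroup(R1, R2):
    """Turn dependency rules into pregroup dictionary entries
    w -> u^r x v^l, rendered as strings.

    The adjoint of a string reverses its letters:
    (u1 ... uk)^r = uk^r ... u1^r, and dually for left adjoints."""
    entries = set()
    for (u, x, v) in R1:
        for (w, y) in R2:
            if y == x:
                typ = ([t + "^r" for t in reversed(u)]
                       + [x]
                       + [t + "^l" for t in reversed(v)])
                entries.add((w, " ".join(typ)))
    return entries

# Example: an intransitive verb governing a noun on its left
entries = dependency_to_pregroup({(("N",), "Vb", ())}, {("barks", "Vb")})
# entries == {("barks", "N^r Vb")}
```

The entry `N^r Vb` cancels a noun on the verb's left via the pregroup contraction $n \cdot n^r \to 1$.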

However, going in the other direction, i.e. from pregroup or context-free grammars to dependency grammars, the equivalence is weak: it preserves the set of grammatical strings but not their grammatical structure.