
Term paper for the Hauptseminar "Dependenzgrammatik", summer semester 1999

An Introduction to Dependency Grammar


Ralph Debusmann
Universität des Saarlandes, Computerlinguistik
[email protected]
January 2000

Contents

1 Introduction
2 Basic Concepts
  2.1 The Intuition
  2.2 Robinson's Axioms
  2.3 The Dependency Relation
3 Dependency Grammar and PSG
  3.1 The Hays and Gaifman DG
  3.2 DG Set Against CFG
  3.3 The Issue of Projectivity
4 Separating Dependency Relations and Surface Order
  4.1 A Non-projective Dependency Grammar
  4.2 Non-projective Dependency Trees
  4.3 Separate Specification of Word Order
5 Dependency Grammar Formalisms
6 Conclusion

1 Introduction

Many linguists consider Dependency Grammar (DG) to be inferior to established phrase structure based theories like GB (Chomsky 1986), LFG (Kaplan & Bresnan 1982) and HPSG (Pollard & Sag 1994). The aim of this article is to remedy this state of affairs by making the benefits DG offers apparent to those unconvinced of it. To this end, section 2 acquaints the reader with the basic concepts of DG, before section 3 sets the theory against phrase structure based theories, arguing that it has considerable advantages in the analysis of languages with relatively free word order (e.g. German, Finnish, Japanese, Korean, Latin, Russian, ...). Section 4 describes Duchier's (1999) DG axiomatization as a prototypical example of a DG that separates dependencies and surface order. Thereafter, section 5 proceeds with an overview of current Dependency Grammar formalisms, and section 6 rounds the paper off.

2 Basic Concepts

This section acquaints those interested in Dependency Grammar with its basic concepts. It starts by illustrating the intuition behind DG.

2.1 The Intuition

Modern Dependency Grammar was created by the French linguist Lucien Tesnière (1959), but as Covington (1990) argues, DG has been in use by traditional grammarians since the Middle Ages. The observation which drives DG is a simple one: in a sentence, all but one word depend on other words. The one word that does not depend on any other is called the root[1] of the sentence. A typical DG analysis of the sentence A man sleeps is demonstrated in (1):

(1) a depends on man
    man depends on sleeps
    sleeps depends on nothing (i.e. it is the root of the sentence)

or, put differently:

    a modifies man
    man is the subject of sleeps
    sleeps is the matrix verb of the sentence

This is Dependency Grammar.
[1] The root is alternatively termed main or central element.
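To make the shape of such an analysis concrete, it can be represented as a plain mapping from each word to its head. The following minimal Python sketch (mine, not part of the original paper; the head-map encoding and the find_root helper are illustrative choices) encodes the analysis in (1):

    # A dependency analysis of "A man sleeps" as a head map:
    # each word points to the word it depends on; the root points to None.
    heads = {
        "a": "man",        # a modifies man
        "man": "sleeps",   # man is the subject of sleeps
        "sleeps": None,    # sleeps is the root (matrix verb)
    }

    def find_root(heads):
        """Return the unique word that depends on nothing."""
        roots = [w for w, h in heads.items() if h is None]
        assert len(roots) == 1, "a well-formed analysis has exactly one root"
        return roots[0]

    print(find_root(heads))  # -> sleeps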

Dependencies are motivated by grammatical function, i.e. both syntactically and semantically. A word depends on another if it is either a complement or a modifier of the latter. In most formulations of DG, for example, functional heads or governors (e.g. verbs) subcategorize for their complements. Hence, a transitive verb like love requires two complements (dependents): one noun with the grammatical function subject and one with the function object. This notion of subcategorization or valency is similar to HPSG's SUBCAT list and resembles even more closely LFG's functional completeness and coherence criteria. Figure 1 represents the analysis of (1) graphically (dependent lexemes and categories below their heads). In Tesnière's terminology, dependency graphs (or dependency trees) of this form are called stemmas. The left graph in figure 1 exhibits a real stemma, with its nodes labeled by words; the right one is a virtual stemma, with its nodes labeled by lexical categories. Nowadays, it is however common practice to use an equivalent representation which collapses both stemmas into one tree (figure 2), and this is also adopted for the rest of this article. Unlike Tesnière's stemma, this representation also includes information about the surface order of the analyzed string (written under the dividing line from left to right).

[Figure 1: Stemma representations of A man sleeps. Left: a real stemma with nodes labeled by words (sleeps above man above a); right: a virtual stemma with nodes labeled by lexical categories (V above N above Det).]

[Figure 2: Dependency tree for A man sleeps. Category nodes V, N, Det form the tree; the words a man sleeps appear in surface order under the dividing line.]

2.2 Robinson's Axioms

Loosely based on Tesnière's formulation, Hays (1964) and Gaifman (1965) were the first to study the mathematical properties of DG. Their results will be presented in section 3. A couple of years later, Robinson (1970) formulated four axioms to govern the well-formedness of dependency structures, depicted below in (2):

(2) 1. One and only one element is independent.
    2. All others depend directly on some element.
    3. No element depends directly on more than one other.
    4. If A depends directly on B and some element C intervenes between them (in the linear order of the string), then C depends directly on A or on B or on some other intervening element.

The first three of these axioms fairly elegantly capture the essential conditions for the well-formedness of dependency structures, i.e. that they shall be trees. Axioms 1 and 2 state that in each sentence, one and only one element is independent and all others depend directly on some other element. Axiom 3 states that if an element A depends directly on another element B, it must not depend on a third one C. This requirement is often referred to as single-headedness or uniqueness and is assumed in most DG formulations, starting from Tesnière's.

The fourth axiom is often called the requirement of projectivity and (roughly speaking) disallows crossing edges in dependency trees. Tesnière did not impose this condition, and not without reason. I will argue in section 3 that, if enforced, it deprives Dependency Grammar of its most relevant asset.
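As a concrete illustration of the axioms (my own sketch, not from Robinson 1970; check_robinson and its head-map input are hypothetical names), the following Python function checks all four conditions over a head map plus the words' linear order:

    def check_robinson(heads, order):
        """Check Robinson's (1970) four axioms for a dependency structure.

        heads: dict mapping each word to its head (None for the root).
        order: list of the words in their linear (surface) order.
        """
        pos = {w: i for i, w in enumerate(order)}
        roots = [w for w, h in heads.items() if h is None]

        # Axioms 1 and 2: exactly one independent element, all others headed.
        if len(roots) != 1:
            return False
        # Axiom 3 (single-headedness) holds by construction here:
        # a dict key can map to only one head.

        def ancestors(w):
            # Walk up the head chain; assumes the structure is acyclic.
            while w is not None:
                yield w
                w = heads[w]

        # Axiom 4 (projectivity): every word between a dependent and its
        # head must (transitively) depend on that head.
        for dep, head in heads.items():
            if head is None:
                continue
            lo, hi = sorted((pos[dep], pos[head]))
            for other in order[lo + 1:hi]:
                if head not in ancestors(other):
                    return False
        return True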

2.3 The Dependency Relation

To round off this account of basic DG concepts, I will restate the notion of dependency as a binary relation R, ranging over the elements W of a sentence. A mapping M maps W to the actual words of a sentence (for an example, see figure 3). Now for $w_1, w_2 \in W$, $\langle w_1, w_2 \rangle \in R$ asserts that $w_1$ is dependent on $w_2$. The properties of R presented in (3) are, like Robinson's (1970) axioms, nothing else but treeness constraints on dependency graphs.
[Figure 3: Example mapping M for Mary loves another Mary: $w_1 \ldots w_4 \in W$ are mapped to the words in surface order.]

(3) 1. $R \subseteq W \times W$
    2. $\forall w_1, w_2, \ldots, w_k \in W : \langle w_1, w_2 \rangle \in R \wedge \ldots \wedge \langle w_{k-1}, w_k \rangle \in R \rightarrow w_1 \neq w_k$ (acyclicity)
    3. $\exists! w_1 \in W : \neg\exists w_2 \in W : \langle w_1, w_2 \rangle \in R$ (rootedness)
    4. $\forall w_1, w_2, w_3 \in W : \langle w_1, w_2 \rangle \in R \wedge \langle w_1, w_3 \rangle \in R \rightarrow w_2 = w_3$ (single-headedness)

From acyclicity it follows that R is also asymmetric (i.e. $\forall w_1, w_2 \in W : \langle w_1, w_2 \rangle \in R \rightarrow \langle w_2, w_1 \rangle \notin R$; if both pairs were in R, the chain $w_1, w_2, w_1$ would violate acyclicity). The asymmetry of R is accounted for in dependency trees by an implicitly assumed ordering from top to bottom, i.e. for every two elements connected by a dependency edge, the lower one depends on the upper one. Asymmetry also guarantees that R is irreflexive ($\forall w_1 \in W : \langle w_1, w_1 \rangle \notin R$). Moreover, observe that condition 4 is a counterpart to Robinson's single-headedness axiom 3. The projectivity requirement is not reflected by any of the conditions in (3).

3 Dependency Grammar and PSG

This section sets Dependency Grammar against phrase structure based approaches. It begins with a description of a DG formulation developed by Hays (1964) and Gaifman (1965).

3.1 The Hays and Gaifman DG

A couple of years after Tesnière had defined DG, Hays (1964) and Gaifman (1965) were the first to study its mathematical properties. Their aim was to find a mathematical axiomatization of DG to facilitate the development of parsing and generation algorithms, and they came up with a formulation of DG that is still highly influential among DG theorists (e.g. Lai & Huang 1998, Lombardo & Lesmo 1998). The intuition behind Hays and Gaifman's axiomatization is this: if a dependency relation R (as defined in section 2.3) holds for $\langle w_1, x \rangle \ldots \langle w_k, x \rangle$, all $w_i$ ($i \in \{1 \ldots k\}$) are dependent on x (or alternatively, x governs all $w_i$, or is the head of all $w_i$). Hays and Gaifman use the following rule (plus two special-case rules) to capture this notion:

(4) 1. $x(w_1, \ldots, *, \ldots, w_k)$ : $w_1 \ldots w_k$ are dependent on x
    2. $x(*)$ : x is a leaf node
    3. $*(x)$ : x is a sentence root node

The star indicates the position of the governor x in the linear order of words[2] $w_1 \ldots w_k$. Such a DG for the sentence John loves a woman consists of five rules. (5) and figure 4 depict these rules and a dependency tree analyzing the sentence, respectively.

(5) $*(V)$
    $V(N, *, N)$
    $N(Det, *)$
    $N(*)$
    $Det(*)$

The attentive reader will note the addition of dotted lines connecting the lexical category nodes with the words of the sentence. They depict a projection of the lexical categories onto the words of the sentence.
[2] Actual Hays and Gaifman DG rules are expressed over categories instead of words.

[Figure 4: Hays and Gaifman DG analysis of John loves a woman; dotted projection lines connect the category nodes V, N, Det, N to the words.]

To conclude the explanation of Hays and Gaifman DGs, (6) shows what such DGs look like in formal terms:

(6) $DG = \langle R, L, C, F \rangle$
    R : a set of dependency rules over the auxiliary symbols C
    L : a set of terminal symbols (lexemes)
    C : a set of auxiliary symbols (lexical categories)
    F : an assignment function ($F : L \rightarrow C$)

A Hays and Gaifman DG (HGDG) complies with all of Robinson's axioms (section 2.2) for well-formed dependency structures, and with axiom 4 (projectivity) in particular. Graphically, this condition requires that no projection edge be crossed by any dependency edge in HGDG analysis trees.
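To make the definition in (6) concrete, here is a small sketch of my own (not Hays' or Gaifman's notation) encoding the grammar (5) as Python data, with "*" marking the governor's position:

    # A Hays and Gaifman DG as a 4-tuple <R, L, C, F>, encoding grammar (5).
    # Each dependency rule pairs a category with its dependent pattern;
    # "*" marks the governor's own position in the linear order.
    R = {
        "V":   [("N", "*", "N")],   # V(N, *, N): a verb governs two nouns
        "N":   [("Det", "*"),       # N(Det, *): a noun may govern a determiner
                ("*",)],            # N(*): or be a leaf
        "Det": [("*",)],            # Det(*): determiners are leaves
    }
    roots = {"V"}                   # *(V): a V may be the sentence root
    L = {"John", "loves", "a", "woman"}  # terminal symbols (lexemes)
    C = {"V", "N", "Det"}                # auxiliary symbols (categories)
    F = {"John": "N", "loves": "V", "a": "Det", "woman": "N"}  # assignment

    # Example: which dependent patterns does the category of "loves" allow?
    print(R[F["loves"]])  # -> [('N', '*', 'N')]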

3.2 DG Set Against CFG

By specifying the mathematical properties of DG, Hays (1964) and Gaifman (1965) did not only aim at easing the development of algorithms for DG parsing and generation, but also at being able to formally set DG against Context-Free Grammars (CFGs). Hays (1964), for instance, establishes that DGs as defined in section 3.1 are weakly equivalent to CFGs in the sense that:

(7) they have the same terminal alphabet; for every string over that alphabet, every structure attributed by either grammar corresponds to a structure attributed by the other.

So both CFG and HGDG are able to produce corresponding analyses for any string, but the analysis structures assigned to these strings are not necessarily the same. Thus, as Hays (1964) points out, CFG and HGDG are not strongly equivalent. It is not possible to map every CFG to a corresponding HGDG that attributes equivalent tree structures to every string, simply because an HGDG cannot produce nonterminal nodes other than preterminals. The reverse, however, is possible. (8) exhibits a proof procedure to map any HGDG to a corresponding CFG attributing equivalent structures to any string analyzed. This procedure has been applied to (9) to generate its corresponding CFG (10).

(8) $DG = \langle R, L, C, F \rangle$ : A Hays and Gaifman DG consists of a set of dependency rules R, a set of terminal symbols L, a set of nonterminal symbols C and an assignment function F ($F : L \rightarrow C$).

    $CFG = \langle P, T, N, S \rangle$ : A CFG consists of sets of production rules P, terminal symbols T, nonterminal symbols N and start symbols S.

    I will now present a proof procedure to map any DG of the kind Hays and Gaifman proposed onto a CFG attributing equivalent analysis trees to any string. To this end, every DG rule must be converted into one or more CFG production rules. Recall that these are the three rule types of a Hays and Gaifman DG:

    1. $x(w_1, \ldots, *, \ldots, w_k)$ : $w_1 \ldots w_k$ are dependent on x
    2. $x(*)$ : x is a leaf node
    3. $*(x)$ : x is a sentence root node

    Rules of the first type are converted to CFG rules using the following procedure: collect all terminal symbols that are of category x in a set X, i.e. $X = F^{-1}(x)$. Now postulate a CFG production rule for each $y \in X$:

    $x \rightarrow w_1 \ldots y \ldots w_k$

    Rules of the second type are a special case of the first type. For each $y \in X$, postulate:

    $x \rightarrow y$

    Rules of the third type are not converted to production rules but make up the set of start symbols S of the corresponding CFG. Hence S is the union of all auxiliary symbols contained in rules of the type $*(x)$. Finally, the set of terminal symbols T of the corresponding CFG is equal to the set of terminal symbols L of the Hays and Gaifman DG, and the set of nonterminal symbols N is equal to the set of auxiliary symbols C.

(9) $DG = \langle R, L, C, F \rangle$
    $R = \{*(V),\ V(N, *, N),\ N(Det, *),\ N(*),\ Det(*)\}$
    $L = \{loves, woman, John, a\}$
    $C = \{V, N, Det\}$
    $F(loves) = V$, $F(woman) = N$, $F(John) = N$, $F(a) = Det$

(10) $CFG = \langle P, T, N, S \rangle$
     $P = \{V \rightarrow N\ loves\ N,\ N \rightarrow Det\ woman,\ N \rightarrow Det\ John,\ N \rightarrow woman,\ N \rightarrow John,\ Det \rightarrow a\}$
     $T = L$
     $N = C$
     $S = \{V\}$

As can be seen from (8), a CFG converted from an HGDG has exactly one terminal symbol on the right-hand side of every production rule. This resembles the Greibach Normal Form, where each production rule is of the form $A \rightarrow a\ x_1 \ldots x_n$, where a must be a terminal symbol and all $x_i$ nonterminal symbols. Grammars in Greibach Normal Form are, like HGDGs, weakly equivalent to CFGs. Figure 5 now depicts an analysis of John loves a woman using the CFG given in (10). As can be seen, the analysis tree of the same sentence using a converted CFG is equivalent to the dependency tree from figure 4, except that the dotted projection edges are replaced by solid edges from preterminal to terminal nodes.
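The conversion procedure in (8) is mechanical enough to state as code. The following sketch (my own, reusing the hypothetical encoding from section 3.1) maps the dependency rules of (9) to the production rules of (10):

    def hgdg_to_cfg(R, roots, L, C, F):
        """Convert a Hays and Gaifman DG <R, L, C, F> to a CFG <P, T, N, S>.

        R maps each category to its dependent patterns ("*" marks the
        governor); roots is the set of categories x with a rule *(x).
        """
        P = set()
        for x, patterns in R.items():
            X = [w for w, c in F.items() if c == x]  # X = F^-1(x)
            for pattern in patterns:
                for y in X:
                    # Replace the star by the word y itself; every other
                    # position keeps its (nonterminal) category symbol.
                    rhs = tuple(y if sym == "*" else sym for sym in pattern)
                    P.add((x, rhs))
        return P, set(L), set(C), set(roots)

    R = {"V": [("N", "*", "N")], "N": [("Det", "*"), ("*",)], "Det": [("*",)]}
    roots = {"V"}
    L = {"John", "loves", "a", "woman"}
    C = {"V", "N", "Det"}
    F = {"John": "N", "loves": "V", "a": "Det", "woman": "N"}

    P, T, N, S = hgdg_to_cfg(R, roots, L, C, F)
    for lhs, rhs in sorted(P):
        print(lhs, "->", " ".join(rhs))
    # Prints the six productions of (10), e.g. V -> N loves N, Det -> a, ...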

[Figure 5: CFG analysis tree of John loves a woman using (10); it matches figure 4, with solid preterminal-to-terminal edges in place of the dotted projection edges.]

Using this result, it is easy to see that the projectivity condition must hold for all analyses of a Hays and Gaifman DG:

(11) Violating the projectivity condition would require crossing edges in dependency trees resulting from such a DG. But as every such DG can be mapped 1:1 to a CFG that generates exactly the same structures, and because crossing edges cannot occur in CFG analysis trees, crossing edges can also not occur in dependency trees resulting from the analysis of an HGDG. Hence projectivity is a consequence of the definition of the HGDG.

3.3 The Issue of Projectivity

In the previous section, an examination of Hays and Gaifman's axiomatization of DG has led to the conclusion that such a DG is merely a notational variant of CFG. So why not simply stop here and quit doing any research on dependency grammars? Because DGs of the form Hays and Gaifman proposed do not represent Dependency Grammar in general. If they did, i.e. if dependency trees in general had to be projective, perfectly sensible DG analyses would have to be abandoned, especially when analyzing languages with a high degree of word order variation like Latin, Russian or German. Consider for instance the analysis of the Latin sentence ultima Cumaei venit iam carminis aetas, taken from Covington (1990). Figure 6 displays a perfectly intuitive analysis of the sentence in terms of dependency, but if one complied with the condition of projectivity, it would have to be abandoned and other, probably less intuitive solutions sought.

[Figure 6: Non-projective DG analysis of the Latin sentence ultima Cumaei venit iam carminis aetas, "The last epoch of the Cumean song has now arrived." (Vergil, Eclogues IV.4). Gloss: ultima "last" (nom.), Cumaei "Cumean" (gen.), venit "has come", iam "now", carminis "song" (gen.), aetas "epoch" (nom.); category nodes V, Adv, Adj, Adj, N, N.]

If the latter sounds familiar to the reader, it is because this is precisely what PSG theorists do when confronted with discontinuous constituents or other word order variation phenomena. In generative syntax, for instance, one solution is scrambling rules (e.g. Ross 1967). But at least insofar as computational linguistics is concerned, these rules are like any other rules involving transformations: they become unfeasible as soon as one proceeds to implement efficient parsing algorithms. Another solution for treating discontinuous constituents in a phrase structure based framework has been pioneered (and then rejected) by Uszkoreit (1986, 1987), in the paradigm of ID/LP formalisms like GPSG (Gazdar, Klein, Pullum & Sag 1985). He proposed to replace the transformationalists' scrambling rules by flattening rules that discard constituent structure and make most words hang directly from the S node. The problem with this approach is that a flat structure is no structure or tree at all; it only claims that the words below the S node form a sentence. Imagine how hard a task it would be to construct a semantics out of such a syntactic description.

A recent proposal for treating discontinuous constituents in HPSG is due to Müller (e.g. Müller 1999) and makes use of word order domains to describe extraposition and permutability of constituents in German. Constraining analysis trees to binary branching ones, Müller (1999) utilizes the shuffle relation (see Reape 1994) to allow for discontinuous constituents. Müller's (1999) proposal does succeed in providing a treatment of German scrambling phenomena within the HPSG framework, but since in HPSG, word order is not totally separable from other syntactic (functional) considerations, his analyses lack the elegance of non-projective dependency analyses like the one exhibited in figure 6.

So none of the three popular solutions for the treatment of discontinuous constituents in phrase structure based theories described above is perfect, and that stems from the inability of PSG to totally separate word order from configurational or dependency issues. The same difficulties consequently plague projective Dependency Grammars based on the Hays and Gaifman axiomatization, since these are just notational variants of CFG. But there is a feasible escape route. As will be shown in the next section, all these problems can be solved by employing a non-projective[3] formulation of DG, separating dependency relations from surface order.
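For concreteness, the analysis of figure 6 can be fed to the hypothetical check_robinson sketch from section 2.2: axioms 1 to 3 hold, but the projectivity check fails, which is exactly the point of the example.

    heads = {
        "ultima": "aetas",     # adj
        "Cumaei": "carminis",  # adj
        "venit": None,         # root
        "iam": "venit",        # adv
        "carminis": "aetas",   # genitive
        "aetas": "venit",      # subject
    }
    order = ["ultima", "Cumaei", "venit", "iam", "carminis", "aetas"]
    # venit intervenes between ultima and its head aetas without
    # depending on aetas, so axiom 4 is violated:
    print(check_robinson(heads, order))  # -> False (non-projective)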

4 Separating Dependency Relations and Surface Order

In this section, I will present a formulation of Dependency Grammar that succeeds in cleanly separating dependency relations and surface order, and by this means also lifts the projectivity condition. The approach described is due to Duchier (1999).

4.1 A Non-projective Dependency Grammar

Duchier's (1999) DG can be regarded as a 7-tuple consisting of finite sets:

(12) $DG = \langle Words, Cats, Agrs, Comps, Mods, Lexicon, Rules \rangle$

where Words is a set of strings of fully inflected word forms, Cats a set of lexical categories such as V for verb and N for noun, and Agrs a set of agreement tuples such as $\langle$masc sing 3 nom$\rangle$. The union of Comps and Mods forms the set of all role types Roles (e.g. subject, adv). Furthermore, Lexicon is a set of lexical entries like the one shown in (13):

(13) string : loves
     cat    : V
     agr    : $\langle$sing 3 nom$\rangle$
     comps  : {subject, object}

Finally, Rules is a family of binary predicates, one for each $\rho \in Roles$, called role constraints.
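As an illustration (my own sketch; the LexEntry record and its field names are hypothetical, not Duchier's notation), a lexical entry such as (13) can be rendered as a small record type:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class LexEntry:
        """A lexical entry in the style of (13)."""
        string: str
        cat: str                        # lexical category from Cats
        agr: tuple = ()                 # agreement tuple from Agrs
        comps: frozenset = frozenset()  # obligatory complement roles
        mods: frozenset = frozenset()   # optional modifier roles

    loves = LexEntry(
        string="loves",
        cat="V",
        agr=("sing", "3", "nom"),
        comps=frozenset({"subject", "object"}),
    )
    print(loves.comps)  # -> frozenset({'subject', 'object'})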
[3] Interestingly, the founder of modern DG, Tesnière, never imposed anything like the projectivity constraint.


4.2 Non-projective Dependency Trees

Duchier (1999) assumes an infinite set Nodes of nodes and defines a labeled directed edge to be an element of $Nodes \times Nodes \times Roles$. Hence, given a set $V \subseteq Nodes$ representing the words of a sentence and a set $E \subseteq V \times V \times Roles$ of labeled edges between these nodes, $\langle V, E \rangle$ is a directed graph with labeled edges. For these graphs, Duchier (1999) imposes treeness constraints like those in section 2.2 (Robinson's first three axioms) or section 2.3 (dependency relation R). Every node of a dependency tree then contributes a word to the sentence, whose position is represented by a mapping index from nodes to integers. Furthermore, every node must be assigned syntactic features, which is realized by a mapping entry from nodes to lexical entries. A dependency tree is then defined as a 4-tuple:

(14) $T = \langle V, E, index, entry \rangle$

An analysis of the Latin sentence from section 3.3 in Duchier's (1999) DG is shown in figure 7.
[Figure 7: Analysis of the Latin sentence in Duchier's (1999) DG, edges annotated with functional labels. The root venit (index 3) governs aetas (6) via subject and iam (4) via adv; aetas governs ultima (1) via adj and carminis (5) via genitive; carminis governs Cumaei (2) via adj. The lexical entries of the six nodes:

    ultima   : cat Adj, agr ⟨fem sing 3 nom⟩, comps {}, mods {}, index 1
    Cumaei   : cat Adj, agr ⟨fem sing 3 gen⟩, comps {}, mods {}, index 2
    venit    : cat V, agr ⟨fem sing 3 nom⟩, comps {subject}, mods {adv}, index 3
    iam      : cat Adv, comps {}, mods {}, index 4
    carminis : cat N, agr ⟨fem sing 3 gen⟩, comps {}, mods {adj}, index 5
    aetas    : cat N, agr ⟨fem sing 3 nom⟩, comps {}, mods {genitive, adj}, index 6]


The outstanding feature of Duchier (1999) is that he defines the well-formedness of dependency trees without reference to word order. In addition to basic treeness constraints, the only constraints he imposes upon dependency structures are valency constraints and role constraints. Roughly, the former express that each complement role (comps, e.g. subject, object) of a node must be licensed by exactly one complement dependent, and that modifiers (mods, e.g. adv, adj) can optionally occur as dependents. The latter specify additional restrictions for each $\rho \in Roles$, e.g. that adjectives may only modify nouns.
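A rough Python sketch of the valency constraint as just paraphrased (my own illustration, not Duchier's set-constraint formulation; check_valency, the LexEntry fields and the (head, dependent, role) edge triples are assumptions carried over from the sketches above):

    def check_valency(nodes, edges, entry):
        """Check valency constraints, roughly stated: every complement role
        of a node is filled exactly once, and every outgoing edge is
        licensed either by comps or by mods."""
        for node in nodes:
            lex = entry[node]
            out_roles = [role for head, dep, role in edges if head == node]
            # Each complement role must be licensed by exactly one dependent.
            for comp in lex.comps:
                if out_roles.count(comp) != 1:
                    return False
            # Any other outgoing role must be an optional modifier role.
            for role in out_roles:
                if role not in lex.comps and role not in lex.mods:
                    return False
        return True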

4.3 Separate Specification of Word Order

Nothing has been said as yet about word order in Duchier's (1999) approach. Clearly, even languages like Latin have some restrictions on word order variation that need to be captured to avoid overgeneration. For example, prepositions must appear before the noun they modify in the linear order of words. An easy solution for specifying word order constraints is to use role constraints: $preposition(w_1, w_2) : index(w_1) < index(w_2)$ would constrain prepositions to stand before their noun dependents.

Sometimes, it is more adequate to specify conditions on the linear order of more than two nodes at a time. In this case, role constraints do not suffice, since they can only specify binary conditions. For English, one would for instance wish to express that noun phrases typically look like this: Det < Adj < N, i.e. a noun may be preceded by adjectives and a determiner, but the determiner always precedes the adjectives (if realized). In Duchier's (1999) axiomatization, this requirement can be expressed as shown below:

(15) $Seq(det(w), adj(w), n(w))$

(15) constrains the determiner to always precede the adjectives, and the adjectives to precede the noun they modify[4]. As can be seen, Duchier (1999) has created a non-projective DG which very cleanly separates dependency relations and word order. Such DGs are able to declaratively and precisely describe grammars of natural languages with any degree of word order variation. Duchier has already developed highly efficient parsers for English and German, applying state-of-the-art constraint technology embedded in the Oz Programming Language (Mozart 1998). And because of the semantic nature of dependency analyses, it is fairly easy to extend Duchier (1999) with a semantics component. In the CHORUS project at the University of Saarland, an underspecified semantics construction module has been seamlessly integrated into Duchier's parsers.
[4] Furthermore, the determiner is optional, and an unrestricted number of adjectives may be placed between it and the noun.
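To illustrate, an n-ary order constraint like (15) reduces to a check over the index mapping. A minimal sketch (mine; seq and the dict-based node encoding are hypothetical, not Duchier's formulation):

    def seq(*groups):
        """Seq constraint: every node in an earlier group must precede
        every node in a later group in the surface order (by index)."""
        for earlier, later in zip(groups, groups[1:]):
            for a in earlier:
                for b in later:
                    if a["index"] >= b["index"]:
                        return False
        return True

    # Hypothetical dependents of a noun node w: det(w), adj(w), n(w).
    det  = [{"string": "the", "index": 1}]
    adjs = [{"string": "old", "index": 2}, {"string": "grey", "index": 3}]
    noun = [{"string": "cat", "index": 4}]
    print(seq(det, adjs, noun))  # -> True: Det < Adj < N holds

Note that an empty group satisfies the constraint vacuously, which matches the optionality of the determiner mentioned in footnote [4].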


5 Dependency Grammar Formalisms

This section provides an overview of current DG flavors, with an emphasis on how these formulations of DG cope with word order variation.

Functional Generative Description (Sgall, Hajičová & Panevová 1986): Sgall et al. assume a language-independent underlying order, represented as a projective dependency tree, which is mapped via ordering rules to the concrete surface realization. The theory is multistratal, distinguishing five levels of representation.

Dependency Unification Grammar (Hellwig 1986): DUG defines a tree-like data structure for the representation of syntactic analyses. The theory is non-projective and handles surface order using positional features, by which partial orderings and discontinuities can also be handled.

Meaning Text Theory (Mel'čuk 1988): Mel'čuk's formalism assumes seven strata of representation, and uses rules for mapping the unordered dependency trees of surface-syntactic representations onto the annotated lexeme sequences of deep-morphological representations. Discontinuities are accounted for by global ordering rules.

Word Grammar (Hudson 1990): WG is based on general graphs instead of trees. The ordering of two linked words is specified together with their dependency relation, and extraction of, e.g., objects is analyzed by establishing an additional dependency called visitor between the verb and the extractee. Hence WG does not cleanly separate dependencies from word order.

Functional Dependency Grammar (Järvinen & Tapanainen 1997): FDG distinguishes between dependency rules and rules for surface linearization. It follows Tesnière's model not only in being non-projective but also by adopting Tesnière's notion of nuclei. Nuclei are the primitive elements of FDG structures, possibly consisting of multiple lexemes.

Bröker (1998): Surface order and dependency structures constitute two separate pieces of information. Bröker links structurally dissimilar word order domain structures to dependency trees to achieve a lexicalized, declarative and formally precise natural language description.

This list is not entirely complete, but it should provide a first brief overview of current DG formalisms and, in particular, their methods for dealing with word order.


6 Conclusion

The aim of this article was to get through to those who think of Dependency Grammar as inferior to phrase structure based approaches, often because of a lack of familiarity with the theory. Therefore, the basic DG concepts were presented in section 2, starting from the original intuition and closing with the specification of a formal dependency relation R. The next section (section 3) set DG against phrase structure based theories, beginning with a presentation of the Hays and Gaifman DG and a comparison with Context-Free Grammar. Section 3 went on to discuss the requirement of projectivity, and ended up proposing to drop this constraint in order to be able to adequately analyze word order variation. Thereafter, section 4 described Duchier's (1999) axiomatization as a prototypical example of a non-projective DG, before section 5 continued with an overview of current DG flavors, with an emphasis on how they treat word order variation.

All in all, I think that DGs have undeniable advantages for describing languages with a higher degree of word order variation than English. But these advantages can only emerge if one lifts the constraint of projectivity and treats surface order separately from dependency. An argument by Rambow & Joshi (1994), stating that no well-behaved parsers for such DGs exist and that non-projective DGs are for this reason hampered, can be countered by mentioning recent advances (e.g. Bröker 1998, Duchier 1999, Järvinen & Tapanainen 1997) alone.

Last but not least, I wish to thank Denys Duchier, Malte Gabsdil, Christian Pietsch and Stefan Thater for useful criticism and suggestions in the course of writing this article.

References

Bröker, N. (1998), Separating surface order and syntactic relations in a dependency grammar, in COLING-ACL 98 - Proc. of the 17th Intl. Conf. on Computational Linguistics and 36th Annual Meeting of the ACL, Montréal/CAN.

Chomsky, N. (1986), Knowledge of Language: Its Nature, Origin, and Use, Praeger, New York/NY.

Covington, M. A. (1990), A dependency parser for variable-word-order languages, Research Report AI-1990-01, Artificial Intelligence Programs, University of Georgia, Athens/GA.

Duchier, D. (1999), Axiomatizing dependency parsing using set constraints, in Sixth Meeting on Mathematics of Language, Orlando/FL.

Gaifman, H. (1965), Dependency systems and phrase-structure systems, Information and Control 8(3), 304–337.

Gazdar, G., Klein, E., Pullum, G. & Sag, I. (1985), Generalized Phrase Structure Grammar, B. Blackwell, Oxford/UK.

Hays, D. G. (1964), Dependency theory: A formalism and some observations, Language 40, 511–525.

Hellwig, P. (1986), Dependency unification grammar, in Proc. of the 11th Int. Conf. on Computational Linguistics, pp. 195–198.

Hudson, R. A. (1990), English Word Grammar, B. Blackwell, Oxford/UK.

Järvinen, T. & Tapanainen, P. (1997), A dependency parser for English, Technical report, Department of General Linguistics, University of Helsinki, Helsinki/FIN.

Kaplan, R. M. & Bresnan, J. (1982), Lexical-functional grammar: A formal system for grammatical representation, in J. Bresnan & R. Kaplan, eds, The Mental Representation of Grammatical Relations, MIT Press, Cambridge/MA, pp. 173–281.

Lai, T. B. & Huang, C. (1998), Complements and adjuncts in dependency grammar parsing emulated by a constrained context-free grammar, in Processing of Dependency-based Grammars: Proceedings of the Workshop, COLING-ACL 98, Montréal/CAN, pp. 102–108.

Lombardo, V. & Lesmo, L. (1998), Unit coordination and gapping in dependency theory, in Processing of Dependency-based Grammars: Proceedings of the Workshop, COLING-ACL 98, Montréal/CAN, pp. 11–20.

Mel'čuk, I. (1988), Dependency Syntax: Theory and Practice, State Univ. Press of New York, Albany/NY.

Mozart (1998). http://www.mozart-oz.org/.

Müller, S. (1999), Deutsche Syntax deklarativ. Head-Driven Phrase Structure Grammar für das Deutsche, number 394 in Linguistische Arbeiten, Max Niemeyer Verlag, Tübingen.

Pollard, C. & Sag, I. (1994), Head-Driven Phrase Structure Grammar, Univ. of Chicago Press, Chicago/IL.

Rambow, O. & Joshi, A. K. (1994), A formal look at dependency grammars and phrase-structure grammars, with special consideration of word-order phenomena, in L. Wanner, ed., Current Issues in Meaning-Text Theory, Pinter, London/UK.

Reape, M. (1994), Domain union and word order variation in German, in J. Nerbonne, K. Netter & C. Pollard, eds, German in Head-Driven Phrase Structure Grammar, CSLI, Stanford/CA.

Robinson, J. J. (1970), Dependency structures and transformation rules, Language 46, 259–285.

Ross, J. R. (1967), Constraints on Variables in Syntax, PhD thesis, MIT.

Sgall, P., Hajičová, E. & Panevová, J. (1986), The Meaning of the Sentence in its Semantic and Pragmatic Aspects, D. Reidel, Dordrecht/NL.

Tesnière, L. (1959), Éléments de Syntaxe Structurale, Klincksieck, Paris/FRA.

Uszkoreit, H. (1986), Categorial unification grammar, in COLING 86, pp. 187–194.

Uszkoreit, H. (1987), Word Order and Constituent Structure in German, CSLI, Stanford/CA.

