Dependency parsing

Dependency parsing analyzes the syntactic structure of sentences by establishing relationships between words, known as dependencies, which are typically represented as a tree structure. This process is crucial for understanding complex ideas in human communication, as it allows models to interpret language accurately. Various methods, including transition-based and neural dependency parsing, are employed to create these dependency trees from input sentences.

Dependency Parsing

Dependency Grammar and Dependency Structure:

 Phrase structure organizes words into nested constituents.

 Starting unit: words, e.g. the, cat, cuddly, by, door

 Words combine into phrases, e.g. the cuddly cat; by the door

 Phrases can combine into bigger phrases, e.g. the cuddly cat by the door
DEPARTMENT OF INFORMATION TECHNOLOGY, SCOE,KOPARGAON
Dependency Parsing
Dependency Grammar and Dependency Structure:
Dependency structure shows which words depend on (modify, attach to, or
are arguments of) which other words.

Why do we need sentence structure?

 Humans communicate complex ideas by composing words together into bigger units to convey complex meanings.

 A model needs to understand sentence structure in order to be able to interpret language correctly.



Dependency Parsing
Prepositional phrase attachment ambiguity:



Dependency Parsing
 Dependency syntax postulates that syntactic structure consists of relations between lexical items, normally binary asymmetric relations ("arrows") called dependencies.

 Syntactic structure consists of:
   lexical items, and
   binary asymmetric relations (dependencies).

 Dependencies are typed with the name of a grammatical relation.

 Dependencies form a tree.


Dependency Parsing

 The dependency structure of a sentence shows which words depend on (modify or are arguments of) which other words.

 These binary asymmetric relations between the words are called dependencies and are depicted as arrows going from the head (or governor, superior, regent) to the dependent (or modifier, inferior, subordinate).

 The arrow connects a head (governor) and a dependent (modifier).

 Usually these dependencies form a tree structure.
Dependency Parsing
 Dependency tree for the sentence "Bills on ports and immigration were submitted by Senator Brownback, Republican of Kansas".


Dependency Parsing
 Dependency parsing is the task of analyzing the syntactic dependency structure of a given input sentence S.


Dependency Parsing
 Dependency relations:





Dependency Parsing
 Dependency parsing is the task of analyzing the syntactic dependency structure of a given input sentence S.

 The output of a dependency parser is a dependency tree where the words of the input sentence are connected by typed dependency relations.


Dependency Parsing
Dependency Formalisms:

 A dependency structure can be represented as a directed graph G = (V, A), consisting of a set of vertices V and a set of ordered pairs of vertices A, called arcs.

 The set of vertices, V, corresponds exactly to the set of words in a given sentence.

 The set of arcs, A, captures the head-dependent and grammatical function relationships between the elements in V.


Dependency Parsing
Dependency Formalisms:

 The dependency tree is a directed graph that satisfies the following constraints:

1. There is a single designated root node that has no incoming arcs.
2. With the exception of the root node, each vertex has exactly one incoming arc.
3. There is a unique path from the root node to each vertex in V.

 Taken together, these constraints ensure that each word has a single head, that the dependency structure is connected, and that there is a single root node from which one can follow a unique directed path to each of the words in the sentence.
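The three constraints above can be checked directly on a set of (head, dependent) arcs. A minimal sketch in Python, assuming words are numbered 1..n with ROOT as vertex 0 (the function name and encoding are illustrative, not from the slides):

```python
def is_valid_dependency_tree(n, arcs):
    """Check the three dependency-tree constraints for words 1..n (ROOT = 0).

    arcs is a set of (head, dependent) pairs.
    """
    heads = {}
    for h, d in arcs:
        if d == 0:            # constraint 1: the root has no incoming arcs
            return False
        if d in heads:        # constraint 2: at most one incoming arc per vertex
            return False
        heads[d] = h
    if len(heads) != n:       # constraint 2: every non-root vertex needs a head
        return False
    for w in range(1, n + 1): # constraint 3: a path from the root to each vertex
        seen, cur = set(), w
        while cur != 0:
            if cur in seen:   # a cycle means no path from the root
                return False
            seen.add(cur)
            cur = heads[cur]
    return True

# "Book me a flight": Book(1) heads me(2) and flight(4); a(3) attaches to flight
print(is_valid_dependency_tree(4, {(0, 1), (1, 2), (4, 3), (1, 4)}))  # True
print(is_valid_dependency_tree(4, {(0, 1), (3, 2), (2, 3), (1, 4)}))  # False: cycle
```

Following the head pointers upward from each word is enough here: if every word has exactly one head and the walk always reaches ROOT without revisiting a vertex, the unique-path constraint holds.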


Dependency Parsing
 Projectivity

 The notion of projectivity imposes an additional constraint that is derived from the order of the words in the input.

 An arc from a head to a dependent is said to be projective if there is a path from the head to every word that lies between the head and the dependent in the sentence.

 A dependency tree is then said to be projective if all the arcs that make it up are projective.
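This definition translates directly into code: an arc is projective when every intervening word is reachable from the head by following arcs downward. A sketch assuming words numbered 1..n, ROOT as 0, and arcs as (head, dependent) pairs (names and encoding are illustrative):

```python
def arc_is_projective(arcs, h, d):
    """An arc h -> d is projective if every word strictly between h and d
    in the sentence is reachable from h by following arcs."""
    children = {}
    for head, dep in arcs:
        children.setdefault(head, []).append(dep)
    reachable, stack = set(), [h]
    while stack:                      # depth-first walk from the head h
        node = stack.pop()
        for c in children.get(node, []):
            if c not in reachable:
                reachable.add(c)
                stack.append(c)
    lo, hi = min(h, d), max(h, d)
    return all(w in reachable for w in range(lo + 1, hi))

def tree_is_projective(arcs):
    return all(arc_is_projective(arcs, h, d) for h, d in arcs)

# Projective: no crossing arcs when drawn above the sentence
print(tree_is_projective({(0, 2), (2, 1), (2, 4), (4, 3)}))  # True
# Non-projective: the arc 4 -> 1 spans words 2 and 3, which do not hang from 4
print(tree_is_projective({(0, 2), (2, 4), (4, 1), (2, 3)}))  # False
```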


Dependency Parsing
 Projectivity

 A dependency tree is projective if it can be drawn with no crossing edges.

 In this example, the arc from "flight" to its modifier "was" is non-projective since there is no path from "flight" to the intervening words "this" and "morning". As we can see from this diagram, projectivity (and non-projectivity) can be detected in the way we've been drawing our trees.
Dependency Parsing
Transition-Based Dependency Parsing:
 Transition-based dependency parsing relies on a state machine which
defines the possible transitions to create the mapping from the input
sentence to the dependency tree.

 The learning problem is to induce a model which can predict the next
transition in the state machine based on the transition history.

 The parsing problem is to construct the optimal sequence of transitions for the input sentence, given the previously induced model. Most transition-based systems do not make use of a formal grammar.
Dependency Parsing
Transition-Based Dependency Parsing:
 This architecture draws on shift-reduce parsing, a paradigm originally developed for analyzing programming languages.


Dependency Parsing
Transition-Based Dependency Parsing:
 In transition-based parsing we'll have a stack on which we build the parse, a buffer of tokens to be parsed, and a parser which takes actions on the parse via a predictor called an oracle.

 The parser walks through the sentence left-to-right, successively shifting items from the buffer onto the stack.

 At each time point we examine the top two elements on the stack, and the oracle makes a decision about what transition to apply to build the parse.
Dependency Parsing
Transition-Based Dependency Parsing:
The possible transitions correspond to the intuitive actions one might take in creating a dependency tree by examining the words in a single pass over the input from left to right:

 Assign the current word as the head of some previously seen word.
 Assign some previously seen word as the head of the current word.
 Postpone dealing with the current word, storing it for later processing.


Dependency Parsing
Transition-Based Dependency Parsing:
We’ll formalize this intuition with the following three transition operators that will operate on the top two elements of the stack:

 LEFTARC: Assert a head-dependent relation between the word at the top of the stack and the second word; remove the second word from the stack.
 RIGHTARC: Assert a head-dependent relation between the second word on the stack and the word at the top; remove the top word from the stack.
 SHIFT: Remove the word from the front of the input buffer and push it onto the stack.
Dependency Parsing
Transition-Based Dependency Parsing:
 There are some preconditions for applying these operators.

 The LEFTARC operator cannot be applied when ROOT is the second element of the stack (since by definition the ROOT node cannot have any incoming arcs).

 Both the LEFTARC and RIGHTARC operators require two elements to be on the stack to be applied.

"This particular set of operators implements what is known as the arc standard approach to transition-based parsing."
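The three operators and their preconditions can be exercised with a tiny simulator. The sketch below replays a hand-written gold transition sequence for the sentence "book me the morning flight"; the function name and the fixed sequence standing in for a learned oracle are illustrative assumptions:

```python
def arc_standard_parse(words, transitions):
    """Apply an arc-standard transition sequence; return (head, dependent) arcs."""
    stack, buffer, arcs = ["ROOT"], list(words), []
    for t in transitions:
        if t == "SHIFT":
            stack.append(buffer.pop(0))       # front of buffer onto the stack
        elif t == "LEFTARC":
            # preconditions: two items on the stack, and ROOT is not the second
            assert len(stack) >= 2 and stack[-2] != "ROOT"
            arcs.append((stack[-1], stack.pop(-2)))  # top of stack is the head
        elif t == "RIGHTARC":
            assert len(stack) >= 2                   # precondition
            arcs.append((stack[-2], stack.pop()))    # second word is the head
    return arcs

gold = ["SHIFT", "SHIFT", "RIGHTARC", "SHIFT", "SHIFT", "SHIFT",
        "LEFTARC", "LEFTARC", "RIGHTARC", "RIGHTARC"]
print(arc_standard_parse(["book", "me", "the", "morning", "flight"], gold))
# [('book', 'me'), ('flight', 'morning'), ('flight', 'the'),
#  ('book', 'flight'), ('ROOT', 'book')]
```

Note that the final RIGHTARC attaches "book" to ROOT, leaving only ROOT on the stack and an empty buffer, which is the terminal configuration.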


Dependency Parsing
 Neural Dependency Parsing:

 The network contains an input layer [x_w, x_t, x_l], a hidden layer, and a final softmax layer.

 We can either define a single weight matrix in the hidden layer, to operate on a concatenation of [x_w, x_t, x_l], or we can use three weight matrices [W1_w, W1_t, W1_l], one for each input type. We then apply a non-linear function and use one more affine layer [W2] so that there are as many softmax probabilities as there are possible transitions (the output dimension).
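The forward pass described above can be sketched in NumPy. All dimensions are illustrative assumptions, random values stand in for trained weights, and tanh is chosen here only as an example non-linearity (the Chen and Manning parser this architecture follows actually uses a cube activation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes, not from the slides
d_w, d_t, d_l = 50, 20, 20     # word, POS-tag, and arc-label input sizes
h, n_trans = 200, 3            # hidden size; SHIFT / LEFTARC / RIGHTARC

x_w = rng.standard_normal(d_w)  # word-embedding features
x_t = rng.standard_normal(d_t)  # POS-tag features
x_l = rng.standard_normal(d_l)  # arc-label features

# One weight matrix per input type: W1_w, W1_t, W1_l
W1_w = rng.standard_normal((h, d_w))
W1_t = rng.standard_normal((h, d_t))
W1_l = rng.standard_normal((h, d_l))
b1 = np.zeros(h)
W2 = rng.standard_normal((n_trans, h))  # final affine layer

hidden = np.tanh(W1_w @ x_w + W1_t @ x_t + W1_l @ x_l + b1)  # non-linearity
logits = W2 @ hidden
probs = np.exp(logits - logits.max())
probs /= probs.sum()   # softmax: one probability per possible transition
print(probs)
```

Summing the three per-type products is equivalent to multiplying one large matrix by the concatenation [x_w, x_t, x_l], which is why the slides present the two variants as interchangeable.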


Dependency Parsing

 Neural Dependency Parsing:

 Uses pre-trained word embeddings.

 Part-of-speech tags and dependency labels are also represented as vectors.

 Ultimately, the aim of the model is to predict a transition sequence from some initial configuration c to a terminal configuration, in which the dependency parse tree is encoded.


Dependency Parsing
 Neural Dependency Parsing:

 As the model is greedy, it attempts to correctly predict one transition T ∈ {SHIFT, LEFT-ARC_r, RIGHT-ARC_r} at a time, based on features extracted from the current configuration c = (σ, β, A).

 Recall that σ is the stack, β the buffer, and A the set of dependency arcs for the given sentence.
Dependency Parsing
 Neural Dependency Parsing:

 Depending on the desired complexity of the model, there is flexibility in defining the input to the neural network. The features for a given sentence S generally include some subset of:

1. S_word: vector representations for some of the words in S (and their dependents) at the top of the stack σ and buffer β.
2. S_tag: part-of-speech (POS) tags for some of the words in S. POS tags comprise a small, discrete set: P = {NN, NNP, NNS, DT, JJ, ...}
3. S_label: the arc labels for some of the words in S. The arc labels comprise a small, discrete set describing the dependency relation: L = {amod, tmod, nsubj, csubj, dobj, ...}
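A minimal sketch of such feature extraction, taking the top k words of the stack and buffer plus their POS tags. The function name, the padding token, and k = 3 are illustrative assumptions; the Chen and Manning parser uses a richer fixed set of 18 word positions plus tag and label slots:

```python
def extract_features(stack, buffer, pos_tags, k=3):
    """Top-k stack words (topmost first), top-k buffer words, then POS tags.
    Positions that do not exist are filled with the '<PAD>' token."""
    s = stack[-k:][::-1]
    s += ["<PAD>"] * (k - len(s))
    b = buffer[:k]
    b += ["<PAD>"] * (k - len(b))
    words = s + b
    tags = [pos_tags.get(w, "<PAD>") for w in words]
    return words + tags

# A mid-parse configuration for "book me the flight"
feats = extract_features(
    ["ROOT", "book"], ["me", "the", "flight"],
    {"book": "VB", "me": "PRP", "the": "DT", "flight": "NN"})
print(feats)
# ['book', 'ROOT', '<PAD>', 'me', 'the', 'flight',
#  'VB', '<PAD>', '<PAD>', 'PRP', 'DT', 'NN']
```

Each string in the resulting list would then be looked up in its embedding table (S_word, S_tag, or S_label) to build the input vectors fed to the network.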
Dependency Parsing
Dependency parsing for sentence structure:


Dependency Parsing

Dependency parsing for sentence structure:

 In a written dependency structure, the relationships between the linguistic units in the sentence are expressed by directed arcs.

 The root of the tree, "prefer", forms the head of the sentence, as labelled in the illustration.

 A dependency tag indicates the relationship between two words. For example, the noun "Denver" modifies the meaning of the word "flight".


Dependency Parsing
Dependency parsing for sentence structure:

 As a result, flight -> Denver, where "flight" is the head and "Denver" is the child, or dependent.

 The relation is labelled nmod, which stands for nominal modifier.

 This specifies the dependency between the two words, where one serves as the head and the other as the dependent.
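Head, dependent, and label triples like flight -> Denver (nmod) can be stored as parallel arrays and rendered as an indented tree. A small sketch using the phrase "the flight through Denver"; the function and the Universal Dependencies-style labels chosen here are illustrative:

```python
def tree_lines(words, heads, labels, node=0, depth=0):
    """Render a head-indexed dependency tree as indented word/label lines.
    heads[i] is the index of word i's head; index 0 is the ROOT sentinel."""
    lines = ["  " * depth + f"{words[node]}/{labels[node]}"]
    for i, h in enumerate(heads):
        if h == node and i != node:          # recurse into each dependent
            lines += tree_lines(words, heads, labels, i, depth + 1)
    return lines

words  = ["ROOT", "the", "flight", "through", "Denver"]
heads  = [0, 2, 0, 4, 2]   # "Denver" attaches to "flight", "through" to "Denver"
labels = ["-", "det", "root", "case", "nmod"]
print("\n".join(tree_lines(words, heads, labels)))
# ROOT/-
#   flight/root
#     the/det
#     Denver/nmod
#       through/case
```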
