
CSC 594 Topics in AI –

Natural Language Processing


Spring 2016/17

14. Semantic Representation

(Some slides adapted from Jurafsky & Martin)

Meaning Representations
• We’re going to take the same basic approach to
meaning that we took to syntax and morphology
• We’re going to create representations of linguistic
inputs that capture the meanings of those inputs.
• But unlike parse trees, these representations
aren’t primarily descriptions of the structure of
the inputs…

Speech and Language Processing - Jurafsky and Martin 2


Semantic Processing
• Ok, so what does that mean?
• Representations that
– Permit us to reason about their truth (i.e., their relationship to
some world)
– Permit us to answer questions based on their content
– Permit us to perform inference (answer questions and determine
the truth of things we don’t already know to be true)



Semantic Processing
• Several ways to attack this problem
– Limited, shallow, practical approaches that have some hope of
actually being useful
• Information extraction
– Principled, theoretically motivated approach…
• Computational/Compositional Semantics
– Chapters 17 and 18
– Something midway that can plausibly serve both purposes
• Semantic role labeling



Semantic Analysis
• Compositional Analysis
– Create a logical representation that accounts for all the entities,
roles and relations present in a sentence.



Representational Schemes
• We’re going to make use of First Order Logic (FOL) as
our representational framework
– Not because we think it’s ideal
– Many of the alternatives turn out to be either too limiting or
mere notational variants of FOL



FOL
• Allows for…
– The analysis of truth conditions
• Allows us to answer yes/no questions
– Supports the use of variables
• Allows us to answer questions through the use of variable binding
– Supports inference
• Allows us to answer questions that go beyond what we know
explicitly

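To make the inference point concrete, here is a toy sketch in Python: one explicit fact plus one universally quantified rule (∀x Restaurant(x) ⇒ Serves(x, Food)) lets us derive something never stated directly. The predicate and constant names are invented for illustration, not taken from the slides.

```python
# Toy illustration of FOL-style inference: from an explicit fact and a
# universally quantified rule (forall x: Restaurant(x) -> Serves(x, Food)),
# derive a fact we never stated. Predicate/constant names are invented.
facts = {("Restaurant", "Frasca")}

def forward_chain(facts):
    """Apply the single rule to every matching fact and return the closure."""
    derived = set(facts)
    for fact in facts:
        if fact[0] == "Restaurant":
            derived.add(("Serves", fact[1], "Food"))
    return derived

closure = forward_chain(facts)
print(("Serves", "Frasca", "Food") in closure)  # True
```

The derived fact answers a question (“Does Frasca serve food?”) that goes beyond what we know explicitly — exactly the capability the bullet points describe.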


Meaning Structure of Language

• Natural languages convey meaning through the use of
– Predicate-argument structures
– Variables
– Quantifiers
– Compositional semantics



Predicate-Argument Structure
• Events, actions and relationships can be captured with
representations that consist of predicates and arguments
to those predicates.
• Languages display a division of labor where some words
and constituents (typically) function as predicates and
some as arguments.



Predicate-Argument Structure
• Predicates
– Primarily Verbs, VPs, Sentences
– Sometimes Nouns and NPs
• Arguments
– Primarily Nouns, Nominals, NPs, PPs



Example
• Mary gave a list to John.
• Giving(Mary, John, List)
• More precisely
– Gave conveys a three-argument predicate
– The first argument is the subject
– The second is the recipient, which is conveyed by the NP inside
the PP
– The third argument is the thing given, conveyed by the direct
object



Note
• Giving(Mary, John, List) is pretty much the same as
– Subj(Giving, Mary), Obj(Giving, List), IndObj(Giving, John)
– Which should look an awful lot like… what?



Better
• Turns out this representation isn’t quite as useful as
it could be.
• Better would be

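The slide’s figure is not reproduced in this copy. In the textbook’s treatment, the better representation reifies the event with an existentially quantified variable and asserts each role separately, along these lines (the role names are a sketch of that style, not a quote from the slide):

```latex
\exists e\; Giving(e) \land Giver(e, \textit{Mary}) \land Givee(e, \textit{John}) \land Given(e, \textit{List})
```

This avoids committing the Giving predicate to a fixed arity and lets each role be asserted, queried, or omitted independently.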


Predicates
• The notion of a predicate just got more complicated…
• In this example, think of the verb/VP as providing a
template like the following
• The semantics of the NPs and the PPs in the sentence
plug into the slots provided in the template



Two Issues
• How can we create this kind of representation in a
principled and efficient way?
• What makes that representation a “meaning”
representation, as opposed, say, to a parse tree?



Semantic Analysis
• Semantic analysis is the process of taking in some
linguistic input and assigning a meaning representation
to it.
– There are a lot of different ways to do this that make more or less
(or no) use of syntax
– We’re going to start with the idea that syntax does matter
• The compositional rule-to-rule approach



Compositional Analysis
• Principle of Compositionality
– The meaning of a whole is derived from the meanings of the
parts
• What parts?
– The constituents of the syntactic parse of the input
• What could it mean for a part to have a meaning?



Example
• Franco likes Frasca.



Compositional Analysis

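The parse-tree figure from this slide is not reproduced here. Assuming the usual textbook-style attachment λx.λy.Liking(y, x) for likes (an assumption, since the slide’s own figure is missing), the composition proceeds bottom-up:

```latex
\textit{likes}:\ \lambda x.\lambda y.\,Liking(y, x)\\
VP:\ \lambda x.\lambda y.\,Liking(y, x)(\textit{Frasca}) \Rightarrow \lambda y.\,Liking(y, \textit{Frasca})\\
S:\ \lambda y.\,Liking(y, \textit{Frasca})(\textit{Franco}) \Rightarrow Liking(\textit{Franco}, \textit{Frasca})
```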


Augmented Rules
• We’ll accomplish this by attaching semantic formation
rules to our syntactic CFG rules
• Abstractly: A → α1 … αn  { f(α1.sem, …, αn.sem) }
• This should be read as: the semantics we attach to A
can be computed from some function applied to the
semantics of A’s parts.



Example

• Easy parts… the attachments:
– NP → PropNoun {PropNoun.sem}
– PropNoun → Frasca {Frasca}
– PropNoun → Franco {Franco}

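These attachments can be sketched directly in Python, pairing each rule with a function from daughter semantics to mother semantics. This is a minimal illustration of the rule-to-rule idea, not the book’s implementation:

```python
# A minimal sketch of the rule-to-rule approach: each CFG rule is
# paired with a function computing the mother's semantics from the
# daughters' semantics. The rules follow the slide's fragment.
attachments = {
    ("PropNoun", "Frasca"): lambda: "Frasca",   # PropNoun -> Frasca {Frasca}
    ("PropNoun", "Franco"): lambda: "Franco",   # PropNoun -> Franco {Franco}
    ("NP", "PropNoun"): lambda pn_sem: pn_sem,  # NP -> PropNoun {PropNoun.sem}
}

propnoun_sem = attachments[("PropNoun", "Frasca")]()
np_sem = attachments[("NP", "PropNoun")](propnoun_sem)
print(np_sem)  # Frasca
```

The NP rule simply passes its daughter’s semantics up unchanged — that is exactly what the attachment {PropNoun.sem} says.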


Lambda Forms
• A simple addition to FOL
– Take a FOL sentence with variables in it that are to be bound.
– Allow those variables to be bound by treating the lambda
form as a function with formal arguments

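In a language with first-class functions, a lambda form behaves exactly like a curried function. A sketch (the predicate Near and the constants are illustrative, not from the slide):

```python
# A lambda form as a curried function: λx.λy.Near(x, y).
# Applying it binds the formal argument x; applying the result binds y.
near = lambda x: lambda y: ("Near", x, y)

partially_applied = near("Centro")     # λy.Near(Centro, y) -- x is bound
result = partially_applied("Bacaro")   # Near(Centro, Bacaro) -- y is bound
print(result)                          # ('Near', 'Centro', 'Bacaro')
```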


Compositional Semantics by
Lambda Application



Lambda Applications and
Reductions



Lambda Applications and
Reductions

VP → Verb NP  {Verb.sem(NP.sem)}   (here NP = Frasca)

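The rule VP → Verb NP {Verb.sem(NP.sem)} can be run end-to-end for “Franco likes Frasca”, with Python closures standing in for lambda forms and tuples standing in for FOL terms. The attachment λx.λy.Liking(y, x) for likes follows the common textbook pattern:

```python
# Executable sketch of VP -> Verb NP {Verb.sem(NP.sem)} and
# S -> NP VP {VP.sem(NP.sem)} for "Franco likes Frasca".
likes = lambda x: lambda y: ("Liking", y, x)  # Verb.sem

vp_sem = likes("Frasca")   # VP.sem = Verb.sem(NP.sem) = λy.Liking(y, Frasca)
s_sem = vp_sem("Franco")   # S.sem  = VP.sem(NP.sem)  = Liking(Franco, Frasca)
print(s_sem)               # ('Liking', 'Franco', 'Frasca')
```

Each grammar rule’s attachment is one function application, so the meaning of the sentence falls out of two beta reductions.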


Complex NPs
• Things get quite a bit more complicated when we start
looking at more complex NPs
– Such as...
• A menu
• Every restaurant
• Not every waiter
• Most restaurants
• All the morning non-stop flights to Houston



Quantifiers
• Contrast...
– Frasca closed

• With
– Every restaurant closed

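In FOL the contrast comes out as a constant-denoting term versus a universally quantified variable. A sketch (predicate names are illustrative):

```latex
\textit{Frasca closed}: \quad Closed(\textit{Frasca})\\
\textit{Every restaurant closed}: \quad \forall x\; Restaurant(x) \Rightarrow Closed(x)
```

The first names an individual directly; the second says something about every member of a set, which is why simple slot-filling attachments no longer suffice.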
