Unit 2

The document discusses the components of natural language processing including syntactic analysis and semantic analysis. It outlines the learning objectives for the session on syntactic analysis, including grammar formalisms, tree banks, and efficient parsing for context free grammars. Semantic parsing topics include lexical semantics, word-sense disambiguation, and computational semantics.


DEPARTMENT OF CSE-AIML (CSM) – III-II Sem

NATURAL LANGUAGE PROCESSING

4/26/2023
UNIT-II
SYNTACTIC ANALYSIS
SEMANTIC ANALYSIS
This Session
Syntactic Analysis: Learning Objectives

I. Syntactic Parsing:
1. Grammar formalisms
2. Treebanks
3. Efficient parsing for context-free grammars (CFG)
4. Statistical context-free grammars
5. Probabilistic context-free grammars (PCFG)
6. Lexicalized PCFGs

II. Semantic Parsing:
7. Lexical semantics
8. Word-sense disambiguation
9. Computational semantics
10. Semantic role labelling
11. Semantic parsing
Introduction to NLP:

This Session: Learning Objectives
1. Introduction
2. Motivations
3. What is NLP?
4. Roadmap for NLP
5. NLP components
6. NLP issues
7. NLP techniques
8. NLP applications
9. The role of deep learning in natural language processing
10. Backpropagation, recurrent neural networks, Transformers

• NLP aims to program computers to analyse and process huge amounts of natural language data.
• Developing NLP applications is challenging because computers need structured data, while human speech is unstructured and often ambiguous in nature.
• NLP is the study of human languages: language is a crucial component of human lives and the most fundamental aspect of our behaviour.
• Language exists in both written and spoken forms.
NLP - Motivation:
The goal is to show you what is possible with current NLP techniques, and to inspire you to use some of these applications on your own.

1. Automatic text summarization:
https://www.summarizebot.com/text_api_demo.html

2. Question answering:
https://twitter.com/paraschopra/status/1284801028676653060
NLP - Motivation:

3. Information extraction:
https://demos.explosion.ai/displacy-ent

4. Text classification:
https://dandelion.eu/semantic-text/entity-extraction-demo/?text=The+Mona+Lisa+is+a+sixteenth+century+oil+painting+created+by+Leonardo.+It%27s+held+at+the+Louvre+in+Paris.&lang=en&min_confidence=0.6&exec=true#results
NLP - Motivation:

5. Machine translation:
https://pbs.twimg.com/media/EdcIrZyXkAADjcG?format=png&name=large

6. Writing code using natural language:
https://twitter.com/i/status/1282676454690451457
NLP Components:
NLP = NLU + NLG

NLP identifies and processes the most significant data and structures it into text, numbers, or computer language.
• NLU (Natural Language Understanding) understands human language and converts it into data.
• NLG (Natural Language Generation) uses that structured data to generate meaningful narratives.
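The NLU and NLG halves can be illustrated with a toy weather-bot round trip. The domain, regex pattern, and slot names below are invented for this sketch; real systems use trained models rather than templates:

```python
import re

# Toy NLU: map an utterance to structured data (intent + slots).
def nlu(utterance):
    m = re.search(r"weather in (\w+)", utterance.lower())
    if m:
        return {"intent": "get_weather", "city": m.group(1).capitalize()}
    return {"intent": "unknown"}

# Toy NLG: render structured data back into a narrative sentence.
def nlg(data):
    return f"The weather in {data['city']} is {data['condition']}."

print(nlu("What is the weather in Paris?"))
# -> {'intent': 'get_weather', 'city': 'Paris'}
print(nlg({"city": "Paris", "condition": "sunny"}))
# -> The weather in Paris is sunny.
```

The two functions are deliberately mirror images: NLU goes from unstructured text to a dictionary, NLG from a dictionary back to text.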
Road Map:
(Figure: roadmap for NLP; image not reproduced here.)
NLP Pipeline:
(Figure: the NLP pipeline; image not reproduced here.)
NLP Applications:
• Automatic text summarization
• Question answering
• Information extraction
• Text classification
• Machine translation
• Writing code using natural language
Deep Learning Role:
• Deep learning models can learn word meanings and perform language tasks, avoiding the need for complex hand-engineered language features.
• A variety of deep learning models have been applied to natural language processing (NLP) to improve, accelerate, and automate text analytics functions and other NLP features.
• Deep learning methods are being applied across natural language processing, achieving state-of-the-art results on most language problems.

DL-based applications:
• Tokenization and text classification
• Generating captions for images
• Speech recognition
• Machine translation
• Question answering (QA)
• Document summarization
UNIT-II
This Session: Learning Objectives

a) Syntactic Analysis
1. Grammars, formalisms
2. CFG, PCFG

b) Semantic Analysis
3. Lexical semantics
4. Word-sense disambiguation
5. Compositional semantics
6. Semantic role labeling
7. Semantic parsing
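Efficient parsing for CFGs, listed among the syntactic objectives, is classically done with the CKY algorithm. Below is a minimal recognizer over an invented toy grammar in Chomsky Normal Form; the rules, lexicon, and sentence are illustrative only:

```python
from itertools import product

# Binary rules of a toy CFG in Chomsky Normal Form: (B, C) -> {A | A -> B C}.
# Grammar and lexicon are invented for illustration.
RULES = {
    ("NP", "VP"): {"S"},
    ("V", "NP"): {"VP"},
    ("Det", "N"): {"NP"},
}
LEXICON = {
    "Mary": {"NP"},
    "sold": {"V"},
    "the": {"Det"},
    "book": {"N"},
}

def cky(words):
    """Return the non-terminals that span the whole sentence."""
    n = len(words)
    # table[i][j] holds the non-terminals deriving words[i:j]
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, w in enumerate(words):
        table[i][i + 1] = set(LEXICON.get(w, ()))
    for span in range(2, n + 1):          # span widths, smallest first
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):     # split point
                for B, C in product(table[i][k], table[k][j]):
                    table[i][j] |= RULES.get((B, C), set())
    return table[0][n]

print(cky("Mary sold the book".split()))  # -> {'S'}: the sentence parses
```

A PCFG extends this scheme by attaching a probability to each rule and keeping the highest-probability derivation in each table cell instead of a plain set.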
SEMANTIC ANALYSIS

• Helps in understanding natural language.
• Draws out the exact meaning and context of the text.
• Infers the logical meaning of given sentences or parts of them.
• Used to understand the meaning and interpretation of words, signs, and sentence structure.
• Identifies relationships between individual words in a particular context.
Example: "Ram is great." ("Ram" could refer to a person, the deity, or RAM hardware.)
Word-Sense Disambiguation:
Word-sense disambiguation (WSD) is the process of identifying which sense of a word is meant in a sentence or other segment of context.

Many techniques have been researched, including dictionary-based methods that use the knowledge encoded in lexical resources, and supervised machine learning methods in which a classifier is trained for each distinct word on a corpus of manually sense-annotated examples.

In English, accuracy at the coarse-grained (homograph) level is routinely above 90% (as of 2009), with some methods on particular homographs achieving over 96%.
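A classic dictionary-based method is the Lesk algorithm: choose the sense whose dictionary gloss overlaps most with the words in the context. A minimal sketch, with invented glosses standing in for a real lexical resource such as WordNet:

```python
# Simplified Lesk word-sense disambiguation: pick the sense whose gloss
# shares the most words with the sentence. The glosses below are invented
# stand-ins for a real dictionary.
SENSES = {
    "bank": {
        "financial": "an institution that accepts deposits and lends money",
        "river": "the sloping land beside a body of water",
    },
}

def lesk(word, sentence):
    context = set(sentence.lower().split())
    best_sense, best_overlap = None, -1
    for sense, gloss in SENSES[word].items():
        overlap = len(context & set(gloss.split()))
        if overlap > best_overlap:
            best_sense, best_overlap = sense, overlap
    return best_sense

print(lesk("bank", "He sat on the bank of the river to watch the water"))
# -> river
```

Here "river", "water", "the", and "of" overlap with the river gloss, so that sense wins; supervised WSD replaces this overlap count with a trained per-word classifier.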
Semantic Role Labeling:
In natural language processing, semantic role labeling is the process of assigning labels to words or phrases in a sentence that indicate their semantic role in the sentence, such as agent, goal, or result.

• It serves to find the meaning of the sentence.
• It detects the arguments associated with the predicate or verb of a sentence and how they are classified into their specific roles.

Example: "Mary sold the book to John."
• The agent is "Mary",
• the predicate is "sold" (or rather, "to sell"),
• the theme is "the book", and
• the recipient is "John".
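The role assignment above can be sketched as code. This is only a toy, rule-based labeler for sentences shaped like "AGENT verb THEME to RECIPIENT"; the verb list is invented, and real SRL systems are statistical models trained on annotated corpora such as PropBank:

```python
# Toy rule-based semantic role labeler. Invented verb list for illustration;
# this only handles the "AGENT verb THEME to RECIPIENT" pattern.
VERBS = {"sold", "gave"}

def label_roles(sentence):
    words = sentence.rstrip(".").split()
    verb_idx = next(i for i, w in enumerate(words) if w in VERBS)
    roles = {"predicate": words[verb_idx],
             "agent": " ".join(words[:verb_idx])}   # everything before the verb
    if "to" in words:
        to_idx = words.index("to")
        roles["theme"] = " ".join(words[verb_idx + 1:to_idx])
        roles["recipient"] = " ".join(words[to_idx + 1:])
    else:
        roles["theme"] = " ".join(words[verb_idx + 1:])
    return roles

print(label_roles("Mary sold the book to John."))
# -> {'predicate': 'sold', 'agent': 'Mary',
#     'theme': 'the book', 'recipient': 'John'}
```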
Labeling Uses:
Semantic role labeling is mostly used to help machines understand the roles of words within sentences.

A better understanding of semantic role labeling could lead to advancements in question answering, information extraction, automatic text summarization, text data mining, and speech recognition.
Word-Sense Disambiguation (WSD) Difficulties:

• Differences between dictionaries: different dictionaries and thesauruses divide words into senses differently.
• Different algorithms for different applications: different NLP applications call for different WSD algorithms.
• Word-sense discreteness: words cannot be easily divided into discrete sub-meanings.
• Inter-judge variance: it is often impossible for individuals to memorize all of the senses a word can take, so human judges disagree on annotations.
Semantic Parsing:
Semantic parsing is the task of converting a natural language utterance to a logical form: a machine-understandable representation of its meaning. Semantic parsing can thus be understood as extracting the precise meaning of an utterance.

Applications of semantic parsing include:
• machine translation,
• question answering,
• ontology induction,
• automated reasoning, and
• code generation.

The phrase was first used in the 1970s by Yorick Wilks as the basis for machine translation programs working with only semantic representations.
Types of Semantic Parsing:
• Shallow semantic parsing
• Deep semantic parsing

Shallow:
• Shallow semantic parsing is concerned with identifying entities in an utterance and labelling them with the roles they play.
• It is sometimes known as slot-filling or frame semantic parsing, since its theoretical basis comes from frame semantics, wherein a word evokes a frame of related concepts and roles.
• Slot-filling systems are widely used in virtual assistants in conjunction with intent classifiers, which can be seen as mechanisms for identifying the frame evoked by an utterance.
• Popular architectures for slot-filling are largely variants of an encoder-decoder model, wherein two recurrent neural networks (RNNs) are trained jointly to encode an utterance into a vector and to decode that vector into a sequence of slot labels.
Shallow Semantic Parsing:

A real-time application using this technique is the Amazon Alexa spoken language understanding system.

Example: a shallow semantic parser can parse an utterance like "show me flights from Boston to Dallas" by classifying the intent as "list flights" and filling the slots "source" and "destination" with "Boston" and "Dallas", respectively.
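The flight example above can be mimicked with a minimal regex-based slot filler. This is an illustrative sketch only: the pattern and slot names are assumptions, and production systems like Alexa use trained intent classifiers and sequence labellers rather than regular expressions:

```python
import re

# Toy slot filler for an assumed flight-search frame with
# "intent", "source", and "destination" fields.
def parse_flight_query(utterance):
    m = re.search(r"flights from (\w+) to (\w+)", utterance)
    if not m:
        return None                      # frame not evoked by this utterance
    return {"intent": "list_flights",
            "source": m.group(1),
            "destination": m.group(2)}

print(parse_flight_query("show me flights from Boston to Dallas"))
# -> {'intent': 'list_flights', 'source': 'Boston', 'destination': 'Dallas'}
```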
Deep:
Deep semantic parsing, also known as compositional semantic parsing, is concerned with producing precise meaning representations of utterances that can contain significant compositionality.

Example: shallow semantic parsing cannot handle arbitrary compositional utterances such as "show me flights from Boston to anywhere that has flights to Juneau".

Deep semantic parsing attempts to parse such utterances, typically by converting them to a formal meaning representation language.
Natural Language Discourse Processing:

Processing natural language is one of the most difficult problems in artificial intelligence.

One of the major problems within NLP is discourse processing: building theories and models of how utterances stick together to form coherent discourse.

Language usually consists of collocated, structured, and coherent groups of sentences rather than isolated and unrelated sentences. These coherent groups of sentences are referred to as discourse. Coherence and discourse structure are interconnected in many ways.
Natural Language Discourse Processing:

A coherent discourse must possess the following properties:
• Coherence relation between utterances: meaningful connections between utterances.
• Relationship between entities: there must be a certain kind of relationship among the entities mentioned.

Discourse structure:
The structure of a discourse depends on the segmentation applied to it; discourse structure may be defined as determining the types of structures for a large discourse.
Algorithms for Discourse Segmentation:

Unsupervised discourse segmentation:
• Often cast as linear segmentation.
• The text is divided into multi-paragraph units, where each unit represents a passage of the original text.
• Relies on linguistic devices that tie the textual units together.

Supervised discourse segmentation:
• Needs boundary-labeled training data.
• Discourse markers or cue words play an important role: a discourse marker is a word or phrase that functions to signal discourse structure.
• These discourse markers are domain-specific.
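The unsupervised, linear-segmentation idea can be sketched as a toy cohesion-based segmenter that starts a new segment wherever adjacent sentences share no words. The sentences below are invented for illustration; real systems such as TextTiling use smoothed similarity scores over word windows rather than this all-or-nothing test:

```python
# Toy unsupervised linear segmenter: a lexical tie (shared word) keeps two
# sentences in the same segment; a cohesion break starts a new one.
def segment(sentences):
    def words(s):
        return set(s.lower().rstrip(".").split())

    segments, current = [], [sentences[0]]
    for prev, cur in zip(sentences, sentences[1:]):
        if words(prev) & words(cur):
            current.append(cur)        # lexical tie: same segment
        else:
            segments.append(current)   # cohesion break: new segment
            current = [cur]
    segments.append(current)
    return segments

doc = [
    "Ram went to the bank.",
    "The bank was closed.",
    "Cricket is a popular sport.",
    "The sport is played worldwide.",
]
print(len(segment(doc)))  # -> 2 segments: banking vs. cricket
```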
Text Coherence:
Lexical repetition is one way to find structure in a discourse, but it does not by itself satisfy the requirement of coherent discourse. To achieve coherent discourse, we must focus on specific coherence relations.

Coherence properties (taking the two terms S0 and S1 to represent the meanings of two related sentences):

Result:
The state asserted by S0 causes the state asserted by S1.
Example: Ram was caught in the fire. His skin burned.

Parallel:
It infers p(a1, a2, …) from the assertion of S0 and p(b1, b2, …) from the assertion of S1, where ai and bi are similar for all i.
Example: Ram wanted a car. Shyam wanted money.
Text Coherence:

Coherence properties:

Elaboration:
It infers the same proposition P from both assertions S0 and S1. For example, two statements showing the relation elaboration:
Ram was from Chandigarh. Shyam was from Kerala.

Occasion:
It holds when a change of state can be inferred from the assertion of S0, the final state of which can be inferred from S1, and vice versa. For example, two statements showing the relation occasion:
Ram picked up the book. He gave it to Shyam.
Text Coherence:

Building hierarchical discourse structure:
The coherence of an entire discourse can also be considered through a hierarchical structure between coherence relations.
For example, the following passage can be represented as a hierarchical structure:
S1: Ram went to the bank to deposit money.
S2: He then took a train to Shyam's cloth shop.
S3: He wanted to buy some clothes.
S4: He did not have new clothes for the party.
S5: He also wanted to talk to Shyam regarding his health.
4/26/2023 37
DEPARTMENT OF CSE-AIML (CSM) – III-II Sem

NATURAL LANGUAGE PROCESSING


Text Coherence

For example, the following passage can This Session


be represented as hierarchical structure − Learning Objectives

S1 − Ram went to the bank to deposit a) . Syntactic Analysis


money. 1 Grammars, Formalisms
2. CFG,PCFG
S2 − He then took a train to Shyam’s cloth
shop. b) .Semantic Analysis?
3. Lexical Semantic
S3 − He wanted to buy some clothes. 4.Word-Sense Disambuigity
5.Compositional semantics
S4 − He do not have new clothes for
6.Semantic Role Labeling
party.
7.Semantic Parsing
S5 − He also wanted to talk to Shyam .
regarding his health

4/26/2023 38
Reference Resolution:

Interpreting the sentences of a discourse is another important task, and to achieve it we need to know who or what entity is being talked about.

For example, in "Ram, the manager of ABC bank, saw his friend Shyam at a shop. He went to meet him", the linguistic expressions Ram, his, and he are references.

Reference resolution may be defined as the task of determining which entities are referred to by which linguistic expressions.

Terminology used in reference resolution:
• Referring expression: the natural language expression used to perform reference.
• Referent: the entity that is referred to. In the example above, Ram is a referent.
• Corefer: expressions used to refer to the same entity are called corefers, e.g. Ram and he.
• Antecedent: the term that licenses the use of another term, e.g. Ram is the antecedent of the reference he.
• Anaphora and anaphoric: reference to an entity that has been previously introduced into the sentence.
Types of Referring Expressions:

• Indefinite noun phrases: the reference introduces entities that are new to the hearer in the discourse context.
Example: Ram had gone around one day to bring him some food.
• Definite noun phrases: the reference represents entities that are already known or identifiable to the hearer in the discourse context.
Example: I read the Times of India, The Hindu, or The New York Times.
• Pronouns: a form of definite reference.
Example: Ram laughed as loud as he could.
• Demonstratives: these demonstrate and behave differently than simple definite pronouns.
Example: this and that are demonstrative pronouns.
• Names: the simplest type of referring expression.
Example: Ram is a name referring expression.
Reference Resolution Tasks:

Coreference resolution: finding referring expressions in a text that refer to the same entity. A set of coreferring expressions is called a coreference chain, for example: He, Chief Manager, and his.

Constraint on coreference resolution:
The pronoun it can refer much like he and she, but it can also be used pleonastically, referring to no specific thing, as in: It's raining. It is really good.

Pronominal anaphora resolution: the task of finding the antecedent of a single pronoun. For example, if the pronoun is his, the task of pronominal anaphora resolution is to find the word Ram, because Ram is the antecedent.
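Pronominal anaphora resolution can be sketched with a naive recency-plus-gender heuristic. The tiny gender lexicon below is invented for illustration; note that pure recency resolves "He" to the nearest name, Shyam, which shows why real resolvers need syntactic and semantic features beyond recency:

```python
# Toy pronominal anaphora resolver: resolve a pronoun to the most recent
# preceding name of matching gender. Gender lexicon invented for this sketch.
GENDER = {"Ram": "m", "Shyam": "m", "Sita": "f"}
PRONOUNS = {"he": "m", "his": "m", "him": "m", "she": "f", "her": "f"}

def resolve(tokens):
    antecedents, resolved = [], {}
    for i, tok in enumerate(tokens):
        if tok in GENDER:
            antecedents.append(tok)            # candidate antecedent
        elif tok.lower() in PRONOUNS:
            wanted = PRONOUNS[tok.lower()]
            for cand in reversed(antecedents):  # most recent first
                if GENDER[cand] == wanted:
                    resolved[i] = cand
                    break
    return resolved

tokens = "Ram saw his friend Shyam at a shop . He went to meet him".split()
print({tokens[i]: ant for i, ant in resolve(tokens).items()})
# -> {'his': 'Ram', 'He': 'Shyam', 'him': 'Shyam'}
```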
