Natural Language Processing


SECTION 1

• What is NLP? What are the applications of NLP?


• Explain the stages of NLP. x 2
• Explain the generic NLP system.
• Explain the knowledge level.
• Explain the challenges in NLP
• Define ambiguity. Explain the different types of ambiguity with examples. Also differentiate between syntactic ambiguity and lexical ambiguity.

SECTION 2

• What is morphological processing? Explain the different types of morphology.


• Explain the N-gram model. Explain how the N-gram model is used for spelling correction.
• Explain the lexicon-free FST Porter stemmer. Explain the evaluation of N-grams using perplexity.
• Explain morphological parsing with a finite state transducer.
• Explain Good-Turing discounting and solve an exercise on it.
• What is meant by Laplace or add-one smoothing?
• Define a regular relation. Give an example.
• Porter stemmer algorithm x 2

Corpus: different sums (numerical problems)

SECTION 3

• Explain the methods of POS tagging. (Maximum Entropy Model)


• What are constituency and dependency parsing? Explain the CKY algorithm with an example.
• Explain the concept of parsing and PCFG.
• What is sequence labeling? Explain the Hidden Markov Model with an example.
• Explain MEMM and CRF (Conditional Random Field).
• Hidden Markov Model: limitations and a sum (numerical problem)
SECTION 4

• Explain how meaning is represented in semantic analysis.


• Explain the unsupervised HyperLex method to resolve WSD.
• What is Word Sense Disambiguation? Illustrate with an example how the dictionary-based approach identifies the correct sense of an ambiguous word. x 2
• What is a lexicon and a lexeme? Explain the relations among lexemes and their senses with examples.
• Explain WordNet with an example.
• Explain how a supervised algorithm can be applied for WSD.

SECTION 5

• HOBBS ALGORITHM
• CENTERING ALGORITHM

SECTION 6

• Information Retrieval
• Sentiment Analysis
• Machine Translation

Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)

Time: 3 hours                                                    Max. Marks: 80

N.B. (1) Question No. 1 is compulsory
     (2) Assume suitable data if necessary
     (3) Attempt any three questions from the remaining questions

Q.1 Solve any Four out of Five (5 marks each)
a) Explain the challenges of Natural Language Processing.
b) Explain how the N-gram model is used in spelling correction.
c) Explain three types of referents that complicate the reference resolution problem.
d) Explain the Machine Translation approaches used in NLP.
e) Explain the various stages of Natural Language Processing.

Q.2 (10 marks each)
a) What is Word Sense Disambiguation (WSD)? Explain the dictionary-based approach to Word Sense Disambiguation.
b) Represent the output of morphological analysis for a regular verb, an irregular verb, a singular noun, and a plural noun. Also explain the role of FST in morphological parsing with an example.

Q.3 (10 marks each)
a) Explain the ambiguities associated with each level of Natural Language Processing, with examples.
b) Explain discourse reference resolution in detail.

Q.4 (10 marks each)
a) For the given corpus, with the tag sets
   N: Noun [Martin, Justin, Will, Spot, Pat]
   M: Modal verb [can, will]
   V: Verb [watch, spot, pat]
   create the Transition Matrix and Emission Probability Matrix, then apply the Hidden Markov Model and do POS tagging for the statement "Justin will spot Will".
b) For the given grammar rules, parse the statement "The man read this book" using the CYK (CKY) algorithm.

Q.5 (10 marks each)
a) Explain the Maximum Entropy Model for POS tagging.
b) Describe in detail the Centering Algorithm for reference resolution.

Q.6 (10 marks each)
a) Explain information retrieval versus information extraction systems.
b) Explain the Porter Stemmer algorithm with its rules.
Paper / Subject Code: 42175 / NATURAL LANGUAGE PROCESSING (DLOC - III)

Time: 3 Hours                                                    Max. Marks: 80
=====================================================================

N.B. (1) Question No. 1 is compulsory
     (2) Assume suitable data if necessary
     (3) Attempt any three questions from the remaining questions

Q.1 Any Four [20M]
a) Differentiate between syntactic ambiguity and lexical ambiguity. [5M]
b) Define affixes. Explain the types of affixes. [5M]
c) Describe open-class words and closed-class words in English with examples. [5M]
d) What is rule-based machine translation? [5M]
e) Explain with suitable examples the following relationships between word meanings:
   homonymy, polysemy, synonymy, antonymy. [5M]
f) Explain the perplexity of any language model. [5M]

Q.2 a) Explain the role of FSA in morphological analysis.
Q.2 b) Explain the different stages involved in the NLP process with suitable examples. [10M]

Q.3 a) Consider the following corpus: [5M]

<s> I tell you to sleep and rest </s>
<s> I would like to sleep for an hour </s>
<s> Sleep helps one to relax </s>

List all possible bigrams. Compute the conditional probabilities and predict the next word for the word "to".
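A minimal worked sketch in Python (an illustrative aid, not the prescribed answer) for Q.3 a): it counts the bigrams of the three-sentence corpus above and computes the conditional probabilities needed to predict the word that follows "to". Lowercasing the tokens is a simplifying assumption.

# Bigram counts and conditional probabilities for the Q.3 a) corpus.
from collections import Counter

corpus = [
    "<s> i tell you to sleep and rest </s>",
    "<s> i would like to sleep for an hour </s>",
    "<s> sleep helps one to relax </s>",
]
sentences = [sentence.split() for sentence in corpus]

unigrams = Counter(word for sentence in sentences for word in sentence)
bigrams = Counter(
    (sentence[i], sentence[i + 1])
    for sentence in sentences
    for i in range(len(sentence) - 1)
)

def cond_prob(w2, w1):
    # P(w2 | w1) = count(w1 w2) / count(w1)
    return bigrams[(w1, w2)] / unigrams[w1]

for candidate in ["sleep", "relax"]:
    print(f"P({candidate} | to) = {cond_prob(candidate, 'to'):.3f}")
# P(sleep | to) = 2/3 and P(relax | to) = 1/3, so the predicted next word is "sleep".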

Q.3 b) Explain the Yarowsky bootstrapping approach of semi-supervised learning. [5M]
Q.3 c) What is POS tagging? Discuss the various challenges faced by POS tagging. [10M]

Q.4 a) What are the limitations of the Hidden Markov Model? [5M]
Q.4 b) Explain the different steps in text processing for Information Retrieval. [5M]
Q.4 c) Compare the top-down and bottom-up approaches of parsing with an example. [10M]

Q.5 a) What do you mean by word sense disambiguation (WSD)? Discuss the dictionary-based approach for WSD. [10M]
Q.5 b) Explain the Hobbs algorithm for pronoun resolution. [10M]

Q.6 a) Explain text summarization in detail. [10M]
Q.6 b) Explain the Porter Stemming algorithm in detail. [10M]

*****************
SECTION 1

1. Stages of NLP

LEXICAL ANALYSIS
SYNTACTIC ANALYSIS
SEMANTIC ANALYSIS
DISCOURSE INTEGRATION
PRAGMATIC ANALYSIS

A. LEXICAL ANALYSIS
• Lexical analysis is the first stage of NLP.
• It is also known as morphological analysis.
• It involves identifying and analyzing word structure.
• It divides the whole text into paragraphs, sentences, and words.
• When you apply lemmatization to a word, it reduces the word to its most basic or dictionary form, which is called a lemma. For example, the lemma of the words running, ran, and runs is run.
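A minimal lemmatization sketch in Python, assuming NLTK and its WordNet data are available (the library choice is an assumption, not part of these notes):

# Lemmatization sketch using NLTK's WordNetLemmatizer
# (assumes nltk is installed and the "wordnet" corpus can be downloaded).
import nltk
from nltk.stem import WordNetLemmatizer

nltk.download("wordnet", quiet=True)  # lexical database used by the lemmatizer
lemmatizer = WordNetLemmatizer()

for word in ["running", "ran", "runs"]:
    # pos="v" tells the lemmatizer to treat each token as a verb
    print(word, "->", lemmatizer.lemmatize(word, pos="v"))
# All three forms reduce to the lemma "run", as in the example above.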

B SYNTACTIC ANALYSIS
• It is also known as parsing.
• Syntactic analysis is used to check grammar and word arrangement, and it shows the relationships between the words.
• For example: "Agra goes to Rutuja."
• In the real world, "Agra goes to Rutuja" does not make any sense, so this sentence is rejected by syntactic analysis.
• Dependency grammar and parts of speech are important attributes of the syntactic analyzer.
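As a small illustration of parsing, the sketch below builds a toy context-free grammar with NLTK and parses the sentence "the man read this book" (the sentence used in the question paper above). The grammar itself is an illustrative assumption, not part of these notes:

# Constituency parsing sketch with a toy CFG in NLTK.
import nltk

grammar = nltk.CFG.fromstring("""
S -> NP VP
NP -> Det N
VP -> V NP
Det -> 'the' | 'this'
N -> 'man' | 'book'
V -> 'read'
""")

parser = nltk.ChartParser(grammar)
for tree in parser.parse("the man read this book".split()):
    tree.pretty_print()  # prints the constituency tree for the sentence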

C Semantic Analysis
• Semantic analysis is concerned with meaning representation.
• It mainly focuses on the literal meaning of words.
• Consider the sentence: "Apples ate banana."
• Although the sentence is syntactically correct, it does not make any sense, because an apple cannot eat a banana.
• Semantic analysis checks for meaningfulness in the given sentence.

D Discourse Integration
• The meaning of a sentence depends on the meaning of the sentence just before it.
• In addition, it can also affect the meaning of the sentence that immediately follows it.
• In the text "Yash Patil is a bright student. He spends most of his time in the library.", discourse integration resolves "he" to "Yash Patil".

E PRAGMATIC ANALYSIS
• It is the fifth and last stage of NLP.
• During this stage, what is said is re-interpreted as what is actually meant.
• It involves deriving those aspects of language which require real-world knowledge.
• For example, "open the book" is interpreted as a request instead of an order.

Ambiguity in NLP:
• Ambiguity is the capability of being understood in more than one way.
• Natural language is highly ambiguous.
• Any sentence in a language with a large enough grammar can have another interpretation.
Types of Ambiguity:

1) Lexical Ambiguity:
➢ The ambiguity of a single word is called lexical ambiguity.
➢ For example, the word silver can be treated as a noun, an adjective, or a verb (see the WordNet sketch after this list).
- She got a silver in the long jump.
- Her dress is trimmed with silver ribbon.
- The surface of the lake was silvered by moonlight.

2) Syntactic Ambiguity:
➢ This kind of ambiguity occurs when a sentence can be parsed in different ways.
➢ For example, the sentence “The man saw the girl with the telescope”.
➢ It is ambiguous whether the man saw the girl carrying a telescope or he saw her through his
telescope.

3) Semantic Ambiguity:
➢ This kind of ambiguity occurs when the meaning of the words themselves can be misinterpreted.
➢ In other words, semantic ambiguity happens when a sentence contains an ambiguous word or
phrase.
➢ For example, the sentence “The car hit the pole while it was moving” has semantic ambiguity, because the interpretations can be “The car, while moving, hit the pole” and “The car hit the pole while the pole was moving”.

4) Anaphoric Ambiguity:
➢ This kind of ambiguity arises due to the use of anaphora entities in discourse.
➢ For example: “The horse ran up the hill. It was very steep. It soon got tired.” Here, the anaphoric reference of “it” in the two situations causes ambiguity.

5) Pragmatic Ambiguity:
➢ Pragmatic ambiguity arises when the statement is not specific. It is the most difficult type of ambiguity.
➢ For example, the sentence "I like you too" can have multiple interpretations, such as "I like you (just like you like me)" and "I like you (just like someone else does)".
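Returning to the lexical ambiguity example in 1), the sketch below lists the WordNet senses of the word "silver" and their parts of speech, assuming NLTK with the WordNet corpus; the exact senses printed depend on the installed WordNet version:

# List the WordNet senses (and parts of speech) of the ambiguous word "silver".
import nltk
from nltk.corpus import wordnet as wn

nltk.download("wordnet", quiet=True)
for synset in wn.synsets("silver"):
    # synset.pos() is 'n' (noun), 'v' (verb), 'a'/'s' (adjective) or 'r' (adverb)
    print(synset.name(), synset.pos(), "-", synset.definition())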
SECTION 2

1. Explain the Porter stemmer algorithm in detail.

or
Explain the Porter Stemmer algorithm with rules.
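A minimal usage sketch of the Porter stemmer via NLTK (the library call is an illustrative assumption; it only demonstrates the stemmer's behaviour, not its rule set):

# Porter stemmer usage sketch with NLTK.
from nltk.stem import PorterStemmer

stemmer = PorterStemmer()
for word in ["caresses", "ponies", "relational", "hopping", "happiness"]:
    print(word, "->", stemmer.stem(word))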
