NLP Sample Questions-Stu
SAMPLE QUESTIONS
MODULE-1
1. What is the main objective of Natural Language Processing (NLP)? Name any two applications of
NLP. (2m) (L1)
2. Explain the working of the HAL 9000 computer in understanding human language. (5m) (L2)
3. Why is the sentence “I made her duck” ambiguous? (5m) (L1)
4. Define ‘regular expression’. Match the following regular expressions with suitable strings.
(Note down the matching pairs) (5m) (L1)
(a) /!/ (w) “look up ^ now”
(b) /[wW]oodchuck/ (x) “plenty of 7 to 5”
(c) /[0-9]/ (y) “Oh my God!”
(d) /e^/ (z) “Woodchuck”
5. Define Kleene * operator and give an example. (2m) (L1)
6. What is the difference between Kleene * and Kleene + operators? (2m) (L1)
7. Illustrate any two Anchors in regular expressions with examples. (2m) (L2)
8. How do you specify the two strings “puppy” and “puppies” using a single regular expression?
(2m) (L1)
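One common answer is a single disjunction over the shared stem; a minimal check in Python (the pattern shown is just one possible choice):

    import re

    pattern = r"pupp(y|ies)"                  # shared stem plus alternation
    assert re.fullmatch(pattern, "puppy")
    assert re.fullmatch(pattern, "puppies")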
9. Demonstrate (i) Inflected form of words and (ii) Code switching with examples. (5m) (L2)
10. Illustrate ‘word tokenization’ with examples. (5m) (L2)
11. Explain the Byte-Pair Encoding process for word tokenization by writing the algorithm and
applying it to the following corpus: (10m) (L2)
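The learner side of the algorithm can be sketched as follows (a minimal Python sketch of the merge loop; the corpus referred to in the question is not reproduced here, so the call at the end uses placeholder words):

    from collections import Counter

    def learn_bpe(words, num_merges):
        # each word is a sequence of symbols plus an end-of-word marker '_'
        vocab = Counter(tuple(w) + ("_",) for w in words)
        merges = []
        for _ in range(num_merges):
            pairs = Counter()
            for word, freq in vocab.items():
                for a, b in zip(word, word[1:]):
                    pairs[(a, b)] += freq          # count adjacent symbol pairs
            if not pairs:
                break
            best = max(pairs, key=pairs.get)       # most frequent pair gets merged
            merges.append(best)
            new_vocab = Counter()
            for word, freq in vocab.items():
                merged, i = [], 0
                while i < len(word):
                    if i + 1 < len(word) and (word[i], word[i + 1]) == best:
                        merged.append(word[i] + word[i + 1]); i += 2
                    else:
                        merged.append(word[i]); i += 1
                new_vocab[tuple(merged)] += freq
            vocab = new_vocab
        return merges

    # placeholder corpus, not the one from the question:
    print(learn_bpe("low low low lower newest newest widest".split(), 5))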
20. How do you compute the probability of the sentence “I want English food” given the following
probability values? (2m) (L1)
P(I | <s>) = 0.25 P(want | I) = 0.33
P(English | want) = 0.0011 P(food | English) = 0.5
P(</s> | food) = 0.68
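Under the bigram assumption the sentence probability is simply the product of the listed conditional probabilities; a one-line check:

    p = 0.25 * 0.33 * 0.0011 * 0.5 * 0.68
    print(p)    # P(<s> I want English food </s>) ≈ 3.1e-05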
21. What are the two methods used for evaluating a language model? (2m) (L1)
22. Define perplexity. (2m) (L1)
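Perplexity is the inverse probability of a test set normalised by the number of words; reusing the bigram product from question 20 as a toy test set (a sketch; conventions differ on whether <s> and </s> are counted towards N):

    p = 0.25 * 0.33 * 0.0011 * 0.5 * 0.68     # P(I want English food </s>)
    N = 5                                      # here: four words plus </s>
    print(p ** (-1 / N))                       # perplexity ≈ 8.0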
23. Explain zero-probability bigrams with an example. (2m) (L2)
24. Explain smoothing and its two techniques: Laplace smoothing and Add-k smoothing. (10m) (L2)
25. Explain the smoothing techniques: (i) Backoff, (ii) Interpolation and (iii) Katz backoff. (6m) (L2)
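For the two questions above, the estimators can be written compactly; a minimal sketch with illustrative signatures (the counts c, vocabulary size V and the lambda weights are assumptions, not values from the questions):

    def add_k_bigram(c_bigram, c_prev, V, k=1.0):
        # add-k (k = 1 gives Laplace) smoothed estimate of P(w | w_prev)
        return (c_bigram + k) / (c_prev + k * V)

    def interpolated_trigram(p_uni, p_bi, p_tri, lambdas=(0.1, 0.3, 0.6)):
        # simple linear interpolation of unigram, bigram and trigram estimates
        l1, l2, l3 = lambdas
        return l1 * p_uni + l2 * p_bi + l3 * p_tri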
MODULE-2
1. Illustrate the eight parts of speech in English with an example for each. (4m) (L2)
2. Explain (i) Closed classes and Open classes, (ii) Count nouns and Mass nouns and (iii) the four
types of Adverbs with examples. (6m) (L2)
3. Explain (i) Particle, (ii) Determiner (iii) Auxiliary verb and (iv) Politeness markers with examples.
(4m) (L2)
4. Define any ten tags from the Penn Treebank tagset (provide tag name, description and an
example). (5m) (L1)
5. Label each word in the following sentences with proper tags using Penn Treebank tagset. (i) The
grand jury commented and (ii) There are 70 children there. (4m) (L1)
6. Find one tagging error in each of the following sentences that are tagged with the Penn
Treebank tagset: (i) I/PRP need/VBP a/DT flight/NN from/IN Atlanta/NN and (ii) Does/VBZ
this/DT flight/NN serve/VB dinner/NNS. (4m) (L1)
7. Define any ten tags from the Brown corpus tagset (provide tag name, description and an
example). (5m) (L1)
8. What are the two types of tagging algorithms? (2m) (L1)
9. Demonstrate HMM part-of-speech tagging. Illustrate with the help of the sentence ‘Secretariat
is expected to race tomorrow’. (10m) (L2)
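For question 9, the decoding step can be sketched generically; the transition and emission tables are left as inputs because the probability estimates used in the question are not reproduced here (a minimal sketch, not the textbook's exact worked procedure):

    def viterbi(words, tags, trans, emit, start="<s>"):
        # trans[(prev_tag, tag)] = P(tag | prev_tag), emit[(tag, word)] = P(word | tag);
        # missing entries are treated as probability 0
        V = [{t: (trans.get((start, t), 0.0) * emit.get((t, words[0]), 0.0), None)
              for t in tags}]
        for i in range(1, len(words)):
            col = {}
            for t in tags:
                prev = max(tags, key=lambda p: V[i - 1][p][0] * trans.get((p, t), 0.0))
                score = (V[i - 1][prev][0] * trans.get((prev, t), 0.0)
                         * emit.get((t, words[i]), 0.0))
                col[t] = (score, prev)
            V.append(col)
        tag = max(tags, key=lambda t: V[-1][t][0])    # best final tag
        path = [tag]
        for i in range(len(words) - 1, 0, -1):        # follow back-pointers
            path.append(V[i][path[-1]][1])
        return list(reversed(path))

    # viterbi("Secretariat is expected to race tomorrow".split(),
    #         tags=[...], trans={...}, emit={...})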
10. Demonstrate transformation-based tagging. (5m) (L2)
11. Explain the evaluation of tagging error using 10-fold cross-validation and the analysis of
errors using a confusion matrix in POS tagging. (4m) (L2)
12. Outline (i) tag indeterminacy, (ii) tokenization and (iii) unknown words. (6m) (L2)
13. How do you tag the sentences (i) It is a nice night and (ii) I like to watch French movies using
Penn Treebank tagset? (4m) (L1)
MODULE-3
7. What are the two types of structural ambiguity? Give examples. (2m) (L1)
8. Apply the CKY parsing algorithm to the sentence ‘the flight includes a meal’ using the
following grammar, explaining the conversion to Chomsky normal form (CNF): (10m) (L3)
S → NP VP        Det → the        N → meal
NP → Det N       Det → a          N → flight
VP → V NP        V → includes
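For reference, a recognizer for this grammar (which already happens to be in CNF once lexical rules are listed separately) can be sketched as follows; a minimal Python sketch without back-pointers:

    from itertools import product

    # grammar from the question
    BINARY = {("NP", "VP"): {"S"}, ("Det", "N"): {"NP"}, ("V", "NP"): {"VP"}}
    LEXICAL = {"the": {"Det"}, "a": {"Det"}, "meal": {"N"},
               "flight": {"N"}, "includes": {"V"}}

    def cky(words):
        # table[i][j] holds the non-terminals that derive words[i:j]
        n = len(words)
        table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
        for i, w in enumerate(words):                 # lexical rules fill the diagonal
            table[i][i + 1] |= LEXICAL.get(w, set())
        for span in range(2, n + 1):                  # widen spans bottom-up
            for i in range(n - span + 1):
                j = i + span
                for k in range(i + 1, j):             # try every split point
                    for B, C in product(table[i][k], table[k][j]):
                        table[i][j] |= BINARY.get((B, C), set())
        return "S" in table[0][n]

    print(cky("the flight includes a meal".split()))  # expect True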
9. Apply the Earley parsing algorithm to the sentence ‘Book that flight’ using the following
grammar: (10m) (L3)
Det → that | this | a | the
Noun → book | flight | meal | money
Verb → book | include | prefer
S → NP VP
S → VP
NP → Det Nominal
Nominal → Noun
VP → Verb
VP → Verb NP
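A compact recognizer for this grammar can be sketched with the usual predictor, scanner and completer operations (a minimal sketch that only recognises; it does not print the chart entries you would show when answering the question):

    # grammar and lexicon from the question
    GRAMMAR = [("S", ("NP", "VP")), ("S", ("VP",)),
               ("NP", ("Det", "Nominal")), ("Nominal", ("Noun",)),
               ("VP", ("Verb",)), ("VP", ("Verb", "NP"))]
    LEXICON = {"Det": {"that", "this", "a", "the"},
               "Noun": {"book", "flight", "meal", "money"},
               "Verb": {"book", "include", "prefer"}}

    def earley(words):
        # a state is (lhs, rhs, dot, start); chart[i] holds states ending at i
        n = len(words)
        chart = [set() for _ in range(n + 1)]
        chart[0].add(("GAMMA", ("S",), 0, 0))                    # dummy start state
        for i in range(n + 1):
            agenda = list(chart[i])
            while agenda:
                lhs, rhs, dot, start = agenda.pop()
                if dot < len(rhs) and rhs[dot] in LEXICON:       # scanner
                    if i < n and words[i] in LEXICON[rhs[dot]]:
                        chart[i + 1].add((rhs[dot], (words[i],), 1, i))
                elif dot < len(rhs):                             # predictor
                    for l, r in GRAMMAR:
                        if l == rhs[dot] and (l, r, 0, i) not in chart[i]:
                            chart[i].add((l, r, 0, i)); agenda.append((l, r, 0, i))
                else:                                            # completer
                    for l, r, d, s in list(chart[start]):
                        if d < len(r) and r[d] == lhs and (l, r, d + 1, s) not in chart[i]:
                            chart[i].add((l, r, d + 1, s)); agenda.append((l, r, d + 1, s))
        return ("GAMMA", ("S",), 1, 0) in chart[n]

    print(earley("book that flight".split()))   # lower-cased input; expect True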
10. Explain Probabilistic Context-free Grammars (PCFGs) by considering the sentence ‘astronomers
saw stars with ears’ and the following grammar: (8m) (L2)
S → NP VP (1.0)          NP → NP PP (0.4)
PP → P NP (1.0)          NP → astronomers (0.1)
VP → V NP (0.7)          NP → ears (0.18)
VP → VP PP (0.3)         NP → saw (0.04)
P → with (1.0)           NP → stars (0.18)
V → saw (1.0)            NP → telescopes (0.1)
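Because the probability of a parse tree is the product of its rule probabilities, the two readings of the sentence can be compared directly; a minimal arithmetic sketch using only the rules above:

    # PP attached to the NP 'stars':  S → NP VP, NP → astronomers, VP → V NP,
    # V → saw, NP → NP PP, NP → stars, PP → P NP, P → with, NP → ears
    p_np_attach = 1.0 * 0.1 * 0.7 * 1.0 * 0.4 * 0.18 * 1.0 * 1.0 * 0.18

    # PP attached to the VP:  S → NP VP, NP → astronomers, VP → VP PP,
    # VP → V NP, V → saw, NP → stars, PP → P NP, P → with, NP → ears
    p_vp_attach = 1.0 * 0.1 * 0.3 * 0.7 * 1.0 * 0.18 * 1.0 * 1.0 * 0.18

    print(p_np_attach, p_vp_attach)   # ≈ 0.0009072 vs ≈ 0.0006804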
MODULE-4
13. Illustrate any two methods to visualize embeddings. (2m) (L2)
14. Explain semantic properties of embeddings. (8m) (L2)
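Properties such as similarity and the parallelogram analogy can be illustrated numerically; a minimal sketch with made-up 3-dimensional vectors (the numbers are placeholders chosen to show the king − man + woman ≈ queen pattern, not trained embeddings):

    import numpy as np

    def cosine(a, b):
        # cosine similarity between two vectors
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b))

    vec = {"king":  np.array([0.9, 0.8, 0.1]),
           "man":   np.array([0.9, 0.1, 0.1]),
           "woman": np.array([0.1, 0.1, 0.9]),
           "queen": np.array([0.1, 0.8, 0.9])}

    target = vec["king"] - vec["man"] + vec["woman"]
    best = max(vec, key=lambda w: cosine(target, vec[w]))
    print(best, cosine(target, vec["queen"]))   # expect 'queen' with similarity ≈ 1.0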
****
MODULE-5
****