Mid-Term
Part 1: Part-1
4
The linguistic phenomenon in which the same word form appears under different
POS categories is called
A. Haplology
B. Heterograph
C. Homology
D. Homograph
11
Which of the following does NOT come under the scope of NLP applications?
A. Automatic grammar correction
10
Which of the following can be used to effectively represent morphological
rules? [CO1,L2,1]
A. Directed Acyclic graphs
D. If-else rule
8
An FST is a [CO1,L2,1]
A. Mealy machine
B. Moore machine
C. Push-down automaton
D. Turing machine
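An FST produces output on its transitions, which is exactly the Mealy-machine
property. A minimal sketch (the single-state machine, its symbols, and the toy
rule of deleting morpheme/word boundaries are invented for illustration):

    # Toy Mealy-style transducer: delta maps (state, input symbol) to
    # (next state, output string), so output is emitted per transition.
    START = "q0"
    delta = {(START, ch): (START, ch) for ch in "abcdefghijklmnopqrstuvwxyz"}
    delta[(START, "^")] = (START, "")   # delete the morpheme boundary
    delta[(START, "#")] = (START, "")   # delete the word boundary

    def transduce(lexical):
        state, out = START, []
        for ch in lexical:
            state, emitted = delta[(state, ch)]
            out.append(emitted)
        return "".join(out)

    print(transduce("dog^s#"))  # lexical "dog^s#" -> surface "dogs"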
1
Knowledge of the relationship of meaning to the goals and intentions of the
speaker is [CO1,L2,1]
A. Morphology
B. Pragmatics
C. Discourse
D. Semantics
5
The word "upcoming" is [CO1,L2,1]
A. an example of derivational morphology and contains 3 morphemes
6
Which of the following is an example of a closed word class?
A. Nouns
B. Interrogative words
C. Verb
D. Proper nouns
9
What is the main challenge in morphological analysis of Indian languages?
[CO1,L2,1]
A. Removing stopwords
B. Multiple affixes
C. Removing punctuations
D. Sandhi split
13
Assume that the edit distance algorithm uses the following costs: insertion 1,
deletion 1, and substitution 2. Calculate the distance between the strings
"cookery" and "bakery", together with the number of insertions, deletions, and
substitutions (each option lists distance, insertions, deletions, substitutions)
A. 5, 2, 1, 1
B. 5, 0, 1, 2
C. 3, 1, 0, 1
D. 3, 0, 1, 1
12
Assume that the edit distance algorithm uses the following costs: insertion 1,
deletion 1, and substitution 1. Calculate the distance between the strings
"saturday" and "sunday", together with the number of insertions, deletions, and
substitutions (each option lists distance, insertions, deletions, substitutions)
A. 3, 1, 1, 1
B. 3, 0, 2, 1
C. 4, 2, 0, 1
D. 5, 1, 2, 0
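As a check on the two edit-distance questions above, a minimal
dynamic-programming sketch (the standard Levenshtein recurrence; the function
name, cost parameters, and the tie-breaking that prefers match/substitution are
my own choices):

    def edit_distance(src, tgt, ins=1, dele=1, sub=1):
        # dp[i][j] = (cost, n_ins, n_del, n_sub) for turning src[:i] into tgt[:j]
        n, m = len(src), len(tgt)
        dp = [[None] * (m + 1) for _ in range(n + 1)]
        dp[0][0] = (0, 0, 0, 0)
        for i in range(1, n + 1):            # first column: all deletions
            p = dp[i - 1][0]
            dp[i][0] = (p[0] + dele, p[1], p[2] + 1, p[3])
        for j in range(1, m + 1):            # first row: all insertions
            p = dp[0][j - 1]
            dp[0][j] = (p[0] + ins, p[1] + 1, p[2], p[3])
        for i in range(1, n + 1):
            for j in range(1, m + 1):
                same = src[i - 1] == tgt[j - 1]
                cands = [  # (new cost, predecessor cell, ops added as ins/del/sub)
                    (dp[i-1][j-1][0] + (0 if same else sub), dp[i-1][j-1],
                     (0, 0, 0) if same else (0, 0, 1)),             # match / substitute
                    (dp[i-1][j][0] + dele, dp[i-1][j], (0, 1, 0)),  # delete
                    (dp[i][j-1][0] + ins, dp[i][j-1], (1, 0, 0)),   # insert
                ]
                cost, p, (a, b, c) = min(cands, key=lambda t: t[0])
                dp[i][j] = (cost, p[1] + a, p[2] + b, p[3] + c)
        return dp[n][m]

    print(edit_distance("cookery", "bakery", sub=2))  # (5, 0, 1, 2) -> option B
    print(edit_distance("saturday", "sunday"))        # (3, 0, 2, 1) -> option B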
18
Consider the following sentence: "Ram went to buy fruits in the supermarket of
the mall, while his younger brother went to play games in the gaming section of
the mall." What is the probability P(to|went)?
A. 0.5
B. 0
C. 1
D. 0.001
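A quick MLE check, since P(to|went) = count(went, to) / count(went); the
lowercase letters-only tokenization below is my own simplification:

    import re

    sentence = ("Ram went to buy fruits in the supermarket of the mall, while "
                "his younger brother went to play games in the gaming section "
                "of the mall.")
    tokens = re.findall(r"[a-z]+", sentence.lower())
    bigrams = list(zip(tokens, tokens[1:]))
    # "went" occurs twice and is followed by "to" both times
    print(bigrams.count(("went", "to")) / tokens.count("went"))  # 1.0 -> option C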
22
Which among the following is correct in case of a bigram model [CO2, L2,2]
A. P(A,B,C,D) = P(A)P(B|A)P(C|B,A)P(D|A,B,C)
B. P(A,B,C,D) = P(A)P(B|A)P(C|B)P(D|C)
C. P(A,B,C,D) = P(A)P(B)P(C)P(D)
D. P(A,B,C,D) = P(A)P(B)P(C|A)P(D|B)
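Option B is the chain rule with every history truncated to the single preceding
token, i.e. the first-order Markov (bigram) assumption. A toy numeric
illustration (all probability values are invented):

    p_a = 0.4            # invented unigram probability P(A)
    p_b_given_a = 0.5    # invented bigram probabilities
    p_c_given_b = 0.2
    p_d_given_c = 0.3
    # P(A,B,C,D) ~= P(A) * P(B|A) * P(C|B) * P(D|C) under a bigram model
    print(p_a * p_b_given_a * p_c_given_b * p_d_given_c)  # 0.012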
16
If we assume that the likelihood of a particular tag follows a trigram model,
which of the following is correct? [CO2, L2, 2]
A. P(ti|t1...tn) = P(ti|ti-1)
B. P(ti|t1...ti-1) = P(ti|ti-1,ti-2)
C. P(ti|t1...tn) = P(ti|ti-1,ti-2)
D. P(ti|ti...tn) = P(ti|ti-1)
20
Which among the following is a rule-based POS tagger? [CO2, L2, 1]
A. HMM tagger
B. EngCG tagger
C. Pen Tagger
D. Brill Tagger
21
Perplexity is inversely proportional to probability [CO2, L2, 1]
A. True
B. False
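The relationship in the previous question follows from the definition
PP(W) = P(W)^(-1/N) for a test set of N tokens; a tiny numeric illustration
(the probability values are invented):

    N = 10                             # tokens in a hypothetical test set
    for p_w in (1e-8, 1e-6, 1e-4):     # invented test-set probabilities
        # higher assigned probability -> lower perplexity (statement is True)
        print(p_w, p_w ** (-1 / N))    # ~6.31, ~3.98, ~2.51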
14
The Viterbi algorithm uses [CO2, L2, 1]
A. Dynamic Programming
B. Greedy approach
D. Recursion
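A minimal Viterbi decoding sketch for an HMM tagger, showing the
dynamic-programming trellis (the two-tag toy model and all of its probabilities
are invented for illustration):

    # dp[t][s] = probability of the best tag sequence ending in tag s at
    # position t; each cell reuses the previous column (dynamic programming).
    states = ("N", "V")                  # invented toy tag set
    start = {"N": 0.6, "V": 0.4}         # invented start probabilities
    trans = {"N": {"N": 0.3, "V": 0.7}, "V": {"N": 0.6, "V": 0.4}}
    emit = {"N": {"fish": 0.5, "sleep": 0.5}, "V": {"fish": 0.4, "sleep": 0.6}}

    def viterbi(obs):
        dp = [{s: start[s] * emit[s][obs[0]] for s in states}]
        back = [{}]
        for t in range(1, len(obs)):
            dp.append({})
            back.append({})
            for s in states:
                prev = max(states, key=lambda p: dp[t - 1][p] * trans[p][s])
                dp[t][s] = dp[t - 1][prev] * trans[prev][s] * emit[s][obs[t]]
                back[t][s] = prev
        best = max(states, key=lambda s: dp[-1][s])
        path = [best]
        for t in range(len(obs) - 1, 0, -1):   # follow backpointers
            path.append(back[t][path[-1]])
        return list(reversed(path))

    print(viterbi(["fish", "sleep"]))  # -> ['N', 'V'] for this toy model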
17
Which among the following is an extrinsic measure for language model
evaluation? [CO2, L2, 1]
A. Perplexity
B. Cross entropy
C. Accuracy
D. Probability
24
Imagine that you start reading a poem and you have already read "is" 10 times,
"love" 3 times, "end" 2 times, "that" once, "kind" once, and "enough" once; 18
tokens have been read in total. Under MLE, how likely is it that the next token
is "kind"? How likely is it that the next token is "hate"?
A. 1/18, 1/18
B. 1/9, 0
C. 1/18, 0
D. 1/9, 1/6
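A minimal unigram-MLE check (the Counter-based setup is my own):

    from collections import Counter

    counts = Counter({"is": 10, "love": 3, "end": 2,
                      "that": 1, "kind": 1, "enough": 1})
    total = sum(counts.values())   # 18 tokens read so far
    print(counts["kind"] / total)  # 1/18 ~= 0.056
    print(counts["hate"] / total)  # 0/18 = 0: MLE gives unseen tokens
                                   # zero probability -> option C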