

Name: _________________________

Score: ______ / ______

Mid-Term

Part 1

4
The linguistic phenomenon in which the same word appears in different word forms
(POS categories) is called
A. Haplology

B. Heterograph

C. Homology

D. Homograph

Answer Point Value: 1.0 points


Answer Key: D

11
Which among the following do NOT come under the scope of NLP applications?
A. Automatic grammar correction

B. Human to computer interaction

C. Creating new artificial language

D. Customer care chat-bots

E. Identifying errors in a computer program

F. Extracting data from a JSON file

Answer Point Value: 1.0 points


Answer Key: C,E,F
7
What would be the total number of tokens for the following sentence after
word-level tokenization and character-level tokenization, respectively:
"All good things come to an end."
A. 8, 25

B. 8, 31

C. 7, 25

D. 7, 31

Answer Point Value: 1.0 points


Answer Key: D

10
Which of the following can be used to effectively represent morphological
rules? [CO1,L2,1]
A. Directed Acyclic graphs

B. Finite state transducers

C. Context free grammars

D. If-else rule

Answer Point Value: 1.0 points


Answer Key: B

8
An FST is a [CO1,L2,1]
A. Mealy machine

B. Moore machine

C. Push-down automaton

D. Turing machine

Answer Point Value: 1.0 points


Answer Key: A
2
Which among the following is the correct lexical representation of the
surface form "dog"? [CO1,L2,1]
A. dog+V+Present

B. dog+N+PL

C. dog+N+SG

D. dog+VB+SG

Answer Point Value: 1.0 points


Answer Key: C

1
Knowledge of the relationship of meaning to the goals and intentions of the
speaker is [CO1,L2,1]
A. Morphology

B. Pragmatics

C. Discourse

D. Semantics

Answer Point Value: 1.0 points


Answer Key: B

5
The word "upcoming" is [CO1,L2,1]
A. a derivational morphology and contains 3 morphemes

B. both inflectional and derivational morphology, and contains 3 morphemes

C. an inflectional morphology and contains 2 morphemes

D. an inflectional morphology and contains 2 morphemes

Answer Point Value: 1.0 points


Answer Key: B

6
Which of the following is an example of a closed word class?
A. Nouns

B. Interrogative words

C. Verbs

D. Proper nouns

Answer Point Value: 1.0 points


Answer Key: B
3
"Computerize", "playing" and "computation" are, respectively, examples of
A. Derivational, inflectional, inflectional

B. Derivational, Derivational, inflectional

C. Derivational, inflectional, Derivational

D. Inflectional, Inflectional, Derivational

Answer Point Value: 1.0 points


Answer Key: C

9
What is the main challenge in the morphological analysis of Indian languages?
[CO1,L2,1]
A. Removing stopwords

B. Multiple affixes

C. Removing punctuations

D. Sandhi splitting

Answer Point Value: 1.0 points


Answer Key: D
Part 2

13
Assume that the edit distance algorithm uses the following costs: insertion 1,
deletion 1 and substitution 2. Calculate the distance between the strings
"cookery" and "bakery", and the number of insertions, deletions and substitutions.
A. 5, 2, 1, 1

B. 5, 0, 1, 2

C. 3, 1, 0, 1

D. 3, 0, 1, 1

Answer Point Value: 2.0 points


Answer Key: B

12
Assume that the edit distance algorithm uses the following costs: insertion 1,
deletion 1 and substitution 1. Calculate the distance between the strings
"saturday" and "sunday", and the number of insertions, deletions and substitutions.
A. 3, 1, 1, 1

B. 3, 0, 2, 1

C. 4, 2, 0, 1

D. 5, 1, 2, 0

Answer Point Value: 2.0 points


Answer Key: B
Part 3

18
Consider the following sentence: "Ram went to buy fruits in the supermarket of
the mall, while his younger brother went to play games in the gaming section of
the mall." What is the probability P(to|went)?
A. 0.5

B. 0

C. 1

D. 0.001

Answer Point Value: 1.0 points


Answer Key: C

22
Which among the following is correct in the case of a bigram model? [CO2, L2, 2]
A. P(A,B,C,D) = P(A)P(B|A)P(C|B,A)P(D|A,B,C)

B. P(A,B,C,D) = P(A)P(B|A)P(C|B)P(D|C)

C. P(A,B,C,D) = P(A)P(B)P(C)P(D)

D. P(A,B,C,D) = P(A)P(B)P(C|A)P(D|B)

Answer Point Value: 1.0 points


Answer Key: B

16
If we assume that the likelihood of a particular tag follows a trigram model,
which among the following is correct? [CO2, L2, 2]
A. P(ti|t1...tn) = P(ti|ti-1)

B. P(ti|t1...ti-1) = P(ti|ti-1,ti-2)

C. P(ti|t1...tn)= P(ti|ti-1,ti-2)

D. P(ti|ti...tn)=P(ti|ti-1)

Answer Point Value: 1.0 points


Answer Key: B
15
Consider the following statements and select the correct option. [CO2, L2, 1]
Assertion (A): It is not necessary to use the denominator (common for all
classes) while computing probabilities using the Naïve Bayes formula.
Reason (R): In Naïve Bayes, only a reasonably accurate rank ordering of
probability values is required for classifying a new record.
A. A is incorrect and R is correct

B. A is correct and R is incorrect

C. Both A and R are correct but R is not the correct explanation of A

D. Both A and R are correct and R is the correct explanation of A

Answer Point Value: 1.0 points


Answer Key: D

20
Which among the following is a rule-based POS tagger? [CO2, L2, 1]
A. HMM tagger

B. EngCG tagger

C. Penn Tagger

D. Brill Tagger

Answer Point Value: 1.0 points


Answer Key: B

21
Perplexity is inversely proportional to probability [CO2, L2, 1]
A. True

B. False

Answer Point Value: 1.0 points


Answer Key: A
19
Consider the following corpus containing 3 sentences: (1) Alice went to the cafe
(2) Bob was waiting for Alice (3) Alice and Bob went to the museum. Assume we
are not applying any preprocessing or stopword removal. What is the total
number of unique bigrams for which the likelihood will be estimated? [CO2, L2, 2]
A. 17

B. 15

C. 18

D. 16

Answer Point Value: 1.0 points


Answer Key: A

14
The Viterbi algorithm uses [CO2, L2, 1]
A. Dynamic Programming

B. Greedy approach

C. Divide and conquer

D. Recursion

Answer Point Value: 1.0 points


Answer Key: A

17
Which among the following is an extrinsic measure for language model
evaluation? [CO2, L2, 1]
A. Perplexity

B. Cross entropy

C. Accuracy

D. Probability

Answer Point Value: 1.0 points


Answer Key: C
Part 4

24
Imagine that you start reading a poem and you have already read "is" 10 times,
"love" 3 times, "end" 2 times, "that" once, "kind" once and "enough" once; 18
tokens have been read so far. Using MLE, how likely is it that the next token is
"kind"? How likely is it that the next token is "hate"?
A. 1/18, 1/18

B. 1/9, 0

C. 1/18, 0

D. 1/9, 1/6

Answer Point Value: 2.0 points


Answer Key: C
25
Suppose you are in room quarantine as a primary contact of a COVID patient.
You are locked in a room and want to infer the weather outside based only on
how you feel. Assume a first-order hidden Markov model with the states
{sunny, foggy, rainy}, whose transition probabilities are given in figure-1
(https://drive.google.com/file/d/1TwrLVt6ZvXRDLagcpJWwf5uS7K7AYKIl/view?usp=sharing).
You either feel happy or grumpy. The emission probabilities of how you feel
given the weather outside are shown in figure-2
(https://drive.google.com/file/d/1Ew6_JuQG_8b1J-4cInxC1p8b5PD2w3yh/view?usp=sharing).
Let the initial probabilities of the weather being sunny, foggy and rainy on
day 1 be 0.5, 0.3 and 0.2 respectively.
Suppose you feel grumpy on day-1, happy on day-2 and happy again on day-3.
What is the probability of all three days being sunny? [CO2, L2, 2]

A. 0.02016

B. 0.07526

C. 0.10976

D. 0.04704

Answer Point Value: 2.0 points


Answer Key: A
23
Suppose you are reading an article on IPL cricket. Till now you have read the
words "score", "cricket" and "India" 5 times each, "catch" and "wicket" 3 times
each, and "points", "company", "ground" and "won" once each. What are the
maximum likelihood estimate and the add-one smoothed probability of reading
"batsman" as the next word? [CO2, L3, 2]
A. 0/25, 1/35

B. 4/25, 4/25

C. 0/25, 1/26

D. 0/25, 0/25

Answer Point Value: 2.0 points


Answer Key: A
