
Natural Language Processing

Assignment- 8
TYPE OF QUESTION: MCQ
Number of questions: 10  Total marks: 10 × 1 = 10

Question 1. Consider the following sentences:

1. The statues were “cast” in clay moulds.
2. The Indian “caste” system is an example of social stratification.
3. My dog would always “bark” at strangers.
4. The tree’s “bark” has medicinal value.
5. The “newspaper” was delivered early today.
6. The “newspaper” recruited a new investigative journalist.

The lexical relations between the highlighted words in sentences 1&2, 3&4, and 5&6 are:

a. Homonymy, Homonymy, Homonymy
b. Homophony, Homonymy, Polysemy
c. Homonymy, Homonymy, Homonymy
d. Homonymy, Homography, Polysemy

Answer: b, d
Solution: Refer to Week 6 Lecture 1.

Question 2. Consider the sentence: “Although the buttons did not look good on the shirt,
the dress fit well”. Which of the following are true?

a. “dress” subsumes “button”.
b. “shirt” is a hyponym of “button”.
c. “dress” is a hypernym of “shirt”.
d. “button” is a meronym of “shirt”.

Answer: c, d
Solution: Refer to Week 6 Lecture 1.

For Questions 3 to 5, consider a hypothetical WordNet noun taxonomy, with the information content (IC) of each node as shown in Figure 1.

Figure 1

Question 3. What is the Resnik similarity between “Car” and “Motor Vehicle”?

a. 0.948
b. 8.30
c. 9.53
d. 10.57

Answer: c
Solution: simResnik(c1, c2) = IC(LCS(c1, c2)) = −log P(LCS(c1, c2)) = 9.53

Question 4. What is the Lin similarity between “Cycle” and “Article”?

a. 0.226
b. 0.452
c. 8.55
d. None of the above

Answer: b
Solution: simLin(c1, c2) = 2·IC(LCS(c1, c2)) / (IC(c1) + IC(c2)) = 2·3.53 / (10.35 + 5.26) ≈ 0.452

Question 5. What is the Jiang-Conrath distance between “Object” and “Cutlery”?

a. 6.93
b. 0.446
c. 0.144
d. None of the above

Answer: a
Solution: disJC(c1, c2) = IC(c1) + IC(c2) − 2·IC(LCS(c1, c2)) = 2.79 + 9.72 − 2·2.79 = 6.93
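The hand computations for Questions 3–5 can be reproduced with a short script. This is an illustrative sketch (the function names are mine, not NLTK's), plugging in the IC values read off Figure 1 exactly as the solutions above do:

```python
def sim_resnik(ic_lcs):
    # Resnik: similarity is the information content (IC) of the
    # lowest common subsumer (LCS) of the two concepts
    return ic_lcs

def sim_lin(ic1, ic2, ic_lcs):
    # Lin: Resnik similarity normalised by the concepts' own IC values
    return 2 * ic_lcs / (ic1 + ic2)

def dist_jc(ic1, ic2, ic_lcs):
    # Jiang-Conrath: a distance, not a similarity (larger = less similar)
    return ic1 + ic2 - 2 * ic_lcs

# IC values from Figure 1, as used in the solutions to Q3-Q5
print(sim_resnik(9.53))                      # Q3: 9.53
print(round(sim_lin(10.35, 5.26, 3.53), 3))  # Q4: 0.452
print(round(dist_jc(2.79, 9.72, 2.79), 2))   # Q5: 6.93
```

Note that Jiang-Conrath is reported here as a distance; its similarity variant is usually defined as the reciprocal of this quantity.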

For Questions 6-8, use the NLTK corpus reader for WordNet. Check this link for reference:
https://www.nltk.org/howto/wordnet.html

Question 6. What is the Lowest Common Hypernym between the first sense of “man” and
the second sense of “whale”? What is its maximum depth from the root?

a. organism, 5
b. organism.n.01, 5
c. living_thing.n.01, 4
d. whole.n.02, 4

Answer: b
Solution:
from nltk.corpus import wordnet as wn
man = wn.synset("man.n.01")
whale = wn.synset("whale.n.02")
print(man.lowest_common_hypernyms(whale)[0])
print(man.lowest_common_hypernyms(whale)[0].max_depth())

Question 7. What are the Leacock-Chodorow, and Wu-Palmer similarities between the
first senses of the words “firm” and “institution”?

a. 3.907, 0.592
b. 0.592, 3.907
c. 0.714, 2.028
d. 2.028, 0.714

Answer: d
Solution:
from nltk.corpus import wordnet as wn
firm = wn.synsets("firm")[0]
institution = wn.synsets("institution")[0]
print(firm.lch_similarity(institution))
print(firm.wup_similarity(institution))

Question 8. Consider the sentences:

● Heat the solution to 75° Celsius.
● She provided a brilliant solution to the long-standing problem.

What are the definitions corresponding to the senses of the word "solution", as per the
Lesk algorithm, when using senses from WordNet?
1. a method for solving a problem.
2. a statement that solves a problem or explains how to solve the problem.
3. a homogeneous mixture of two or more substances; frequently (but not
necessarily) a liquid solution.
4. the set of values that give a true statement when substituted into an equation.

a. 3, 2
b. 3, 1
c. 2, 2
d. 2, 4

Answer: c
Solution: Refer to https://www.nltk.org/howto/wsd.html
from nltk.corpus import wordnet as wn
from nltk.wsd import lesk
for s in wn.synsets('solution'):
    print(s.name(), s.definition())
print(lesk('Heat the solution to 75° Celsius.'.split(), 'solution'))
print(lesk('She provided a brilliant solution to the long-standing problem.'.split(), 'solution'))
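To see why both sentences map to definition 2, the core idea of the Lesk algorithm can be sketched in a few lines. This is a simplified, illustrative version (raw word overlap between the context and each gloss, with no stop-word filtering), not NLTK's implementation; the glosses are the four definitions listed in the question:

```python
import string

def tokens(text):
    # lowercase, split on whitespace, strip surrounding punctuation
    return {w.strip(string.punctuation).lower() for w in text.split()} - {""}

# the four candidate glosses from the question, keyed by their numbers
GLOSSES = {
    1: "a method for solving a problem.",
    2: "a statement that solves a problem or explains how to solve the problem.",
    3: "a homogeneous mixture of two or more substances; frequently (but not "
       "necessarily) a liquid solution.",
    4: "the set of values that give a true statement when substituted into an equation.",
}

def simplified_lesk(sentence, glosses):
    # choose the gloss that shares the most words with the context sentence
    context = tokens(sentence)
    return max(glosses, key=lambda k: len(context & tokens(glosses[k])))

print(simplified_lesk("Heat the solution to 75° Celsius.", GLOSSES))                                 # → 2
print(simplified_lesk("She provided a brilliant solution to the long-standing problem.", GLOSSES))   # → 2
```

Even this bare-bones overlap count picks gloss 2 for both sentences, matching the answer key; in the first sentence the deciding overlap is only the function words "the" and "to", which illustrates why practical Lesk variants use stop-word filtering and extended glosses.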

Question 9. Consider the co-occurrence matrix below:

                  NLP   ~NLP   Total
Deep Learning     346    568     914
~Deep Learning    874   1420    2294
Total            1220   1988    3208

Hence, “NLP” and “Deep Learning” have co-occurred 346 times. According to the HyperLex
algorithm, what is the distance between “NLP” and “Deep Learning”?

a. 0.379
b. 0.621
c. 0.284
d. 0.716

Answer: b
Solution: P(NLP | Deep Learning) = 346 / 914 ≈ 0.379
P(Deep Learning | NLP) = 346 / 1220 ≈ 0.284
disHyperLex("NLP", "Deep Learning") = 1 − max{P(NLP | Deep Learning), P(Deep Learning | NLP)} = 1 − 0.379 = 0.621

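The HyperLex distance computation above amounts to the following sketch (the function name is illustrative; the counts are read off the co-occurrence table):

```python
def hyperlex_distance(n_ab, n_a_total, n_b_total):
    # distance = 1 - max of the two conditional probabilities,
    # each estimated from raw co-occurrence counts
    p_a_given_b = n_ab / n_b_total   # P(A | B)
    p_b_given_a = n_ab / n_a_total   # P(B | A)
    return 1 - max(p_a_given_b, p_b_given_a)

# 346 co-occurrences; "NLP" appears 1220 times, "Deep Learning" 914 times
print(round(hyperlex_distance(346, 1220, 914), 3))  # → 0.621
```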
Question 10. Which of the following are true?

a. Yarowsky’s method, a minimally supervised word-sense disambiguation algorithm, uses the "one sense per collocation" and the "multiple senses per discourse" properties for WSD.
b. HyperLex is an unsupervised clustering-based word-sense disambiguation algorithm that extracts the word senses from the corpus itself.
c. In the Decision List algorithm, collocations are ordered in a decision list, with the least predictive collocations ranked highest.
d. Precision and Recall are important measures to evaluate WSD systems.

Answer: b, d
Solution: Refer to this paper: https://www.ling.upenn.edu/courses/ling052/Navigli2009.pdf
