NLP QB

MODULE-1

1) Define NLP. Explain why NLP is needed and how it works.


2) List and explain the different features that make NLP hard.
3) Give the difference between natural languages and computer languages.
4)Describe the impact of Zipf’s law.
5) Explain the basic operations of text processing.
6) List various applications of NLP and discuss any two in detail. Explain the pre-processing operations/steps in NLP.
7) What do you mean by ambiguity in Natural language? Explain with suitable example.
8) Explain Porter's stemming algorithm in detail. Apply it to the following words to obtain their stems:
1.Tapping
2.Smiling
3.Computerization
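For reference, a minimal sketch of Porter's Step 1b, restricted to the -ing suffix (the full algorithm has five steps and many more rules; the -ization ending of "computerization", for instance, is handled in later steps, and the measure-based conditions are simplified away here):

```python
def ends_cvc(stem):
    # True if stem ends consonant-vowel-consonant, final consonant not w, x, y
    if len(stem) < 3:
        return False
    vowels = "aeiou"
    c1, v, c2 = stem[-3], stem[-2], stem[-1]
    return c1 not in vowels and v in vowels and c2 not in vowels and c2 not in "wxy"

def step_1b_ing(word):
    # Simplified Porter Step 1b: strip -ing, then repair the stem
    if not word.endswith("ing"):
        return word
    stem = word[:-3]
    if not any(ch in "aeiou" for ch in stem):
        return word                      # rule requires a vowel in the stem
    if stem.endswith(("at", "bl", "iz")):
        return stem + "e"                # e.g. conflat(ing) -> conflate
    if len(stem) >= 2 and stem[-1] == stem[-2] and stem[-1] not in "lsz":
        return stem[:-1]                 # undouble: tapp -> tap
    if ends_cvc(stem):
        return stem + "e"                # smil -> smile
    return stem

print(step_1b_ing("tapping"))   # -> tap
print(step_1b_ing("smiling"))   # -> smile
```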
9) Explain the dynamic programming algorithm for finding minimum edit distance.
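A sketch of the dynamic-programming recurrence (unit costs for insertion, deletion, and substitution are assumed here; some textbooks charge 2 for a substitution):

```python
def min_edit_distance(s, t):
    # d[i][j] = minimum cost to convert s[:i] into t[:j]
    m, n = len(s), len(t)
    d = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m + 1):
        d[i][0] = i                       # delete everything
    for j in range(n + 1):
        d[0][j] = j                       # insert everything
    for i in range(1, m + 1):
        for j in range(1, n + 1):
            sub = 0 if s[i - 1] == t[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,        # deletion
                          d[i][j - 1] + 1,        # insertion
                          d[i - 1][j - 1] + sub)  # substitution / match
    return d[m][n]

print(min_edit_distance("kitten", "sitting"))  # -> 3
```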
10) Write a note on the n-gram language model.
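A minimal bigram model sketch using maximum-likelihood estimates (the three-sentence toy corpus is illustrative only):

```python
from collections import Counter

tokens = ("<s> i am sam </s> "
          "<s> sam i am </s> "
          "<s> i do not like green eggs and ham </s>").split()

unigram_counts = Counter(tokens)
bigram_counts = Counter(zip(tokens, tokens[1:]))

def p_bigram(w2, w1):
    # MLE estimate: P(w2 | w1) = count(w1 w2) / count(w1)
    return bigram_counts[(w1, w2)] / unigram_counts[w1]

print(p_bigram("i", "<s>"))  # two of the three sentences start with "i"
```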
11) Explain Zipf’s law with example.
12) Explain the Shannon visualization method.

MODULE-2
1) Write a note on Good-Turing estimation.
2) Define Stem & Affixes. Explain types of affixes with example.
3) Explain morphological process in detail.
4) What is the role of FSA in morphological analysis? Explain FST in detail.
5) Explain first order Markov model with example.
6) Explain the Viterbi algorithm for finding the best path.
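A compact Viterbi sketch on a toy two-state HMM (the Healthy/Fever states, observations, and probabilities follow a common textbook illustration and are assumptions, not data from this question bank):

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    # v[t][s] = probability of the best state sequence ending in s at time t
    v = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        v.append({})
        back.append({})
        for s in states:
            prev, p = max(((r, v[t - 1][r] * trans_p[r][s]) for r in states),
                          key=lambda x: x[1])
            v[t][s] = p * emit_p[s][obs[t]]
            back[t][s] = prev
    # Backtrace from the most probable final state
    last = max(states, key=lambda s: v[-1][s])
    path = [last]
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

states = ("Healthy", "Fever")
start_p = {"Healthy": 0.6, "Fever": 0.4}
trans_p = {"Healthy": {"Healthy": 0.7, "Fever": 0.3},
           "Fever":   {"Healthy": 0.4, "Fever": 0.6}}
emit_p = {"Healthy": {"normal": 0.5, "cold": 0.4, "dizzy": 0.1},
          "Fever":   {"normal": 0.1, "cold": 0.3, "dizzy": 0.6}}

print(viterbi(["normal", "cold", "dizzy"], states, start_p, trans_p, emit_p))
# -> ['Healthy', 'Healthy', 'Fever']
```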
7) Define the features of maximum entropy model.
8) Explain different approaches in POS tagging.
9) For each feature f_i, assume a weight λ_i. Write expressions for the following probabilities in terms of your model parameters:
P(D|cat)
P(N|laughs)
P(D|man)
10) Explain the concept of the first-order Markov model and apply it to solve the given weather model.
11) Explain the various key techniques used in morphological analysis. Apply preprocessing, stemming, and lemmatization to the given text and report the results:
text = "Graph-based text mining involves representing text data as a graph and using graph algorithms to extract meaningful patterns."
13) Write a note on Good-Turing estimation. Take a corpus of 30,000 English words; our universe is all English words, and our event X is the word "unusualness". The word "unusualness" appears once, so N_x = 1. In a reasonable corpus you might have 10,000 different words that appear once, so E(1) = 10,000, and 3,000 words that appear twice, giving E(2) = 3,000. Identify the Good-Turing estimate for the word "unusualness".
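The arithmetic in this exercise can be checked directly: the Good-Turing adjusted count is r* = (r + 1) * E(r + 1) / E(r), and the probability estimate is r*/N. A sketch:

```python
def good_turing_prob(r, freq_of_freq, n_tokens):
    # r* = (r + 1) * E(r + 1) / E(r);  P_GT = r* / N
    r_star = (r + 1) * freq_of_freq[r + 1] / freq_of_freq[r]
    return r_star / n_tokens

# Numbers from the question: E(1) = 10000, E(2) = 3000, N = 30000
p = good_turing_prob(1, {1: 10000, 2: 3000}, 30000)
# r* = 2 * 3000 / 10000 = 0.6, so P = 0.6 / 30000 = 2e-05
print(p)
```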
14) Explain the various types of POS tagging. Apply POS tagging to the given text: "It took me more than two hours to translate a few pages of English."

MODULE-3
1) Explain the use of CFG in Natural Language Processing with suitable example
2) Explain Top-down & Bottom-up approach of parsing with suitable example.
3) Construct a parse tree for the following sentence using CFG rules: "The man read this book."
Rules:
S->NP VP
S->VP
NP->Det Nom
Nom->Noun
VP->Verb NP
Det->the | this
Noun->book | man
Verb->read | book

4) Compare Top-down & Bottom-up approach of parsing with suitable example.


5) Write a note on Probabilistic context free grammar with example.
6) Write a note on inside-outside probabilities.
7) With an example describe the following:
a) Computing Maximum likelihood
b) Model Parameter Estimation
8) Apply the CKY algorithm to the following grammar:
S → NP VP
NP → D N | Pro | PropN
D → PosPro | Art | NP ’s
VP → Vi | Vt NP | Vp NP VP
Pro → i | we | you | he | she | him | her
PosPro → my | our | your | his | her
PropN → Robin | Jo
Art → a | an | the
N → cat | dog | duck | saw | park | telescope | bench
Vi → sleep | run | duck
Vt → eat | break | see | saw
Vp → see | saw | heard
9) Explain dependency graphs with their formal conditions. Draw the dependency graph for the given text: "I prefer the morning flight through Denver."
10) Apply the CKY algorithm to the following CFG and check whether "baaba" is in L(G):
S->AB | BC
A->BA | a
B->CC | b
C->AB | a
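A generic CKY recognizer (sketch) can confirm the answer for this CNF grammar; running it shows that S appears in the top table cell, so "baaba" is in L(G):

```python
def cky_recognize(word, grammar, start="S"):
    # grammar: dict LHS -> list of RHS tuples, in Chomsky Normal Form
    n = len(word)
    table = [[set() for _ in range(n + 1)] for _ in range(n + 1)]
    for i, ch in enumerate(word):                 # length-1 spans (terminals)
        for lhs, rhss in grammar.items():
            if (ch,) in rhss:
                table[i][i + 1].add(lhs)
    for span in range(2, n + 1):                  # longer spans, bottom-up
        for i in range(n - span + 1):
            j = i + span
            for k in range(i + 1, j):             # split point
                for lhs, rhss in grammar.items():
                    for rhs in rhss:
                        if (len(rhs) == 2 and rhs[0] in table[i][k]
                                and rhs[1] in table[k][j]):
                            table[i][j].add(lhs)
    return start in table[0][n]

g = {"S": [("A", "B"), ("B", "C")],
     "A": [("B", "A"), ("a",)],
     "B": [("C", "C"), ("b",)],
     "C": [("A", "B"), ("a",)]}

print(cky_recognize("baaba", g))  # -> True
```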

MODULE-4
1) Explain Pointwise Mutual Information (PMI) along with its issues and variations.
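For reference, PMI compares the joint probability of two events with what independence would predict; a minimal sketch in bits (the probabilities below are made-up round numbers):

```python
import math

def pmi(p_xy, p_x, p_y):
    # PMI(x, y) = log2( p(x, y) / (p(x) * p(y)) )
    # > 0: x and y co-occur more than chance would predict; < 0: less
    return math.log2(p_xy / (p_x * p_y))

print(pmi(0.125, 0.25, 0.25))  # 0.125 / 0.0625 = 2, so PMI = 1.0 bit
```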
2) Illustrate Lexical semantics with example.
3) Write a note on homonymy and polysemy with examples.
4) Explain distributional algorithms and thesaurus-based algorithms with respect to WordNet.
5) Explain decision list algorithm.
6) Describe Lesk's and Walker's algorithms.
7) Explain Generative model for Latent Dirichlet Allocation.
8) Explain Gibbs sampling.
9) Describe Correlated Topic Model (CTM).
10) Illustrate the steps to implement the Gibbs sampling algorithm. Identify the joint probability if we have two variables (X, Y), each taking the values 0 and 1, and we want to sample from their joint distribution p(X, Y), defined as follows:
p(X=0, Y=0) = 0.2
p(X=1, Y=0) = 0.3
p(X=0, Y=1) = 0.1
p(X=1, Y=1) = 0.4
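A sketch of a two-variable Gibbs sampler for this exercise: the full conditionals are read off the joint table, and the empirical frequencies should approach the target joint (marginals: p(X=1) = 0.7, p(Y=1) = 0.5):

```python
import random

# Joint distribution p(X, Y) from the question, keyed by (x, y)
joint = {(0, 0): 0.2, (1, 0): 0.3, (0, 1): 0.1, (1, 1): 0.4}

def p_x1_given_y(y):
    # P(X=1 | Y=y) = p(1, y) / (p(0, y) + p(1, y))
    return joint[(1, y)] / (joint[(0, y)] + joint[(1, y)])

def p_y1_given_x(x):
    # P(Y=1 | X=x) = p(x, 1) / (p(x, 0) + p(x, 1))
    return joint[(x, 1)] / (joint[(x, 0)] + joint[(x, 1)])

def gibbs(n_samples, burn_in=1000, seed=0):
    rng = random.Random(seed)
    x = y = 0                     # arbitrary initial state
    out = []
    for t in range(burn_in + n_samples):
        x = 1 if rng.random() < p_x1_given_y(y) else 0   # resample X | Y
        y = 1 if rng.random() < p_y1_given_x(x) else 0   # resample Y | X
        if t >= burn_in:
            out.append((x, y))
    return out

samples = gibbs(20000)
print(sum(x for x, _ in samples) / len(samples))  # close to p(X=1) = 0.7
```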

MODULE-5
1) Write a note on Information Extraction.
2) Briefly explain different approaches to relation extraction.
3) Explain how Evaluation of Supervised Relation Extraction can be done.
4) Explain the different stages of Text summarization.
5) Write a note on Unsupervised content selection.
6) Write a note on Integer Linear Programming (ILP).
7) Explain Bayes’ rule for documents and classes.
8) Apply the Naïve Bayes classifier to the following example.

9) Define Sentiment Analysis. Describe how sentiment analysis can be used in movie
review.
10) Apply the Naïve Bayes classifier to the following example.
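A sketch of multinomial Naïve Bayes with add-one smoothing. The five tiny training reviews and their labels below are illustrative assumptions in the spirit of a standard textbook sentiment example, not the example referenced in the question (which is not included here):

```python
import math
from collections import Counter

# Illustrative labeled training data (assumed, not from the question)
train = [
    ("just plain boring", "neg"),
    ("entirely predictable and lacks energy", "neg"),
    ("no surprises and very few laughs", "neg"),
    ("very powerful", "pos"),
    ("the most fun film of the summer", "pos"),
]

class_words = {"neg": [], "pos": []}
for text, label in train:
    class_words[label].extend(text.split())

vocab = {w for words in class_words.values() for w in words}
priors = {c: sum(1 for _, l in train if l == c) / len(train)
          for c in class_words}

def log_posterior(text, c):
    counts = Counter(class_words[c])
    denom = len(class_words[c]) + len(vocab)    # add-one (Laplace) smoothing
    score = math.log(priors[c])
    for w in text.split():
        if w in vocab:                          # skip out-of-vocabulary words
            score += math.log((counts[w] + 1) / denom)
    return score

def classify(text):
    # Pick the class with the highest log posterior
    return max(class_words, key=lambda c: log_posterior(text, c))

print(classify("predictable with no fun"))  # -> neg
```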

11) Identify and explain the steps in Learning Sentiment Lexicons.
