Assignment-I
Set 3
9. What are feedforward neural networks, and how do they differ from traditional
machine learning models?
10. Differentiate between unigrams, bigrams, and trigrams with appropriate examples.
11. Describe the encoder-decoder architecture in NLP and its typical applications.
12. Explain the concept of TF-IDF and its role in weighting terms in a document. (See the sketch after this set.)
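
For question 12, a minimal TF-IDF sketch in Python; the corpus and function names are illustrative, not from any particular library:

    import math

    # Toy corpus: each document is a list of tokens (illustrative).
    docs = [["the", "cat", "sat"], ["the", "dog", "sat"], ["the", "cat", "ran"]]

    def tf_idf(term, doc, corpus):
        # Term frequency: share of the document taken up by the term.
        tf = doc.count(term) / len(doc)
        # Inverse document frequency: terms in fewer documents score higher.
        df = sum(1 for d in corpus if term in d)
        idf = math.log(len(corpus) / df)
        return tf * idf

    print(tf_idf("cat", docs[0], docs))  # ~0.135: "cat" is somewhat distinctive
    print(tf_idf("the", docs[0], docs))  # 0.0: "the" appears in every document

Real systems usually log-scale the term frequency and smooth the idf, but the weighting idea is the same.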
Set 4
13. What is CKY parsing? Outline its key steps and applications in syntactic analysis.
14. How can N-gram models be used for autocomplete features in modern applications? (See the sketch after this set.)
15. Compare feedforward neural networks with RNNs in the context of NLP applications.
16. What are Recurrent Neural Networks (RNNs), and how do they function in language
modeling?
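
For question 14, a bigram-count autocomplete sketch; the corpus is a toy example, and production systems add smoothing and longer context:

    from collections import Counter, defaultdict

    # Tiny training corpus (illustrative).
    corpus = "i like tea . i like coffee . we like tea .".split()

    # counts[w1][w2] = C(w1 w2), the number of times w2 follows w1.
    counts = defaultdict(Counter)
    for w1, w2 in zip(corpus, corpus[1:]):
        counts[w1][w2] += 1

    def suggest(prev_word, k=2):
        # Rank candidate next words by how often they followed prev_word.
        return [w for w, _ in counts[prev_word].most_common(k)]

    print(suggest("like"))  # ['tea', 'coffee']: "tea" follows "like" most often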
Set 5
17. Write a regular expression to identify dates in the format "DD-MM-YYYY". (See the sketch after this set.)
18. How does the training process of a neural network differ when applied to NLP tasks
compared to other domains?
19. Provide an example of how the Edit Distance algorithm can correct spelling errors. (See the sketch after this set.)
20. Describe the process of smoothing in N-gram models and explain why it is necessary.
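
For question 17, one possible answer in Python; this is a sketch that checks the digit pattern and plausible day/month ranges, not month lengths or leap years:

    import re

    # DD-MM-YYYY: day 01-31, month 01-12, four-digit year.
    # The \b anchors stop matches inside longer digit runs.
    pattern = re.compile(r"\b(0[1-9]|[12][0-9]|3[01])-(0[1-9]|1[0-2])-(\d{4})\b")

    print(pattern.findall("Due 05-11-2024, not 99-99-9999."))  # [('05', '11', '2024')]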
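For question 19, a minimal Levenshtein (edit) distance sketch: a spell checker can suggest the dictionary word closest to the misspelling. The dictionary here is illustrative:

    def edit_distance(a, b):
        # Dynamic program: d[i][j] = edits to turn a[:i] into b[:j].
        d = [[0] * (len(b) + 1) for _ in range(len(a) + 1)]
        for i in range(len(a) + 1):
            d[i][0] = i
        for j in range(len(b) + 1):
            d[0][j] = j
        for i in range(1, len(a) + 1):
            for j in range(1, len(b) + 1):
                cost = 0 if a[i - 1] == b[j - 1] else 1
                d[i][j] = min(d[i - 1][j] + 1,         # deletion
                              d[i][j - 1] + 1,         # insertion
                              d[i - 1][j - 1] + cost)  # substitution
        return d[-1][-1]

    dictionary = ["spelling", "splitting", "peeling"]
    print(min(dictionary, key=lambda w: edit_distance("speling", w)))  # 'spelling'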
Set 6
21. Explain the process of training a neural network for text classification. (See the sketch after this set.)
22. What are some real-world applications of neural language models in modern
technology?
23. What is Part-of-Speech (POS) tagging, and why is it important in NLP?
24. How do activation functions influence the performance of neural networks in NLP?
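
For question 21, a compact end-to-end sketch, assuming scikit-learn is installed; the data, labels, and layer size are illustrative:

    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.neural_network import MLPClassifier

    # Toy labeled data: 1 = positive, 0 = negative (illustrative).
    texts = ["great movie", "awful film", "loved it", "hated it"]
    labels = [1, 0, 1, 0]

    # Bag-of-words features feed a small feedforward network; fit() runs
    # backpropagation on a cross-entropy-style loss.
    vec = CountVectorizer()
    X = vec.fit_transform(texts)
    clf = MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000).fit(X, labels)

    print(clf.predict(vec.transform(["great film"])))  # likely [1]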
Set 7
25. Compare and contrast the use of TF-IDF and Word2Vec in text analysis.
26. How are RNNs utilized for tasks beyond language modeling in NLP?
27. Define vector semantics and explain how words are represented as vectors in NLP.
28. What is the role of attention mechanisms in neural language models? (See the sketch just below.)
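
For question 28, scaled dot-product attention, the core of most modern attention mechanisms, in plain NumPy; shapes and data are illustrative:

    import numpy as np

    def attention(Q, K, V):
        # softmax(Q K^T / sqrt(d)) V: each output row is a weighted
        # average of the value rows, weighted by query-key similarity.
        scores = Q @ K.T / np.sqrt(K.shape[-1])
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights = weights / weights.sum(axis=-1, keepdims=True)
        return weights @ V

    rng = np.random.default_rng(0)
    Q = K = V = rng.normal(size=(3, 4))  # 3 positions, dimension 4
    print(attention(Q, K, V).shape)  # (3, 4)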
Set 8
29. Discuss the vanishing gradient problem in RNNs and potential solutions.
30. How are feedforward neural networks applied in language modeling?
31. What is smoothing in N-gram models, and why is it important?
32. Explain how cosine similarity helps in measuring document similarity. (See the sketch just below.)
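
For question 32, cosine similarity over toy term-count vectors; the vocabulary and counts are illustrative:

    import numpy as np

    def cosine_similarity(u, v):
        # cos(theta) = (u . v) / (|u| |v|); 1 means same direction.
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

    # Term counts for two documents over a shared 4-word vocabulary.
    doc1 = np.array([2, 1, 0, 1])
    doc2 = np.array([1, 1, 0, 1])
    print(cosine_similarity(doc1, doc2))  # ~0.94: the documents overlap heavily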
Set 9
33. What are N-gram Language Models? How do they estimate the probability of word sequences? (The standard formulas follow this set.)
34. How do neural networks handle the representation of context in natural language?
35. Explain the concept of embeddings in neural networks and their role in representing
words and phrases.
36. What are the limitations of feedforward networks for language modeling, and how
do RNNs address these?
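
For question 33 (and the smoothing questions 20 and 31), the standard bigram formulas, written in LaTeX:

    % Maximum-likelihood bigram estimate from corpus counts:
    P(w_i \mid w_{i-1}) = \frac{C(w_{i-1}\, w_i)}{C(w_{i-1})}

    % Bigram approximation to the probability of a word sequence:
    P(w_1 \dots w_n) \approx \prod_{i=1}^{n} P(w_i \mid w_{i-1})

    % Laplace (add-one) smoothing, with V the vocabulary size, so unseen
    % bigrams get a small nonzero probability instead of zero:
    P_{\text{Laplace}}(w_i \mid w_{i-1}) = \frac{C(w_{i-1}\, w_i) + 1}{C(w_{i-1}) + V}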
Set 10
37. Describe the structure of words and corpora, and their role in NLP tasks.
38. Explain the concept of word embeddings in neural networks and their advantages
over traditional methods.
39. Provide a detailed explanation of how an RNN processes sequential data in text. (See the sketch after this set.)
40. Discuss the challenges of lexical and structural ambiguity in NLP, providing examples.
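
For question 39, a minimal Elman-style RNN loop in NumPy; the weights are random stand-ins for learned parameters:

    import numpy as np

    # One recurrent step: h_t = tanh(W_xh x_t + W_hh h_{t-1} + b).
    rng = np.random.default_rng(0)
    d_in, d_hid = 5, 8  # input and hidden sizes (illustrative)
    W_xh = rng.normal(scale=0.1, size=(d_hid, d_in))
    W_hh = rng.normal(scale=0.1, size=(d_hid, d_hid))
    b = np.zeros(d_hid)

    h = np.zeros(d_hid)  # initial hidden state
    sequence = rng.normal(size=(4, d_in))  # 4 time steps, e.g. word vectors
    for x_t in sequence:
        # The same weights are reused at every step; h carries the context.
        h = np.tanh(W_xh @ x_t + W_hh @ h + b)

    print(h.shape)  # (8,): a fixed-size summary of the whole sequence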