Unit II
states that the problem can be in. In search algorithms, we use a state space to represent the current state of the problem, the initial state, and the goal state, and each state in the state space is represented by a set of variables.
State space search is a method widely used in artificial intelligence and computer science to find a solution to a problem by searching through its set of possible states. A state space search algorithm uses the state space to navigate from the initial state to the goal state.
Depth-First Search (DFS) and Breadth-First Search (BFS) are two fundamental graph
traversal algorithms, each with its own set of merits and use cases:
3. **Useful for Pathfinding:** DFS can be adapted for pathfinding algorithms like
finding all paths between two nodes, topological sorting, and cycle detection.
4. **Space Complexity:** In the worst case, DFS can have a high space complexity if
the graph is very deep, potentially leading to a stack overflow.
3. **Memory Intensive:** BFS typically uses more memory than DFS because it
needs to maintain a queue of all nodes at the current frontier.
4. **Useful for Shortest Path and Connectivity:** BFS finds the shortest path (by
number of edges) in unweighted graphs and identifies connected components;
Dijkstra's algorithm generalizes the same idea to weighted graphs.
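As a concrete illustration of these trade-offs, here is a minimal Python sketch; the graph and node names are hypothetical. DFS is used to enumerate all paths between two nodes, while BFS returns a shortest path by edge count.

```python
from collections import deque

def dfs_paths(graph, start, goal, path=None):
    """Depth-first search: follows one branch fully before backtracking.
    Yields every path from start to goal (one of DFS's pathfinding uses)."""
    if path is None:
        path = [start]
    if start == goal:
        yield path
        return
    for neighbor in graph.get(start, []):
        if neighbor not in path:  # avoid revisiting nodes on the current path
            yield from dfs_paths(graph, neighbor, goal, path + [neighbor])

def bfs_shortest_path(graph, start, goal):
    """Breadth-first search: explores level by level using a queue, so the
    first path that reaches the goal is a shortest one (by edge count)."""
    queue = deque([[start]])
    visited = {start}
    while queue:
        path = queue.popleft()
        node = path[-1]
        if node == goal:
            return path
        for neighbor in graph.get(node, []):
            if neighbor not in visited:
                visited.add(neighbor)
                queue.append(path + [neighbor])
    return None

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"]}
print(list(dfs_paths(graph, "A", "E")))    # all paths from A to E
print(bfs_shortest_path(graph, "A", "E"))  # one shortest path
```

Note how DFS stores only the current path (memory proportional to depth), while BFS's queue can hold an entire level of the graph at once.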
Natural language processing is required when you want an intelligent system, such as a robot, to act on your instructions, or when you want to hear a decision from a dialogue-based clinical expert system, and so on.
- **Functionality:** TNs are primarily used for syntactic parsing, where each node
represents a grammatical category and each arc represents an allowable transition
between categories.
- **Limitation:** Transition Networks are not very expressive and are typically
used for simple syntactic parsing tasks.
- **Use Cases:** RTNs have been used in the past for parsing natural language
with more complex syntactic structures.
- **Use Cases:** ATNs have been used in various NLP applications, including
natural language understanding, dialogue systems, and machine translation.
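To make the idea of a transition network concrete, here is a minimal, hypothetical sketch in Python: states are nodes, arcs are labeled with grammatical categories, and a sentence is accepted if its category sequence reaches a final state. The states, categories, and accepted patterns are illustrative assumptions, not a real parser.

```python
# Arcs of a tiny transition network: (state, category) -> next state.
TRANSITIONS = {
    ("S0", "noun"): "S1",   # subject
    ("S1", "verb"): "S2",   # main verb
    ("S2", "det"): "S3",    # optional determiner
    ("S2", "noun"): "SF",   # direct object
    ("S3", "noun"): "SF",
}
FINAL_STATES = {"SF"}

def accepts(categories):
    """Run a category sequence (e.g. from a POS tagger) through the network."""
    state = "S0"
    for cat in categories:
        state = TRANSITIONS.get((state, cat))
        if state is None:       # no arc for this category: reject
            return False
    return state in FINAL_STATES

print(accepts(["noun", "verb", "det", "noun"]))  # "Tom ate an apple" pattern
print(accepts(["verb", "noun"]))                 # no subject: rejected
```

Because plain TNs have no recursion or registers, networks like this can only handle flat patterns, which is exactly the limitation that RTNs and ATNs address.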
Parsing is the process of analyzing a sentence based on a given grammar to determine
its syntactic structure. In this case, we will use top-down parsing to analyze the
sentence "Tom ate an apple" based on the provided grammar rules:
Grammar Rules:
1. sentence → noun phrase, verb phrase
2. noun phrase → proper noun
3. noun phrase → determiner, noun
4. verb phrase → verb, noun phrase
5. proper noun → [Tom]
6. noun → [apple]
7. verb → [ate]
8. determiner → [an]
Now, let's parse the sentence "Tom ate an apple" using top-down parsing:
1. sentence → noun phrase, verb phrase (rule 1)
2. noun phrase → proper noun → Tom (rules 2 and 5), matching "Tom"
3. verb phrase → verb, noun phrase (rule 4), with verb → ate (rule 7), matching "ate"
4. noun phrase → determiner, noun (rules 3, 8, and 6), matching "an apple"
The entire sentence "Tom ate an apple" has been successfully parsed based on the
provided grammar, and it is valid according to the grammar rules.
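The top-down parse can be sketched as a small recursive-descent parser; this is a minimal illustration hard-wired to the eight rules above, not a general parsing framework.

```python
def parse_sentence(words):
    """Rule 1: sentence -> noun phrase, verb phrase.
    Each parse_* function consumes words and returns the remainder,
    or None if its rule does not match."""
    rest = parse_noun_phrase(words)
    if rest is not None:
        rest = parse_verb_phrase(rest)
    return rest == []  # valid only if every word was consumed

def parse_noun_phrase(words):
    # Rule 2 + 5: noun phrase -> proper noun -> [Tom]
    if words and words[0] == "Tom":
        return words[1:]
    # Rule 3 + 8 + 6: noun phrase -> determiner, noun -> [an] [apple]
    if len(words) >= 2 and words[0] == "an" and words[1] == "apple":
        return words[2:]
    return None

def parse_verb_phrase(words):
    # Rule 4 + 7: verb phrase -> verb, noun phrase, with verb -> [ate]
    if words and words[0] == "ate":
        return parse_noun_phrase(words[1:])
    return None

print(parse_sentence("Tom ate an apple".split()))  # -> True
```

Each function mirrors one grammar rule, which is the defining feature of top-down parsing: the parser starts from the "sentence" goal and expands rules until it reaches the words.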
1. **Performance Measure:** Define how the success of the task will be measured.
This could be a quantitative measure or a subjective evaluation.
4. **Sensors:** Specify the sensors or means through which the agent perceives the
environment and gathers information.
5. **Goals:** Define the goals and objectives of the task. What does the agent need
to achieve or accomplish?
In the A* search algorithm, we use a search heuristic as well as the cost to reach the node. Hence we can combine both costs as f(n) = g(n) + h(n), where g(n) is the cost to reach node n from the start and h(n) is the heuristic estimate of the cost from n to the goal; this sum is called the fitness number.
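A minimal A* sketch in Python, assuming a hypothetical weighted graph and an admissible heuristic table h; the fitness number f(n) = g(n) + h(n) orders the priority queue.

```python
import heapq

def a_star(graph, h, start, goal):
    """A* search: expand the frontier node with the smallest
    f(n) = g(n) + h(n), where g(n) is the path cost so far and
    h(n) is the heuristic estimate to the goal."""
    frontier = [(h[start], 0, start, [start])]  # (f, g, node, path)
    best_g = {start: 0}
    while frontier:
        f, g, node, path = heapq.heappop(frontier)
        if node == goal:
            return path, g
        for neighbor, cost in graph.get(node, []):
            new_g = g + cost
            if new_g < best_g.get(neighbor, float("inf")):
                best_g[neighbor] = new_g
                heapq.heappush(
                    frontier,
                    (new_g + h[neighbor], new_g, neighbor, path + [neighbor]),
                )
    return None, float("inf")

# Hypothetical weighted graph and admissible heuristic values.
graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 2)]}
h = {"A": 3, "B": 2, "C": 2, "D": 0}
print(a_star(graph, h, "A", "D"))  # -> (['A', 'B', 'C', 'D'], 4)
```

With an admissible heuristic (h never overestimates), A* is guaranteed to return an optimal path; the heuristic simply steers the search toward the goal sooner than uniform-cost search would.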
The first phase of NLP is Lexical Analysis. This phase scans the input text as a stream of characters and converts it into meaningful lexemes. It divides the whole text into paragraphs, sentences, and words.
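A minimal sketch of this phase using simple regular expressions; the splitting rules are illustrative assumptions, and real NLP pipelines use proper tokenizers.

```python
import re

def lexical_analysis(text):
    """Split raw text into sentences, then each sentence
    into word tokens (lexemes)."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    return [re.findall(r"[A-Za-z']+", s) for s in sentences]

tokens = lexical_analysis("Tom ate an apple. It was red!")
print(tokens)  # -> [['Tom', 'ate', 'an', 'apple'], ['It', 'was', 'red']]
```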
Syntactic Analysis is used to check grammar and word arrangement, and to show the
relationships among the words.
For example, the sentence "Agra goes to the Poonam" does not make sense in the real
world, so it is rejected by the syntactic analyzer.
3. Semantic Analysis
4. Discourse Integration
Discourse Integration depends upon the sentences that precede a given sentence and
also invokes the meaning of the sentences that follow it.
5. Pragmatic Analysis
Pragmatic Analysis is the fifth and last phase of NLP. It helps you to discover the
intended effect by applying a set of rules that characterize cooperative dialogues.
NLP stands for Natural Language Processing, a field at the intersection of computer
science, linguistics, and artificial intelligence. It is the technology used by
machines to understand, analyse, manipulate, and interpret human languages. It
helps developers organize knowledge for tasks such as translation,
automatic summarization, Named Entity Recognition (NER), speech recognition,
relationship extraction, and topic segmentation.
**Transformational Grammars:**
- Transformational grammars, introduced by Noam Chomsky, focus on
transformations applied to sentences to generate different grammatical forms.
- These grammars are part of generative grammar theory and played a role in early AI
research.
- Transformational rules describe how sentences can be transformed into equivalent or
related sentences, aiding in language generation and transformation tasks.
1. **Case Grammar:**
- Case Grammar, developed by Fillmore in the 1960s, focuses on the roles that
words play in sentences and their relationships to one another.
- It introduces the concept of "cases" to represent the grammatical functions of
words within a sentence, such as the agent, patient, and instrument.
- Case Grammar aims to provide a deeper understanding of sentence structure and
meaning by emphasizing the roles played by words in context.
2. **Frame Semantics:**
- Frame Semantics is another linguistic theory developed by Fillmore, emphasizing
the importance of semantic frames or structured mental representations.
- It proposes that words and phrases are associated with frames, which are
knowledge structures that capture the meaning and context of linguistic expressions.
- Frame Semantics is particularly relevant in understanding how words and phrases
derive their meanings from the situations or contexts in which they are used.
**Sentence Generation:**
- Sentence generation involves creating coherent and meaningful sentences using
computer programs or models.
- It can be used in chatbots, content creation, and automatic report generation.
- Techniques like template-based generation, rule-based generation, and neural
language models are used.
- The goal is to produce text that sounds natural and contextually relevant.
- Sentence generation is essential in natural language processing and artificial
intelligence applications.
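A minimal sketch of template-based generation, one of the techniques listed above; the templates and word lists are hypothetical.

```python
import random

# Each template has named slots; each slot is filled from a word list.
TEMPLATES = ["The {adjective} {noun} {verb}.", "A {noun} {verb} {adverb}."]
SLOTS = {
    "adjective": ["quick", "lazy"],
    "noun": ["robot", "parser"],
    "verb": ["runs", "stops"],
    "adverb": ["quietly", "suddenly"],
}

def generate(rng=random):
    """Pick a template at random and fill its slots with random words.
    str.format ignores slots the chosen template does not mention."""
    template = rng.choice(TEMPLATES)
    return template.format(**{slot: rng.choice(words) for slot, words in SLOTS.items()})

print(generate())  # e.g. "The quick robot runs."
```

Template-based generation is simple and fully controllable but rigid; rule-based grammars and neural language models trade that control for greater variety and fluency.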
**Translation:**
- Translation is the process of converting text from one language to another.
- Machine translation uses algorithms and models to perform this task.
- It helps people understand and communicate in different languages.
- Popular examples include Google Translate.
- Quality varies depending on language pairs and context, with neural machine
translation offering significant improvements in recent years.