AI 2
1. Knowledge refers to the body of information, understanding, and skills acquired through
experience or education.
2. Knowledge can be represented in machines using various techniques such as rules, frames,
semantic networks, logic, and ontologies.
3. A good knowledge representation should satisfy the following requirements:
- Structure: The knowledge should be organized in a structured manner for efficient storage and
retrieval.
- Methods: There should be appropriate methods for reasoning, inference, and manipulation of
the represented knowledge.
- Size: The representation should be compact and efficient to handle large amounts of
knowledge.
4. First-Order Predicate Logic (FOPL) can represent knowledge using predicates, constants,
variables, logical connectives, and quantifiers. For example, the statement "All humans are
mortal" can be represented as ∀x (Human(x) → Mortal(x)).
5. Predicates are relations or properties that can be either true or false for a given set of
arguments.
7. Total-Order Planners consider all possible sequences of actions to find a solution, while
Partial-Order Planners represent only the constraints between actions, allowing more flexibility
in the order of actions.
- Variables (e.g., x, y, z)
- Logical connectives (e.g., ∧ (and), ∨ (or), ¬ (not), → (implies), ↔ (if and only if))
- Quantifiers (e.g., ∀ (for all), ∃ (there exists))
```
p q | p→q | ¬q | ¬p | ¬q→¬p | (p→q)→(¬q→¬p)
T T |  T  |  F |  F |   T   |       T
T F |  F  |  T |  F |   F   |       T
F T |  T  |  F |  T |   T   |       T
F F |  T  |  T |  T |   T   |       T
```
Since the final column is all True, the statement (p → q) → (¬q → ¬p) is valid (a tautology).
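The truth-table check above can also be reproduced programmatically. This is a minimal sketch in which `implies` is a helper defined here for readability, not a library function:

```python
from itertools import product

def implies(a, b):
    # Material implication: a -> b is false only when a is true and b is false.
    return (not a) or b

def is_tautology():
    # Evaluate (p -> q) -> (~q -> ~p) for every assignment of p and q.
    return all(
        implies(implies(p, q), implies(not q, not p))
        for p, q in product([True, False], repeat=2)
    )

print(is_tautology())  # True: the statement holds in all four rows
```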
10. The sentence "A Computer system is intelligent if it can perform a task which, if performed
by a human, requires intelligence" can be represented in Predicate Logic as:
11. Given:
1) ∀x (Man(x) → Mortal(x)) [Premise]
2) Man(Socrates) [Premise]
To infer that "Socrates is mortal," we use the rules of Universal Instantiation and Modus
Ponens:
3) Man(Socrates) → Mortal(Socrates) [Universal Instantiation of 1]
4) Mortal(Socrates) [Modus Ponens on 2 and 3]
12. Quantifiers are logical operators used in Predicate Logic to indicate the scope or generality
of a statement. The two types of quantifiers are:
- Universal quantifier (∀): Indicates that a statement holds true for all instances of a variable. For
example, ∀x (Human(x) → Mortal(x)) means "For all x, if x is a human, then x is mortal."
- Existential quantifier (∃): Indicates that a statement holds true for at least one instance of a
variable. For example, ∃x (Human(x) ∧ Wise(x)) means "There exists an x such that x is a human
and x is wise."
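Over a small finite domain the two quantifiers correspond directly to `all()` and `any()`; the domain and property sets below are invented purely for illustration:

```python
# Finite-domain illustration of quantifiers (names and properties are made up).
domain = ["socrates", "plato", "fido"]
human = {"socrates", "plato"}
mortal = {"socrates", "plato", "fido"}
wise = {"socrates"}

# Universal: for all x, Human(x) -> Mortal(x)
forall_holds = all((x not in human) or (x in mortal) for x in domain)

# Existential: there exists an x with Human(x) and Wise(x)
exists_holds = any((x in human) and (x in wise) for x in domain)

print(forall_holds, exists_holds)  # True True
```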
13. The statement "Some boys play cricket" can be represented as:
∃x (Boy(x) ∧ PlaysCricket(x))
14. First-Order Logic (FOL) is a formal system used to represent and reason about objects, their
properties, and their relationships. It extends Propositional Logic by introducing predicates,
constants, variables, and quantifiers. FOL allows for more expressive representation of
knowledge and enables reasoning about specific instances and their relationships.
15. Learning refers to the process of acquiring knowledge, skills, or behaviors through
experience, study, or instruction.
16. Supervised learning and unsupervised learning are two main categories of machine
learning:
- Supervised learning: The algorithm learns from labeled training data, where the correct
outputs (labels) are provided. The goal is to learn a mapping function from input data to output
labels. Examples include classification and regression tasks.
- Unsupervised learning: The algorithm learns from unlabeled data, where no target outputs are
provided. The goal is to find patterns, structures, or relationships within the data. Examples
include clustering and dimensionality reduction tasks.
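As a concrete (if toy) instance of supervised learning, the sketch below classifies a point with a 1-nearest-neighbour rule; the training points and labels are made up for illustration:

```python
# Labelled training data: (point, label) pairs invented for this sketch.
train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"), ((5.0, 5.0), "B"), ((4.8, 5.2), "B")]

def predict(point):
    def dist2(p, q):
        # Squared Euclidean distance (square root not needed for comparison).
        return (p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2
    # Return the label of the closest training example.
    return min(train, key=lambda ex: dist2(ex[0], point))[1]

print(predict((1.1, 0.9)))  # "A": nearest neighbours are the A-labelled points
print(predict((5.1, 4.9)))  # "B"
```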
17. Reinforcement learning is a type of machine learning where an agent learns to make
decisions by interacting with an environment and receiving rewards or penalties for its actions.
The agent's goal is to learn a policy (a mapping from states to actions) that maximizes the
expected cumulative reward over time.
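The reward-driven loop described above can be sketched with tabular Q-learning on a toy five-state corridor; the environment, rewards, and hyperparameters are all invented for this example:

```python
import random

# Toy environment: states 0..4, reward 1 for reaching state 4 from state 0.
n_states, actions = 5, [-1, +1]      # move left / move right
Q = {(s, a): 0.0 for s in range(n_states) for a in actions}
alpha, gamma, epsilon = 0.5, 0.9, 0.2

random.seed(0)
for _ in range(500):                  # training episodes
    s = 0
    while s != 4:
        # Epsilon-greedy action selection.
        a = random.choice(actions) if random.random() < epsilon \
            else max(actions, key=lambda a: Q[(s, a)])
        s2 = min(max(s + a, 0), n_states - 1)
        r = 1.0 if s2 == 4 else 0.0
        # Q-learning update: move Q(s,a) toward r + gamma * max_a' Q(s',a').
        Q[(s, a)] += alpha * (r + gamma * max(Q[(s2, b)] for b in actions) - Q[(s, a)])
        s = s2

# The learned policy should prefer moving right (+1) in every non-terminal state.
policy = [max(actions, key=lambda a: Q[(s, a)]) for s in range(4)]
print(policy)
```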
18. Applications of reinforcement learning include:
- Recommendation systems
19. In a decision tree, an attribute is selected as a node based on its ability to provide the most
information gain or the highest reduction in entropy (impurity) for classifying the instances.
20. Information Gain is a metric used in decision tree algorithms to measure the effectiveness
of an attribute in splitting the training data based on the reduction in entropy or impurity. It
quantifies the expected decrease in impurity or uncertainty after splitting the data based on a
given attribute.
21. To create a decision tree using the ID3 algorithm for a given dataset, follow these steps:
1) Calculate the entropy of the entire dataset with respect to the class labels.
2) For each attribute, calculate the entropy after splitting the data based on that attribute.
3) Calculate the Information Gain for each attribute by subtracting the weighted average entropy
after the split from the original entropy.
4) Select the attribute with the highest Information Gain as the root node.
5) Repeat steps 2-4 recursively for each branch of the tree until all instances are classified or no
further splits are possible.
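Steps 1-3 above rest on entropy and Information Gain, which can be computed as follows; the toy rows and labels are invented for illustration:

```python
from math import log2
from collections import Counter

def entropy(labels):
    # H = -sum(p_i * log2(p_i)) over the class proportions.
    total = len(labels)
    return -sum((n / total) * log2(n / total) for n in Counter(labels).values())

def information_gain(rows, attr_index, labels):
    # Group the labels by the attribute's value.
    subsets = {}
    for row, label in zip(rows, labels):
        subsets.setdefault(row[attr_index], []).append(label)
    # Weighted average entropy after the split, subtracted from the original.
    after = sum(len(sub) / len(labels) * entropy(sub) for sub in subsets.values())
    return entropy(labels) - after

rows = [("sunny",), ("sunny",), ("rain",), ("rain",)]
labels = ["no", "no", "yes", "yes"]
print(information_gain(rows, 0, labels))  # 1.0: the split separates the classes perfectly
```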
22. Market Basket Analysis (MBA) works by analyzing transactional data, such as customer
purchases, to identify patterns and associations between items frequently bought together. It
helps retailers understand customer buying behaviors and make informed decisions about
product placement, promotions, and cross-selling opportunities.
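The basic quantities behind Market Basket Analysis, support and confidence, can be computed directly from transactions; the baskets below are invented for illustration:

```python
from itertools import combinations
from collections import Counter

# Toy transactional data: each basket is a set of items.
transactions = [
    {"bread", "milk"},
    {"bread", "butter"},
    {"bread", "milk", "butter"},
    {"milk", "butter"},
]

pair_counts = Counter()
item_counts = Counter()
for t in transactions:
    for item in t:
        item_counts[item] += 1
    for pair in combinations(sorted(t), 2):
        pair_counts[pair] += 1

n = len(transactions)
# Support: fraction of baskets containing both items.
support = pair_counts[("bread", "milk")] / n
# Confidence of bread -> milk: P(milk | bread).
confidence = pair_counts[("bread", "milk")] / item_counts["bread"]
print(support, confidence)  # 0.5 and about 0.667
```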
- Sequence Analysis: Analyzing the order in which items are purchased to identify temporal
patterns.
24. Dependency refers to a direct relationship between two variables, where the value of one
variable depends on the value of the other. Conditional dependency refers to a relationship
between two variables that is dependent on the value of a third variable.
25. A Bayesian network provides a compact and intuitive representation of a joint probability
distribution over a set of random variables. It models the conditional dependencies between
variables using a directed acyclic graph (DAG) and provides a framework for probabilistic
reasoning and inference.
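A minimal example is a two-node network Rain → WetGrass, whose joint distribution factorises along the DAG as P(R) · P(W | R); the probabilities below are invented for this sketch:

```python
# Conditional probability tables for the toy network (numbers are made up).
P_rain = {True: 0.2, False: 0.8}
P_wet_given_rain = {True: {True: 0.9, False: 0.1},
                    False: {True: 0.1, False: 0.9}}

def joint(rain, wet):
    # The DAG structure gives: P(R, W) = P(R) * P(W | R).
    return P_rain[rain] * P_wet_given_rain[rain][wet]

# Inference by enumeration: P(Rain = True | WetGrass = True).
p_wet = joint(True, True) + joint(False, True)
p_rain_given_wet = joint(True, True) / p_wet
print(round(p_rain_given_wet, 3))  # 0.692
```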
26. Examples of common machine learning algorithms include:
- Supervised Learning: Decision Trees, Naive Bayes, Support Vector Machines, Linear/Logistic
Regression, Neural Networks
- Unsupervised Learning: K-Means Clustering, Hierarchical Clustering, Principal Component
Analysis (PCA)
- Sentiment analysis
- Text classification
- Recommendation systems
28. An expert system is a computer program that uses knowledge from human experts to solve
complex problems in a specific domain. It is designed to mimic the decision-making ability and
reasoning process of human experts.
29. A rule-based system is a type of expert system that uses a set of rules or knowledge base to
perform reasoning and make decisions. These rules are typically represented in an IF-THEN
format and are derived from human expertise.
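The IF-THEN cycle of a rule-based system can be sketched as a tiny forward-chaining loop; the medical-style facts and rules below are invented for illustration:

```python
# Each rule: IF all antecedents are known facts THEN add the consequent.
rules = [
    ({"has_fever", "has_rash"}, "suspect_measles"),
    ({"suspect_measles"}, "recommend_specialist"),
]
facts = {"has_fever", "has_rash"}

changed = True
while changed:  # keep firing rules until no new fact can be derived
    changed = False
    for antecedents, consequent in rules:
        if antecedents <= facts and consequent not in facts:
            facts.add(consequent)
            changed = True

print(sorted(facts))
```

Note how the second rule fires only after the first has added its conclusion: this chaining from facts toward conclusions is the core of the inference engine.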
30. Natural Language Processing (NLP) is a branch of artificial intelligence that deals with the
interaction between computers and humans using natural languages (spoken or written). The
main types of NLP tasks include:
- Speech Recognition
- Machine Translation
- Text Summarization
- Sentiment Analysis
- Question Answering
31. Some key challenges in Natural Language Processing include:
- Idioms and Figurative Language: Idioms and metaphors can be difficult for machines to
understand.
- Context Sensitivity: The meaning of words and phrases can depend heavily on the context.
- Varying Grammar Rules: Different languages have different grammar rules, making it
challenging to develop universal NLP systems.
- Lack of Common Sense Knowledge: Machines often lack the common sense knowledge that
humans possess.
32. Examples of well-known expert systems include:
- MYCIN: A medical expert system for diagnosing and treating infectious diseases.
- DENDRAL: An expert system for analyzing organic compounds using mass spectrometry data.
- XCON: An expert system used by Digital Equipment Corporation for configuring computer
systems.
33. The main components of an expert system are:
- Knowledge Base: Stores the domain-specific facts, rules, and heuristics acquired from human
experts.
- Inference Engine: Uses the knowledge base and applies reasoning techniques to solve
problems or draw conclusions.
- User Interface: Allows users to interact with the system, provide input, and receive output.
- Explanation Facility: Provides explanations and justifications for the system's decisions or
recommendations.
- Knowledge Acquisition Module: Facilitates the process of acquiring knowledge from human
experts and updating the knowledge base.
34. Robots are programmable machines capable of performing complex tasks with a high
degree of accuracy and precision. They are typically designed to replicate or assist human
actions.
35. The main components of a robot include:
- Control System: The "brain" of the robot, responsible for processing instructions and
controlling the robot's movements.
- Sensors: Devices that collect data from the environment, such as cameras, infrared sensors,
or proximity sensors.
- Actuators: Components that convert energy into motion, such as motors, hydraulic pistons, or
grippers.
- Power Source: Provides the necessary energy for the robot to function, such as batteries or a
power supply.
- End Effectors: Tools or devices attached to the robot's arm or body for performing specific
tasks, such as grippers, welding tools, or painting tools.
36. Actuators in robotics are devices that convert energy into motion or force, enabling robots to
move and interact with their environment. Common types of actuators include:
- Electric Actuators: Motors (e.g., DC, servo, stepper) that produce rotary or linear motion.
- Hydraulic Actuators: Use pressurized fluid to generate large forces.
- Pneumatic Actuators: Use compressed air for fast, lightweight motion.
- Piezoelectric Actuators: Use the piezoelectric effect to generate precise, small-scale motion.
- Shape Memory Alloy Actuators: Change shape when heated or cooled, allowing for controlled
motion.
37. Natural Language Processing (NLP) is a branch of artificial intelligence that deals with the
interaction between computers and humans using natural languages (spoken or written). It
focuses on developing systems and algorithms that can understand, interpret, and generate
human language.
38. Real-world applications of NLP include:
- Virtual Assistants (e.g., Siri, Alexa, Google Assistant): These assistants use NLP to understand
and respond to voice commands and queries in natural language.
- Sentiment Analysis: NLP is used to analyze text data (e.g., customer reviews, social media
posts) and determine the underlying sentiment, emotion, or opinion expressed.
39. Stop words in NLP are common words (e.g., "the," "is," "and," "or") that are filtered out during
text preprocessing because they typically do not carry much meaningful information for
analysis or processing.
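Stop-word removal is a one-line filter once a stop-word list is chosen; the small list below is hand-picked for illustration (libraries such as NLTK ship much fuller lists):

```python
# A tiny hand-picked stop-word list, for illustration only.
stop_words = {"the", "is", "and", "or", "a", "of"}

text = "the cat and the dog are friends"
tokens = text.lower().split()          # naive whitespace tokenization
filtered = [w for w in tokens if w not in stop_words]
print(filtered)  # ['cat', 'dog', 'are', 'friends']
```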
40. NLTK (Natural Language Toolkit) is a suite of libraries and programs for working with human
language data in the Python programming language. It provides easy-to-use interfaces for tasks
such as text preprocessing, tokenization, stemming, tagging, parsing, and semantic reasoning.
41. Syntactic Analysis in NLP refers to the process of analyzing the structural or grammatical
aspects of language. It involves breaking down sentences into their constituent parts (e.g.,
nouns, verbs, adjectives) and understanding the relationships between them based on the rules
of grammar.
42. Semantic Analysis in NLP involves understanding the meaning and interpretation of
language constructs, beyond just their syntactic structure. It aims to capture the intended
meaning, context, and relationships between words and concepts in natural language.