
Unit 3: Knowledge Representation and Reasoning

3.1 Definition and Importance of Knowledge
Knowledge: A collection of information, facts, and principles acquired through observation, experience, and study; structured data that machines can use in reasoning, problem-solving, and decision-making.
Importance of Knowledge:
Decision Making: Informed decisions using past data and logical reasoning.
Problem Solving: Application of known principles and rules to new problems.
Efficiency: Improved efficiency and performance of AI systems.
Learning: Adaptation and learning from stored knowledge.
Communication: Knowledge sharing and transfer in collaborative systems.
Issues in Knowledge Representation:
Complexity: Capturing and organizing complex real-world information.
Incompleteness: Incomplete knowledge or missing crucial details.
Inconsistency: Conflicting information from different sources.
Scalability: Maintaining and updating growing knowledge bases.
Expressiveness vs. Efficiency: Balancing rich representation with computational efficiency.
Semantic Ambiguity: Ensuring clear and unambiguous knowledge representation.

Knowledge Representation Systems
A Knowledge Representation System (KRS) is a framework for encoding information about the world in a form that a computer system can use to solve complex tasks. Components of a KRS include:
Knowledge Base: A repository of facts, rules, and heuristics.
Inference Engine: The component that applies logical rules to the knowledge base to derive new information.
User Interface: Allows users to interact with the system and query the knowledge base.
Learning Module: Enables the system to update and expand its knowledge base over time.
Properties of Knowledge Representation Systems
Representational Adequacy: The ability to represent all types of knowledge needed.
Inferential Adequacy: The capability to derive new knowledge from the represented information.
Inferential Efficiency: The efficiency with which inferences can be made.
Acquisitional Efficiency: The ease with which new knowledge can be integrated into the system.

Types of Knowledge
Declarative Knowledge: Facts and information that describe the world. Example: "Paris is the capital of France."
Procedural Knowledge: Knowledge of how to perform tasks. Example: how to ride a bicycle.
Semantic Knowledge: General knowledge about the world, concepts, and categories. Example: understanding that a cat is a type of animal.
Episodic Knowledge: Knowledge of personal experiences and specific events. Example: "I visited Paris in 2019."

The Role of Knowledge
Foundation for AI: Knowledge is the cornerstone for developing intelligent systems that can mimic human cognition.
Problem-Solving: Facilitates logical reasoning and helps AI systems solve complex problems by applying known principles.
Learning: Provides the basis for machine learning, where systems use existing knowledge to learn from new data.
Communication: Essential for enabling intelligent agents to communicate and collaborate effectively by sharing knowledge.
Adaptation: Helps systems adapt to new environments and evolving situations by building on previous knowledge.

3.2 Knowledge Representation Techniques
In artificial intelligence, various techniques are used to represent knowledge. Each technique has its advantages and is suitable for different types of knowledge and applications.
Rule-Based Representation: Rule-based systems use a set of "if-then" rules to represent knowledge. These rules specify actions to be taken when certain conditions are met.
Components:
Rules: Conditional statements of the form "if (condition) then (action)".
Inference Engine: Applies the rules to known facts to infer new information.
Example: IF it is raining THEN carry an umbrella. IF it is cold THEN wear a jacket.
Advantages: Easy to understand and implement. Good for capturing expert knowledge and decision-making processes. Modular and easy to update with new rules.
Disadvantages: May become complex and unmanageable with a large number of rules. Difficult to handle exceptions and conflicting rules.
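The rule-based example above can be sketched in a few lines of Python. This is a minimal illustration, not a full expert-system shell; the rule and fact strings are the ones from the example.

```python
# Minimal rule-based system: each rule is a (premises, conclusion) pair,
# and the inference engine fires every rule whose premises hold.
rules = [
    ({"it is raining"}, "carry an umbrella"),
    ({"it is cold"}, "wear a jacket"),
]

def apply_rules(facts, rules):
    """Return the set of facts extended with every fired conclusion."""
    derived = set(facts)
    changed = True
    while changed:                      # repeat until no rule adds anything new
        changed = False
        for premises, conclusion in rules:
            if premises <= derived and conclusion not in derived:
                derived.add(conclusion)
                changed = True
    return derived

print(apply_rules({"it is raining"}, rules))
```

Given the fact "it is raining", only the umbrella rule fires, so the derived set contains "carry an umbrella" but not "wear a jacket".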
Semantic Networks
Definition: A semantic network represents knowledge as a graph of nodes connected by edges. Nodes represent concepts or entities, and edges represent relationships between them.
Components:
Nodes: Represent objects, concepts, or entities.
Edges: Represent relationships or associations between nodes.
Example:
(Node: "Cat") -- (Edge: "is a") --> (Node: "Animal")
(Node: "Cat") -- (Edge: "has") --> (Node: "Tail")
Advantages: Intuitive visualization of relationships between concepts. Good for representing hierarchical structures and taxonomies. Facilitates inheritance of properties in hierarchical structures.
Disadvantages: Can become complex with large networks. Not suitable for representing procedural knowledge.
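The Cat/Animal network above can be represented directly as a Python dict of labeled edges; the extra "LivingThing" node is an illustrative addition to show transitive "is a" inheritance.

```python
# Semantic network as a dict of labeled edges: node -> list of (relation, node).
network = {
    "Cat": [("is a", "Animal"), ("has", "Tail")],
    "Animal": [("is a", "LivingThing")],   # illustrative extra node
}

def is_a(network, node, category):
    """Follow 'is a' edges transitively to test category membership."""
    for relation, target in network.get(node, []):
        if relation == "is a":
            if target == category or is_a(network, target, category):
                return True
    return False

print(is_a(network, "Cat", "LivingThing"))  # inherited via Animal -> True
```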
Frames
Frames are data structures for representing stereotyped situations. They group related information into slots (attributes) and their values.
Components:
Frames: Structures that represent objects or concepts.
Slots: Attributes or properties of the frames.
Values: Values assigned to the slots.
Example:
Frame: Cat
  is-a: Animal
  has: Tail
Advantages: Organizes knowledge into structured units, making it easier to manage. Supports inheritance, where frames can inherit properties from parent frames. Flexible and extensible.
Disadvantages: Can be less efficient for certain types of reasoning. Complex relationships between frames can be difficult to manage.
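A frame system with slot inheritance can be sketched with plain dicts. The slot names and values ("legs", "sound") are illustrative additions to demonstrate how a child frame inherits from its parent.

```python
# Frames as dicts of slots; a frame may name a parent via the "is-a" slot,
# and slot lookup falls back to the parent (property inheritance).
frames = {
    "Animal": {"alive": True, "legs": 4},              # illustrative slots
    "Cat":    {"is-a": "Animal", "sound": "meow", "has": "Tail"},
}

def get_slot(frames, frame, slot):
    """Look up a slot, inheriting from parent frames when it is missing."""
    data = frames[frame]
    if slot in data:
        return data[slot]
    if "is-a" in data:                  # delegate to the parent frame
        return get_slot(frames, data["is-a"], slot)
    return None

print(get_slot(frames, "Cat", "legs"))   # inherited from Animal -> 4
```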
Logic-Based Representation
Logic-based representation uses formal logic to represent knowledge. It employs logical statements and inference rules to express facts and relationships.
Components:
Propositions: Statements that can be true or false.
Predicates: Functions that express properties or relationships.
Inference Rules: Logical rules used to derive conclusions from premises.
Example:
Fact: Loves(John, Mary)
Rule: ∀x ∀y (Loves(x, y) ∧ Loves(y, x) → Friends(x, y))
Advantages: Precise and unambiguous representation. Strong foundation in mathematical logic, allowing for rigorous reasoning. Well-suited for complex problem-solving and formal proofs.
Disadvantages: Can be computationally intensive and difficult to scale. Requires expertise in formal logic to create and interpret.

Summary
Each knowledge representation technique has its strengths and is suitable for different applications:
Rule-Based Systems: Ideal for decision-making processes and expert systems.
Semantic Networks: Useful for representing relationships and hierarchical structures.
Frames: Suitable for structured, stereotype-based knowledge representation.
Logic-Based Representation: Best for formal reasoning and complex problem-solving tasks.
Selecting the appropriate technique depends on the nature of the knowledge and the requirements of the specific AI application.
3.3 Propositional Logic
Syntax and Semantics of Propositional Logic
Syntax:
Propositions: Basic statements that can be true or false, typically represented by symbols like p, q, r.
Logical Connectives: Negation (¬): not. Conjunction (∧): and. Disjunction (∨): or. Implication (→): if... then. Biconditional (↔): if and only if.
Formulas: Constructed from propositions and connectives. Examples: p, ¬p, p ∧ q, p ∨ q, p → q.
Semantics:
Truth Values: Each proposition can be either true (T) or false (F).
Truth Tables: Used to define the truth value of a compound formula based on the truth values of its components.
Example truth table for p ∧ q:
p  q  p ∧ q
T  T  T
T  F  F
F  T  F
F  F  F
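The truth table for p ∧ q can be generated mechanically by enumerating every assignment of truth values; a short sketch:

```python
from itertools import product

# Enumerate all truth assignments of p and q and print the table for p AND q.
def fmt(b):
    return "T" if b else "F"

rows = []
print("p  q  p∧q")
for p, q in product([True, False], repeat=2):
    rows.append((p, q, p and q))
    print(fmt(p), fmt(q), fmt(p and q), sep="  ")
```

Swapping `p and q` for `p or q` or `(not p) or q` yields the tables for disjunction and implication in the same way.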
Proof by Resolution
Resolution: A rule of inference used in propositional logic and first-order logic. It is based on the principle that if A ∨ B is true and ¬A ∨ C is true, then B ∨ C must also be true.
Steps in Proof by Resolution:
1. Convert all formulas to Conjunctive Normal Form (CNF): CNF is a conjunction of disjunctions of literals.
2. Apply the resolution rule: Repeatedly apply the resolution rule to pairs of clauses to derive new clauses.
3. Check for contradiction: If a contradiction (an empty clause) is derived, the original set of clauses is unsatisfiable.

Conjunctive Normal Form (CNF)
Definition: A formula is in CNF if it is a conjunction (AND) of one or more clauses, where each clause is a disjunction (OR) of literals.
Conversion to CNF:
1. Eliminate biconditionals and implications: A ↔ B becomes (A → B) ∧ (B → A); A → B becomes ¬A ∨ B.
2. Move negations inward using De Morgan's laws: ¬(A ∧ B) becomes ¬A ∨ ¬B; ¬(A ∨ B) becomes ¬A ∧ ¬B.
3. Distribute disjunction over conjunction so that the resulting formula is a conjunction of disjunctions.

Resolution Algorithm
Input: A set of clauses in CNF.
Initialization: Add the goal (the negation of the statement to be proved) to the set.
Resolution: Apply the resolution rule to pairs of clauses to derive new clauses. Add the derived clauses to the set. Check whether the empty clause is derived.
Output: If the empty clause is derived, the set of clauses is unsatisfiable, so the negated goal contradicts the knowledge base and the original statement is proved.
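The resolution algorithm above can be sketched for propositional clauses, with a clause represented as a frozenset of literal strings ("p" or "~p"). This is a naive, unoptimized version for illustration.

```python
from itertools import combinations

# Propositional resolution by refutation on clauses in CNF.
def negate(lit):
    return lit[1:] if lit.startswith("~") else "~" + lit

def resolve(c1, c2):
    """All resolvents of two clauses (one per complementary literal pair)."""
    resolvents = []
    for lit in c1:
        if negate(lit) in c2:
            resolvents.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return resolvents

def unsatisfiable(clauses):
    """Return True iff the clause set derives the empty clause."""
    clauses = set(clauses)
    while True:
        new = set()
        for c1, c2 in combinations(clauses, 2):
            for r in resolve(c1, c2):
                if not r:               # empty clause: contradiction found
                    return True
                new.add(r)
        if new <= clauses:              # no new clauses: no contradiction
            return False
        clauses |= new

# Prove q from (p) and (~p ∨ q) by refuting the negated goal (~q).
kb = [frozenset({"p"}), frozenset({"~p", "q"}), frozenset({"~q"})]
print(unsatisfiable(kb))  # True, so q follows
```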
Limitations of Propositional Logic
Expressiveness: Cannot represent relationships between objects or generalizations about them.
Scalability: As the number of propositions increases, the size of truth tables and the complexity of resolution grow exponentially.
Lack of Quantifiers: Cannot express statements involving "all" or "some" (quantifiers).
Forward and Backward Chaining
Forward Chaining:
Method: Start with known facts and apply inference rules to derive new facts until the goal is reached.
Process:
1. Initialize with a set of known facts.
2. Apply rules whose premises match the known facts.
3. Add the conclusions of the rules to the set of known facts.
4. Repeat until the goal is derived or no new facts can be inferred.
Backward Chaining:
Method: Start with the goal and work backward, looking for rules that could conclude the goal.
Process:
1. Initialize with the goal.
2. Find rules whose conclusion matches the goal.
3. Add the premises of these rules as new sub-goals.
4. Repeat until all sub-goals are known facts or cannot be broken down further.
Comparison:
Forward Chaining: Data-driven, suitable for situations where all data is available from the start.
Backward Chaining: Goal-driven, suitable for diagnostic tasks where the goal is known but the necessary data must be derived.
These techniques and concepts are fundamental to reasoning in AI and form the basis for more advanced logical reasoning systems.
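Both chaining strategies can be sketched over simple definite rules (premises → conclusion). The rule set and fact names (A, B, C, D) are illustrative; the backward chainer assumes the rules are acyclic.

```python
# Forward and backward chaining over definite rules (premises -> conclusion).
rules = [({"A", "B"}, "C"), ({"C"}, "D")]

def forward_chain(facts, rules, goal):
    """Data-driven: grow the fact set until the goal appears or nothing changes."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for premises, conclusion in rules:
            if premises <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return goal in facts

def backward_chain(facts, rules, goal):
    """Goal-driven: recursively reduce the goal to known facts."""
    if goal in facts:
        return True
    return any(all(backward_chain(facts, rules, p) for p in premises)
               for premises, conclusion in rules if conclusion == goal)

print(forward_chain({"A", "B"}, rules, "D"))   # True
print(backward_chain({"A", "B"}, rules, "D"))  # True
```

Both calls prove D from A and B via C; the forward chainer derives every consequence, while the backward chainer only explores sub-goals of D.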
3.4 Predicate Logic and First-Order Predicate Logic (FOPL)
Predicate Logic: Extends propositional logic with predicates, which can express relationships between objects, and with quantifiers to handle variables.
First-Order Predicate Logic (FOPL)
Definition: FOPL, also known as First-Order Logic (FOL), allows the expression of statements involving objects, predicates, and quantifiers. It provides a more expressive framework than propositional logic.
Syntax of FOPL
Components:
Constants: Symbols that represent specific objects (e.g., a, b).
Variables: Symbols that can represent any object (e.g., x, y).
Predicates: Functions that return true or false, representing relationships or properties (e.g., P(x), Loves(x, y)).
Functions: Map objects to objects (e.g., fatherOf(x)).
Logical Connectives: As in propositional logic (¬, ∧, ∨, →, ↔).
Quantifiers:
Universal Quantifier (∀): Indicates that a statement applies to all objects in the domain (e.g., ∀x P(x)).
Existential Quantifier (∃): Indicates that a statement applies to at least one object in the domain (e.g., ∃x P(x)).
Examples:
∀x (Human(x) → Mortal(x))
∃x (Human(x) ∧ Loves(x, IceCream))
Semantics of FOPL
Interpretation:
Domain: The set of objects being discussed.
Interpretation Function: Assigns meanings to constants, functions, and predicates.
Truth Assignment: Determines the truth value of predicates based on the interpretation function.
Quantification
Universal Quantification (∀): Example: ∀x (P(x) → Q(x)) means "for all x, if P(x) then Q(x)."
Existential Quantification (∃): Example: ∃x (P(x) ∧ Q(x)) means "there exists an x such that P(x) and Q(x)."
Horn Clauses
A Horn clause is a special type of clause with at most one positive literal. It is used extensively in logic programming and automated reasoning.
Types:
Definite Clause: A Horn clause with exactly one positive literal (e.g., P(x) ∨ ¬Q(x) ∨ ¬R(x)).
Fact: A definite clause with no negative literals (e.g., P(a)).
Rule: A definite clause with one positive literal and one or more negative literals (e.g., P(x) ∨ ¬Q(x)).
Inference with FOPL
Conversion to Propositional Logic:
Existential Instantiation: Replace an existentially quantified variable with a new constant. Example: ∃x P(x) becomes P(c), where c is a new constant.
Universal Instantiation: Replace a universally quantified variable with any ground term. Example: ∀x P(x) can become P(a), where a is a constant.
Rules of Inference
Common Rules:
Modus Ponens: If P and P → Q are true, then Q is true.
Modus Tollens: If ¬Q and P → Q are true, then ¬P is true.
Universal Instantiation: From ∀x P(x), infer P(a) for any constant a.
Existential Instantiation: From ∃x P(x), infer P(c) for a new constant c.
Unification and Lifting
Unification: The process of finding a substitution that makes different logical expressions identical. Example: Unifying P(x, y) with P(a, b) gives the substitution {x/a, y/b}.
Lifting: Extends the concept of unification to handle quantified variables and more complex expressions in FOPL.
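The unification example above (P(x, y) with P(a, b) giving {x/a, y/b}) can be sketched as a small recursive function. Terms are written as tuples like ("P", "x", "y"); the convention that lowercase letters in VARS are variables is an assumption of this sketch, and the occurs-check is omitted for brevity.

```python
# Unification of first-order expressions written as tuples, e.g. ("P", "x", "y")
# for P(x, y). Symbols listed in VARS act as variables; everything else
# (predicate names, constants) must match exactly.
VARS = set("xyzuvw")

def unify(t1, t2, subst=None):
    """Return a substitution dict that makes t1 and t2 identical, or None."""
    if subst is None:
        subst = {}
    if t1 == t2:
        return subst
    if isinstance(t1, str) and t1 in VARS:
        return unify_var(t1, t2, subst)
    if isinstance(t2, str) and t2 in VARS:
        return unify_var(t2, t1, subst)
    if isinstance(t1, tuple) and isinstance(t2, tuple) and len(t1) == len(t2):
        for a, b in zip(t1, t2):
            subst = unify(a, b, subst)
            if subst is None:
                return None
        return subst
    return None

def unify_var(var, term, subst):
    """Bind var to term, respecting any existing binding."""
    if var in subst:
        return unify(subst[var], term, subst)
    return {**subst, var: term}

print(unify(("P", "x", "y"), ("P", "a", "b")))  # {'x': 'a', 'y': 'b'}
```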
CNF for FOPL
Conversion Steps:
1. Eliminate biconditionals and implications.
2. Move negations inward using De Morgan's laws.
3. Standardize variables apart (rename variables to avoid conflicts).
4. Skolemization: Replace existential quantifiers with Skolem functions.
5. Drop universal quantifiers (they are assumed implicitly).
6. Distribute disjunction over conjunction to obtain CNF.

Inference Using Resolution
Resolution Refutation System (RRS):
Principle: To prove a statement by refutation, assume the negation of the statement and show that it leads to a contradiction.
Steps:
1. Negate the statement to be proved and convert it to CNF.
2. Combine it with the knowledge base (also in CNF).
3. Apply the resolution rule iteratively to derive new clauses.
4. Check for an empty clause, indicating a contradiction.
Example:
Knowledge Base:
∀x (Human(x) → Mortal(x)) becomes ¬Human(x) ∨ Mortal(x)
Human(Socrates)
Negate Goal: To prove Mortal(Socrates), negate it to ¬Mortal(Socrates).
Convert and Resolve:
Resolve ¬Mortal(Socrates) with ¬Human(Socrates) ∨ Mortal(Socrates) to derive ¬Human(Socrates).
Resolve ¬Human(Socrates) with Human(Socrates) to derive the empty clause, proving the original statement by contradiction.

3.5 Handling Uncertain Knowledge
In real-world scenarios, knowledge is often uncertain due to incomplete information or inherent randomness. Handling uncertainty is crucial for making informed decisions and predictions in AI systems.
Random Variables
A random variable is a variable that can take on different values, each associated with a probability. There are two types of random variables:
Discrete Random Variables: Can take on a finite or countably infinite set of values. Example: the outcome of rolling a die (1, 2, 3, 4, 5, 6).
Continuous Random Variables: Can take on any value within a range. Example: the height of a person.
Prior and Posterior Probability
Prior Probability (P(A)): The initial probability of an event A before any additional information is considered. Example: the probability that it will rain today based on historical data.
Posterior Probability (P(A|B)): The probability of an event A given that another event B has occurred; it is updated based on new evidence. Example: the probability that it will rain today given that the sky is cloudy.
Inference Using Full Joint Distribution
The full joint probability distribution represents the probability of every possible combination of variables. For variables X1, X2, …, Xn, it is given by P(X1, X2, …, Xn).
Inference: To find the probability of a specific event, sum the probabilities of all relevant outcomes. Example: for variables X and Y, to find P(X = x), sum P(X = x, Y = y) over all y:
P(X = x) = Σ_y P(X = x, Y = y)
Bayes' Rule and Its Use
Bayes' Rule relates the conditional and marginal probabilities of random variables. It is expressed as:
P(A|B) = P(B|A) · P(A) / P(B)
Use: Bayes' Rule is used to update the probability of a hypothesis A based on new evidence B. Example: to find the probability of a disease D given a positive test result T:
P(D|T) = P(T|D) · P(D) / P(T)
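The disease/test example can be worked numerically. The probabilities below (prevalence, sensitivity, false-positive rate) are illustrative values, not from the notes; P(T) is obtained by marginalizing over D, exactly as in the full-joint-distribution section.

```python
# Bayes' rule for P(D|T); all numbers are illustrative.
p_d = 0.01              # prior P(D): disease prevalence
p_t_given_d = 0.95      # sensitivity P(T|D)
p_t_given_not_d = 0.05  # false-positive rate P(T|~D)

# Marginalize to get P(T), then apply Bayes' rule for the posterior P(D|T).
p_t = p_t_given_d * p_d + p_t_given_not_d * (1 - p_d)
p_d_given_t = p_t_given_d * p_d / p_t
print(round(p_d_given_t, 3))  # 0.161
```

Even with a sensitive test, the posterior is only about 16% because the prior is small; this is the classic base-rate effect that Bayes' rule makes explicit.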
Bayesian Networks
Definition: A Bayesian Network is a graphical model that represents the probabilistic relationships among a set of variables. Nodes represent random variables, and edges represent conditional dependencies.
Components:
Nodes: Represent random variables.
Edges: Directed edges indicate conditional dependencies.
Conditional Probability Tables (CPTs): Each node has a CPT that quantifies the effect of its parent nodes.
Example: A simple Bayesian Network for diagnosing a disease:
[Flu] → [Fever]
[Flu] → [Cough]
where "Flu" is the parent node, and "Fever" and "Cough" are conditionally dependent on "Flu".
Reasoning in Bayesian Networks
Inference: The process of calculating the probability of a query variable given some evidence. There are two main methods:
Exact Inference:
Enumeration: Sum over all possible values of hidden variables.
Variable Elimination: Systematically eliminate variables by summing out their probabilities.
Approximate Inference:
Sampling Methods: Use random samples to estimate probabilities (e.g., Monte Carlo methods).
Example: To find the probability of having the flu given a fever, P(Flu|Fever):
Use the CPTs to find the joint probability of "Flu" and "Fever".
Normalize so that the probabilities sum to 1.
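Inference by enumeration in the Flu → Fever, Flu → Cough network can be sketched directly from the CPTs; the CPT numbers here are illustrative, not from the notes. Cough is the hidden variable that gets summed out, and the final normalization is the step described above.

```python
# Inference by enumeration in the Flu -> Fever, Flu -> Cough network.
# CPT numbers are illustrative.
p_flu = {True: 0.1, False: 0.9}
p_fever_given_flu = {True: 0.8, False: 0.1}   # P(Fever=true | Flu)
p_cough_given_flu = {True: 0.7, False: 0.2}   # P(Cough=true | Flu)

def joint(flu, fever, cough):
    """P(Flu, Fever, Cough) from the CPTs via the chain rule."""
    p = p_flu[flu]
    p *= p_fever_given_flu[flu] if fever else 1 - p_fever_given_flu[flu]
    p *= p_cough_given_flu[flu] if cough else 1 - p_cough_given_flu[flu]
    return p

def p_flu_given_fever():
    """Sum out the hidden variable Cough, then normalize over Flu."""
    scores = {flu: sum(joint(flu, True, cough) for cough in (True, False))
              for flu in (True, False)}
    total = sum(scores.values())
    return scores[True] / total

print(round(p_flu_given_fever(), 3))  # 0.471
```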
