Chapter 2 AI
Topics to be covered
• Agents and environments
• Rationality & Omniscience
• Environment types
• Agent types
• Learning agents
Agents
How do you design an intelligent agent?
• An agent is anything that can be viewed as perceiving its environment through sensors and acting upon that environment through actuators/effectors.
• A discrete agent receives percepts one at a time, and maps this percept sequence to a sequence of discrete actions.
• Properties
  – Autonomous
  – Reactive to the environment
  – Pro-active (goal-directed)
  – Interacts with other agents via the environment

Agents and environments
• The agent function maps from percept histories to actions: [f: P* → A]
• The agent program runs on the physical architecture to produce f
• agent = architecture + program
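
As a concrete illustration of the mapping f: P* → A, here is a minimal Python sketch (the vacuum-world agent, its percepts, and its actions are illustrative assumptions, not part of these slides): the agent program stores the percept history and returns an action for each new percept.

from typing import List, Tuple

Percept = Tuple[str, str]   # e.g. (location, status) -- an assumed encoding
Action = str

class VacuumAgent:
    """Agent program realizing an agent function f: P* -> A."""

    def __init__(self) -> None:
        self.percept_history: List[Percept] = []   # everything perceived so far

    def __call__(self, percept: Percept) -> Action:
        self.percept_history.append(percept)
        location, status = percept
        # A trivial agent function: clean if dirty, otherwise move on.
        if status == "Dirty":
            return "Suck"
        return "Right" if location == "A" else "Left"

agent = VacuumAgent()          # the program half of agent = architecture + program
print(agent(("A", "Dirty")))   # -> Suck
print(agent(("A", "Clean")))   # -> Right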
Example
– You are walking along the road to Adama; you see an old friend across the street. There is no traffic.
– So, being rational, you start to cross the street.
– Meanwhile a big banner falls off from above and, before you finish crossing the road, you are flattened.
Were you irrational to cross the street?

Rationality
• This points out that rationality is concerned with expected success, given what has been perceived.
• Crossing the street was rational, because most of the time the crossing would be successful, and there was no way you could have foreseen the falling banner.
• In summary, what is rational at any given point depends on four things:
  a) Everything that the agent has perceived so far
  b) What the agent knows about the environment
  c) The actions that the agent can perform
  d) The performance measure that defines degrees of success

• Therefore, a system is not autonomous if it is guided by prior decisions made by its designer.
• To survive, agents must have:
  – Enough built-in knowledge to survive.
  – The ability to learn.
PEAS
• PEAS stands for: Performance measure, Environment, Actuators, Sensors
• Must first specify the setting for intelligent agent design.
• Consider, e.g., the task of designing an automated taxi driver:
  – Performance measure:
    • Safe, fast, legal, comfortable trip, maximize profits
  – Environment:
    • Roads, other traffic, pedestrians, customers
  – Actuators:
    • Steering wheel, accelerator, brake, signal, horn
  – Sensors:
    • Cameras, sonar, speedometer, GPS, odometer, engine sensors, keyboard
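
One way to make a PEAS description concrete is to record it as a small data structure. The PEAS class below is only an organizational sketch (its name and fields are assumptions); the values are the automated-taxi entries from this slide.

from dataclasses import dataclass
from typing import List

@dataclass
class PEAS:
    """PEAS specification of a task environment."""
    performance_measure: List[str]
    environment: List[str]
    actuators: List[str]
    sensors: List[str]

taxi_driver = PEAS(
    performance_measure=["safe", "fast", "legal", "comfortable trip", "maximize profits"],
    environment=["roads", "other traffic", "pedestrians", "customers"],
    actuators=["steering wheel", "accelerator", "brake", "signal", "horn"],
    sensors=["cameras", "sonar", "speedometer", "GPS", "odometer",
             "engine sensors", "keyboard"],
)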
PEAS…
• Agent:
  – Medical diagnosis system
• Performance measure:
  – Healthy patient, minimize costs, lawsuits
• Environment:
  – Patient, hospital, staff
• Actuators:
  – Screen display (questions, tests, diagnoses, treatments, referrals)
• Sensors:
  – Keyboard (entry of symptoms, findings, patient's answers)

PEAS…
• Agent:
  – Interactive English tutor
• Performance measure:
  – Maximize student's score on test
• Environment:
  – Set of students
• Actuators:
  – Screen display (exercises, suggestions, corrections)
• Sensors:
  – Keyboard
Simple reflex agent
[Figure: simple reflex agent — sensors give "What the world is like now"; condition-action rules select "What action I should do now"; effectors act on the Environment]
• E.g., for a percept ("red light" in a traffic system), the agent does not keep track of the percept sequence; it just looks at the current percept.
• Has a component to extract features.
• There is no access to the complete state of the world.
• Works only if the correct decision can be made on the basis of the current percept.
• Problems
  – Still usually too big to generate and to store
  – Still no knowledge of non-perceptual parts of state
  – Still not adaptive to changes in the environment; requires the collection of rules to be updated if changes occur
  – Still can't make actions conditional on previous state

function SIMPLE-REFLEX-AGENT(percept) returns action
  static: rules, a set of condition-action rules
  state ← INTERPRET-INPUT(percept)
  rule ← RULE-MATCH(state, rules)
  action ← RULE-ACTION[rule]
  return action
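
A runnable Python sketch of the SIMPLE-REFLEX-AGENT pseudocode above. The traffic-light rules and the feature-extraction step are illustrative assumptions; the structure (interpret the percept, match a rule, return its action) follows the pseudocode.

from typing import Dict

# Condition-action rules: state description -> action (assumed example rules).
RULES: Dict[str, str] = {
    "red light": "brake",
    "green light": "go",
}

def interpret_input(percept: str) -> str:
    """Feature-extraction component: reduce the raw percept to a state description."""
    return percept.strip().lower()

def rule_match(state: str, rules: Dict[str, str]) -> str:
    """Return the action of the rule whose condition matches the state."""
    return rules.get(state, "no-op")

def simple_reflex_agent(percept: str) -> str:
    """Acts on the current percept only: no percept history, no internal state."""
    state = interpret_input(percept)
    action = rule_match(state, RULES)
    return action

print(simple_reflex_agent("Red light"))    # -> brake
print(simple_reflex_agent("Green light"))  # -> go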
Reflex agent with internal state
[Figure: the stored internal state and the model of "what my actions do" are combined with the current percept; condition-action rules then select "What action I should do now", which the effectors carry out in the Environment]
• The condition-action rules are matched against the current situation (as defined by the percept and the stored internal state).
  – If the car is a recent model, there is a centrally mounted brake light. With older models there is no centrally mounted brake light, so the agent may get confused: is it a parking light? Is it a brake light? Is it a turn-signal light?
  – Some sort of internal state should be maintained in order to choose an action.
  – The camera should detect when the two red lights at the edge of the vehicle go ON or OFF simultaneously.
• The driver should look in the rear-view mirror to check on the location of nearby vehicles. In order to decide on a lane change, the driver needs to know whether or not they are there. The driver sees, combines this with the already-stored information, and then does the action associated with the matching rule.

function REFLEX-AGENT-WITH-STATE(percept) returns action
  static: state, a description of the current world state
          rules, a set of condition-action rules
  state ← UPDATE-STATE(state, percept)
  rule ← RULE-MATCH(state, rules)
  action ← RULE-ACTION[rule]
  state ← UPDATE-STATE(state, action)
  return action
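
A Python sketch of REFLEX-AGENT-WITH-STATE. The brake-light example from this slide is used as the (assumed) rule: the internal state is updated from the percept, a rule is matched against the resulting situation, and the state is updated again with the chosen action.

from typing import Any, Dict

class ReflexAgentWithState:
    """Keeps an internal model of the world and matches rules against it."""

    def __init__(self) -> None:
        self.state: Dict[str, Any] = {"car_ahead_braking": False}

    def update_state_with_percept(self, percept: Dict[str, Any]) -> None:
        # Infer braking when both edge red lights go ON simultaneously
        # (works even for older models with no centrally mounted brake light).
        self.state["car_ahead_braking"] = bool(
            percept.get("left_red_on") and percept.get("right_red_on")
        )

    def rule_match(self) -> str:
        return "initiate-braking" if self.state["car_ahead_braking"] else "keep-driving"

    def __call__(self, percept: Dict[str, Any]) -> str:
        self.update_state_with_percept(percept)
        action = self.rule_match()
        self.state["last_action"] = action   # UPDATE-STATE(state, action)
        return action

agent = ReflexAgentWithState()
print(agent({"left_red_on": True, "right_red_on": True}))    # -> initiate-braking
print(agent({"left_red_on": False, "right_red_on": False}))  # -> keep-driving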
Goal-based agent
[Figure: the model of "what my actions do" predicts "What it will be like if I do action A"; the Goals then determine "What action I should do now", carried out by the effectors in the Environment]
• Involves consideration of the future:
  – Knowing about the current state of the environment is not always enough to decide what to do. For example, at a road junction, the taxi can turn left, turn right, or go straight.
  – The right decision depends on where the taxi is trying to get to. As well as a current state description, the agent needs some sort of goal information, which describes situations that are desirable, e.g., being at the passenger's destination.
• The agent may need to consider long sequences, twists and turns, to find a way to achieve the goal.
• Decision making of this kind is fundamentally different from the condition-action rules described earlier. It involves:
  1. What will happen if I take such-and-such an action?
  2. Will that enable me to reach the goal?
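
A minimal goal-based agent sketch in Python: it answers "What will happen if I take such-and-such an action?" with an assumed one-step transition model and picks an action whose predicted result satisfies the goal. The junction and destination names are made up for illustration; a real agent may have to search over long action sequences.

from typing import Dict, Optional, Tuple

# Assumed one-step model of "what my actions do": (state, action) -> next state.
TRANSITIONS: Dict[Tuple[str, str], str] = {
    ("junction", "turn-left"): "market",
    ("junction", "turn-right"): "station",
    ("junction", "go-straight"): "airport",
}

def predict(state: str, action: str) -> Optional[str]:
    """What it will be like if I do action A."""
    return TRANSITIONS.get((state, action))

def goal_based_agent(state: str, goal: str) -> Optional[str]:
    """Choose an action whose predicted outcome satisfies the goal."""
    for action in ("turn-left", "turn-right", "go-straight"):
        if predict(state, action) == goal:
            return action
    return None   # no single action reaches the goal; would need a longer plan

print(goal_based_agent("junction", "station"))   # -> turn-right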
Utility-based agent
[Figure: the model of how the world evolves and of "what my actions do" predicts "What it will be like if I do action A"; a Utility function scores "How happy I will be in such a state"; the agent then selects "What action I should do now" and acts through its effectors on the Environment]
• For example, there are many action sequences that will get the taxi to its destination, thereby achieving the goal. Some are quicker, safer, more reliable, or cheaper than others. We need to consider speed and safety.
• When there are several goals that the agent can aim for, none of which can be achieved with certainty, utility provides a way in which the likelihood of success can be weighed up against the importance of the goals.
• An agent that possesses an explicit utility function can make rational decisions.
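
A sketch of a utility-based agent in Python: instead of asking only whether the goal (reaching the destination) is achieved, it scores each predicted outcome with a utility function that trades speed off against safety. The routes, numbers, and weights are assumptions chosen only to show the idea.

from typing import Dict, Tuple

# Assumed routes to the destination: action -> (expected time in minutes, safety in [0, 1]).
ROUTES: Dict[str, Tuple[float, float]] = {
    "highway": (20.0, 0.90),
    "back-roads": (35.0, 0.99),
    "city-centre": (30.0, 0.95),
}

def utility(time_min: float, safety: float) -> float:
    """How happy I will be in such a state: weigh speed against safety."""
    return -0.05 * time_min + 5.0 * safety

def utility_based_agent() -> str:
    """Pick the action whose predicted outcome has the highest utility."""
    return max(ROUTES, key=lambda action: utility(*ROUTES[action]))

print(utility_based_agent())   # -> highway, with these particular weights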
Thank You!
?