
Robot Software Design

Part 1

Dr Samir M. Falih
Introduction
Every robot system can be conceptually split into six distinct parts:
• Environment
• Sensors
• Sensing
• Planning
• Acting
• Actuators.

The figure below illustrates how the six parts belong
together and how information flows through the
robot system from left to right.

Types of Robots – Software Architecture
• Robots will have different sizes, perform different functions and show more or less internal structure, but they will still be easy to recognize.
• The next figure shows three different robot types and two different emphases on subsystems.
• The important part is to realize that widely different
types of robots can be seen and structured in a
common way

Environment
• The environment is the first part because the robot
system needs to acquire information about the
environment in order to compute the next action
towards achieving the intended task.
• The environment is the last part because once
information has been processed and the next
action is executed the state of the environment is
modified via this action.
• Once the sequence environment, sensors, sensing, planning, acting, actuators and environment is finished, it starts again from the beginning.
• A common way to talk about this circular structure
is to state that the environment closes the loop of
the robot system.

• This loop does not lead to the same result over and over again, because sense() does not return the same state each time and because act() does not lead to the same effect each time, even if the parameter is identical.
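To make this loop concrete, here is a minimal Python sketch; sense(), plan() and act() stand for the sensing, planning and acting parts, and all names are illustrative placeholders rather than a real robot API.

```python
def run_robot(robot, objective):
    """Minimal sketch of the closed sense-plan-act loop (all names are placeholders)."""
    while not objective.achieved():
        state = robot.sense()                  # sensing: raw sensor data -> meaningful state
        action = robot.plan(state, objective)  # planning: state + objective -> next action
        robot.act(action)                      # acting: action -> actuator commands
        # The actuators alter the environment, so the next call to sense()
        # will generally return a different state: the environment closes the loop.
```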

Sensors
• Sensors are a robot’s input devices.
• They measure various physical quantities in the
environment and make these measurements
available as data to the sensing part of the robot
software system.
• Sensors are considered to be hardware devices and treated as black boxes with a known input-output behavior and an API that makes this data available for processing.
• The sensor input is some physical quantity in the
environment and the output is data representing a
measurement of this quantity.
To give you an initial idea of what a sensor is, here are some examples (a small code sketch follows them):
• A temperature sensor has heat as input and
outputs a temperature in units of Kelvin, Celsius or
Fahrenheit.
• A camera sensor has light as input and outputs a (two-dimensional) array of spatial light intensities.
• A joint position sensor has the relative rotation of
two bodies as input and outputs their angular
rotation in increments, degrees or radians.
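The black-box view described above can be sketched as a small class whose read() method turns a physical quantity into data. The class, method names and scale factor below are purely illustrative, not a real driver API.

```python
from dataclasses import dataclass

# Black-box view of a sensor: a physical quantity goes in, data comes out through
# a small API. The names and the scale factor are illustrative only.

@dataclass
class TemperatureReading:
    celsius: float

class TemperatureSensor:
    def read(self) -> TemperatureReading:
        raw_counts = self._read_adc()                          # hardware access hidden in the black box
        return TemperatureReading(celsius=raw_counts * 0.01)   # assumed ADC scale factor

    def _read_adc(self) -> int:
        return 2500  # stub value standing in for the real analog-to-digital converter
```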

Sensing
• The purpose of the sensing part is to give meaning
to the raw data received from the sensors.
• As usual in software development, defining good abstractions and good interfaces between different subsystems is crucial.
• The input to sensing is raw sensor data and the
output is meaningful information that we can use
to plan towards achieving the robot’s objectives.

• Example: locating a mobile robot in a known map based on data from a laser scanner is part of sensing (a small sketch of a sensing step follows below).
• Finding a path to a target location from the current position requires having the target location as an objective and is, thus, part of planning.
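As a minimal sketch of a sensing step, the function below turns a raw laser scan (a list of range readings) into one piece of meaningful information: the distance and bearing of the nearest obstacle. The scan layout (evenly spaced beams over a full circle) and all names are assumptions made for illustration.

```python
import math

# Sketch of a sensing step: raw laser ranges in, meaningful information out.

def nearest_obstacle(ranges_m):
    """Return (distance_m, bearing_rad) of the closest valid range reading."""
    num_beams = len(ranges_m)
    best_index = min(
        (i for i, r in enumerate(ranges_m) if r > 0.0),  # skip invalid (zero) readings
        key=lambda i: ranges_m[i],
    )
    bearing_rad = 2.0 * math.pi * best_index / num_beams
    return ranges_m[best_index], bearing_rad

# Example: a four-beam scan where the closest obstacle is behind the robot.
distance, bearing = nearest_obstacle([2.5, 3.1, 0.8, 4.0])
```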

Planning
• Given an objective and information about the state
of the robot and its environment, the planning part
calculates the robot’s next action(s) towards
achieving the objective.
• Planning is the part of robot software that is most isolated from the real world, given its position between sensing and acting, which abstract away interfacing with the hardware and performing the physical interaction.
• Although planning is thus the closest part to pure
software in a robotics system, the actual processing
is still very much specific to robotics.
• While many of the basic algorithms and data structures used in robot planning come from software in general, what is specific to robotics is the relationship between information processing and events in the real world.
• All data going into planning refers to something in
the real world mediated by the sensing part.
• All results of planning translate to actions taken in
the real world executed via the acting part.

Industrial Manipulator - Planning
• Objective
• Move gripper attached to manipulator from
current pose into a specific target pose.
• Planning
• Calculate the positions for all manipulator joint
motors that result in the gripper being in the
target pose.
• Send the next (intermediate) motor positions to the acting part to execute the motion (a sketch of this step follows below).
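A minimal sketch of this planning step, assuming the target joint positions have already been obtained from an inverse-kinematics solution computed elsewhere; all names are illustrative, not a real robot API.

```python
# Interpolate from the current joint positions to the target joint positions and
# hand each intermediate set of positions to the acting part.

def plan_joint_motion(current, target, steps=50):
    """Return a list of intermediate joint-position vectors from current to target."""
    waypoints = []
    for k in range(1, steps + 1):
        alpha = k / steps
        waypoints.append([c + alpha * (t - c) for c, t in zip(current, target)])
    return waypoints

# Example: three joints moving from their current angles to a target pose's angles.
for joint_positions in plan_joint_motion([0.0, 0.0, 0.0], [0.5, -0.3, 1.2]):
    pass  # here each intermediate position would be sent to the acting part
```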

Quadcopter - Planning
• Objective:
• Find and land on a big yellow H marked on the
floor inside a building.
• Planning:
• Calculate a collision free path through the
immediate surrounding area to explore the
entire building.
• If a yellow H is detected, perform a landing
procedure.
• Incrementally send the next relative position to the acting part for execution (see the sketch below).
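This logic can be sketched as a tiny state machine; the commands and arguments below are illustrative assumptions, not a real flight-stack API.

```python
# Tiny state-machine sketch of the quadcopter planning step described above.

def plan_step(h_detected, h_position, next_waypoint):
    """Return the next command to hand to the acting part."""
    if h_detected:
        return ("LAND", h_position)    # start the landing procedure over the marker
    return ("MOVE", next_waypoint)     # keep exploring along a collision-free path

# Example: no marker seen yet, so keep exploring towards the next relative position.
command = plan_step(h_detected=False, h_position=None, next_waypoint=(1.0, 0.0, 0.0))
```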
Mobile Robot Fleet - Planning
• Objective
• Under the assumption of having a central
coordinator, find the optimal assignment
between a fleet of mobile robots to transport
tasks within a logistics environment that
minimizes total distance travelled and
maximizes the distance between all robots
while moving.

• Planning
• Discretize the map of the environment and
the current position of all robots into a graph
and utilize graph algorithms to find a
solution. Transform this solution back into
paths on the actual map and send each
robot’s path to the acting part. Monitor all robots following their intended path and calculate new paths in case of deviations (the assignment step is sketched below).
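As a minimal sketch of the assignment step only, assume the graph distances between every robot and every transport task have already been computed on the discretized map; scipy's linear_sum_assignment (the Hungarian method) then picks the assignment with minimal total distance. The numbers are illustrative and the robot-spacing criterion is omitted.

```python
import numpy as np
from scipy.optimize import linear_sum_assignment

# Rows are robots, columns are transport tasks, entries are graph distances
# on the discretized map (illustrative values).
distance = np.array([
    [4.0, 9.0, 6.0],
    [7.0, 3.0, 8.0],
    [5.0, 6.0, 2.0],
])

robots, tasks = linear_sum_assignment(distance)  # minimizes the total travelled distance
for r, t in zip(robots, tasks):
    print(f"robot {r} -> task {t}, distance {distance[r, t]}")
```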

Acting
• Once an action has been decided in the planning
part, it must be carried out.
• The acting part takes care of transforming planned actions into performed actions.
• We also need to define a boundary between planning and acting. We draw this boundary along the line of global action versus local action, which can also be stated as action with context versus action without context.
• Global action or action with context belongs to the
planning part, as it has all the information available
about the system state and objectives.
• Local action or action without context is, hence,
where the acting part begins.

Planning-Acting
• Consider a robot with one wheel on the left side and one wheel on the right side, each driven by its own motor, a so-called differential drive system.
• Assuming both wheels touch the ground, the
resulting motion of the robot will depend on the
motion of both wheels.
• Without going into too much detail here, if only
one wheel rotates, the robot would rotate around
the other one.
• If both wheels rotate at the same velocity, the
robot would move in a straight line.
• Other combinations result in a combination of forward motion and rotation: the robot makes a turn.
• A good interface from planning to acting in this example would be to command a linear and an angular velocity, a so-called twist.
• Acting is then responsible for calculating the velocity of each wheel and controlling each wheel’s motor at this velocity (see the sketch below).
• Whether the resulting motion actually moves the robot in the direction intended by the planning part depends on many external factors, such as the surface the robot is driving on, the payload carried and many others.
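A minimal sketch of this twist interface, using the standard differential-drive kinematics and assuming a particular wheel radius and track width; all names are illustrative.

```python
# From a commanded twist (linear velocity v, angular velocity w) to the angular
# velocity of each wheel of a differential drive robot.

WHEEL_RADIUS_M = 0.05  # assumed wheel radius
TRACK_WIDTH_M = 0.30   # assumed distance between the left and right wheels

def twist_to_wheel_speeds(v_mps, w_radps):
    """Return (left, right) wheel angular velocities in rad/s for a commanded twist."""
    v_left = v_mps - w_radps * TRACK_WIDTH_M / 2.0   # linear speed of the left wheel
    v_right = v_mps + w_radps * TRACK_WIDTH_M / 2.0  # linear speed of the right wheel
    return v_left / WHEEL_RADIUS_M, v_right / WHEEL_RADIUS_M

# Example: drive forward at 0.2 m/s while turning left at 0.5 rad/s.
left_radps, right_radps = twist_to_wheel_speeds(0.2, 0.5)
```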
• We always leave it to the planning part to make
sure that executed actions work towards the
robot’s objective.
• The acting part only ensures that commanded
actions are properly transformed to actuator
control commands, which are executed via the
actuator hardware.



Actuators
• In many ways, actuators are the inverse of sensors.
• While sensors convert physical quantities to data,
actuators do the opposite and convert data into
physical quantities.
• For example, a temperature sensor measures heat
and provides it as data. A heating element is an
actuator that converts data into heat.
• To give a more robotics-relevant example, a position sensor converts rotation to angular data and a (properly controlled) motor converts angular data to rotation (see the sketch below).
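This inverse relationship can be sketched as a small actuator class that accepts a target angle as data and hides the hardware behind it; the names and the conversion factor are illustrative, not a real motor-driver API.

```python
# Sketch of an actuator as the inverse of a sensor: data (a target angle) goes in,
# a physical effect (motor rotation) comes out.

class JointMotor:
    def command_angle(self, angle_rad: float) -> None:
        counts = int(angle_rad * 1000.0)  # assumed radians-to-encoder-counts conversion
        self._write_to_driver(counts)     # hardware access hidden inside the black box

    def _write_to_driver(self, counts: int) -> None:
        pass  # stub standing in for the real motor electronics
```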

• The actuators purposefully alter the environment
and robot state to bring our robot one step closer
to achieving its objectives.
• This preliminary end point is not just the beginning of the next loop through the robot system; it is also only the beginning of your journey into robotics.

Robot software development lifecycle



Summary
• The six conceptual building blocks of robot systems
are: Environment, Sensors, Sensing, Planning, Acting
and Actuators.
• Sensors and actuators of a robot system are its inputs
from and outputs to the environment.
• Sensing transforms raw sensor data to meaningful
information that is passed to planning.
• Planning derives actions that bring the robot one step
closer to achieving its task from the current robot and
environment state.
• Acting transforms commands from planning into
physical actions that alter the actual environment and
robot state.
• Robot software development is an iterative process
starting with a clear definition of the robot’s
objective and the environment in which this
objective is to be achieved.
