
I. Introduction

Advanced Robotics
General concept on mobile robotics

David FOLIO
INSA Centre Val de Loire, 5A MRI & M2 2EA
2019 – 2020


Definitions: What is Robotics?

Robotics is an interdisciplinary branch of engineering and science that includes mechanical engineering, electronic engineering, information engineering, computer science, and so on. It deals with the design, construction, operation, and use of robots, as well as with the computer systems for their control, sensory feedback, and information processing.

- Many types and definitions of robots:
  ∘ different environments and uses;
  ∘ various application fields: industry, agriculture, medical, domestic, military, space, etc.
- Basic robotic primitives:
  ∘ Sense: use of sensors to perceive the environment;
  ∘ Plan: interpret, resolve, control, etc.;
  ∘ Act: interaction with the environment.

Figure 1: Autonomous Robots (software: cognition between sensing and action; hardware: sensors and actuators; real world: environment)

Automation vs. Robotics

- Automation: "performed with minimal human assistance", e.g. manufacturing processes, automatic machines, CNC, computer programs, etc.
- Autonomous robots need:
  ∘ a sensory system to check their status and perceive the surrounding environment;
  ∘ tools, actuators, effectors to interact with the environment;
  ∘ a degree of autonomy (i.e. AI): programmable, adaptable, etc.

Figure 2: What is a robot? (smart machines, computer devices, automatic cars, driving assistance, sensors)

"I can't define a robot, but I know one when I see one." – a father of robotics: Joseph Engelberger (1925–2015)

Classification by domains/applications

- Industrial robots, e.g. manipulators (e.g. manipulation);
- Service robots: hard/painful/dangerous tasks:
  ∘ security, military, delivery, transport, cleaning…
- Assistance robots:
  ∘ e.g. elderly or handicapped people…
- Medical robots:
  ∘ surgery (e.g. MIS), bio-micromanipulation…
- Field robotics, etc.
- Exploring robots:
  ∘ hazardous or confined environments, sea and space exploration…

Figure 3: Examples of applications

Classification by types

- Stationary robots;
- Mobile robotics (i.e. by locomotion type):
  ∘ wheeled robots;
  ∘ walking robots;
  ∘ aerial robots (e.g. drones);
  ∘ underwater robots, etc.
- Multiscale robotics: macro-, meso-, micro-, nano-robotics…

Figure 4: Types of robots


Brief history of robots

Half a century of research on autonomous robots:
- Unimate (1961): 1st digitally operated and programmable industrial robot;
- Hilare (LAAS, 1977): 1st French mobile robot;
- then, e.g., Asimo (1993), Pepper (2014), Waymo (2016);
- from telemanipulators to co-worker/companion robots, compliant robots, cobotics, and small-scale robotics (since about 2010).

Traditional industrial robots: repetitive, high-frequency, high-precision tasks…
∘ e.g. pre-computed motions, pre-planned paths…
∘ few human-robot interactions;
∘ new trend: cobotics.

Towards AI Robots…

GOAL: Autonomous Intelligent Robotic System

Autonomy:
- "Autonomous robots operate independently without support of a human operator" – Murphy (2001)
- Autonomous robots act in real-world environments for some longer time without external control – Bekey (2005)

Intelligence:
- Intelligent robots are machines that perceive, think and act – Bekey (2005)
- A rational agent acts to maximize its performance measure given the evidence provided by a perception sequence and built-in knowledge – Russell, Norvig (2003)

Some key issues

Mobile robotics:
- Numerous systems for a wide range of applications;
  ∘ issue: complex tasks remain challenging;
- Many problems in various scientific fields are still unsolved: computer vision, acquisition, learning, manipulation, grasping, navigation, mapping, path planning, SLAM, localization, knowledge representation, reasoning, object recognition, human interaction, decision making, dependability, and many more…

Important economic issues and opportunities:
∘ assistance/service robots, new health/biomedical applications, etc.;
∘ outdoor robots (e.g. military, urban, aerial, etc.),
  ↪ esp. autonomous (self-driving) cars;
∘ environmental monitoring, response after disasters (e.g. Fukushima), etc.

Robot example: Atlas

Atlas, by Boston Dynamics <https://www.bostondynamics.com/atlas>
- robot developed for the DARPA Robotics Challenge (DRC);
- advanced humanoid robot:
  ∘ joints: 28, weight: 75 kg, height: 1.5 m;
- remarkable capabilities:
  ∘ walking, running, climbing stairs, obstacle avoidance…
- no high-level control.


Robot example: Europa

Europa: European Robotic Pedestrian Assistant
- EU project, FP7/2007-2013;
- service robot;
- remarkable capabilities:
  ∘ navigation in densely populated urban environments,
  ∘ world modeling (SLAM),
  ∘ advanced object recognition (sidewalks, cars, pedestrians…).

Robot example: Self-driving car

Google self-driving car (known as Waymo since 2016)
- autonomous electric car;
- numerous sensors: camera, high-precision 3D laser, IMU, GPS…
- remarkable capabilities:
  ∘ the cars are able to drive autonomously in urban environments,
  ∘ very detailed maps,
  ∘ locations of traffic lights, signs, pedestrians…
  ∘ remission maps of road markings, etc.

Road-map to Autonomous Intelligent Robotics

The design of advanced robots involves numerous functionalities:
- Understanding and modeling the system:
  ∘ kinematics, dynamics, motion…
  ∘ reliable feedback,
  ∘ task planning, etc.
- Integration of sensors, actuators, power…
- Perception and interaction with the environment:
  ∘ knowledge representation,
  ∘ understanding the "robot environment".
- Decision making:
  ∘ coping with noise and uncertainty,
  ∘ intuitive human-robot interfaces (e.g. "natural commands" for inhabitants).
- Creation of flexible/robust control policies:
  ∘ control has to deal with new situations,
  ∘ learning capabilities,
  ∘ planning and problem solving.

(Figure: overview of the involved research topics and an example software architecture: mission and navigation planner, world model, low-level controller, communication and human-interaction modules, sensors and actuators.)

Robotic paradigm

Paradigm: a philosophical and theoretical framework of a scientific school or discipline within which theories, laws, and generalizations, and the experiments performed in support of them, are formulated.

How to organize the functions/intelligence?
↪ Robot paradigms are used for this organization.

Three major paradigms for intelligent robots (a minimal control-loop sketch follows this list):
1. Reactive paradigm:
   ∘ direct coupling of sensors and actuators;
   ∘ compare to a reflex (i.e. sensorimotor).
2. Hierarchical paradigm:
   ∘ uses a planning and reasoning component;
   ∘ abstract description of goals, tasks, capabilities…
3. Hybrid (deliberative/reactive) paradigm:
   ∘ synthesis of the reactive and deliberative concepts;
   ∘ able to maintain different abstraction levels;
   ∘ able to maintain different temporal granularities.

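To make the hybrid paradigm concrete, here is a minimal Python sketch (my addition, not from the slides): a deliberative planner recomputes a heading at a low rate, while a reactive layer couples the range sensor directly to the actuators at every cycle. All function names and numeric values are illustrative assumptions.

```python
# Minimal sketch of the hybrid (deliberative/reactive) paradigm.
# All names and values are illustrative; a real robot would wrap hardware drivers here.
import math
import random

def sense():
    """Fake exteroceptive reading: distance to the nearest obstacle [m]."""
    return random.uniform(0.1, 3.0)

def plan(goal, pose):
    """Deliberative layer: recompute a heading toward the goal (slow rate)."""
    return math.atan2(goal[1] - pose[1], goal[0] - pose[0])

def reactive_layer(obstacle_dist, heading_error):
    """Reactive layer: direct sensor-to-actuator coupling (reflex-like)."""
    if obstacle_dist < 0.5:            # too close: stop and turn away
        return 0.0, 1.0                # (v, omega)
    return 0.5, 0.8 * heading_error    # otherwise follow the planned heading

pose, goal = [0.0, 0.0, 0.0], (5.0, 3.0)
heading, dt = 0.0, 0.1
for step in range(100):
    d = sense()                                   # SENSE every cycle
    if step % 10 == 0:                            # PLAN at a lower rate
        heading = plan(goal, pose)
    v, omega = reactive_layer(d, heading - pose[2])   # ACT (reactive)
    pose[0] += v * math.cos(pose[2]) * dt         # crude unicycle update
    pose[1] += v * math.sin(pose[2]) * dt
    pose[2] += omega * dt
```
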

II. Modeling

Informal definitions

Motion: study of the process of causing a robot to move.

Kinematics: study of the motion of rigid bodies that are connected by joints.
- Deals with the geometric relationships that govern the robotic system;
- Deals with the relationship between control parameters and the behavior of a system in state-space;
- Does not consider the forces that affect the motion.

Dynamics: study of motion in which the forces are modeled;
- Includes the energies and speeds associated with these motions.

Informal definitions (from robotic manipulation)

Workspace: space (Ω) in which the robot can move or reach.
- To Ω is related a reference frame ℱ₀:
  ∘ 2D: Ω ⊂ ℝ², and ℱ₀ : {O, x, y};
  ∘ 3D: Ω ⊂ ℝ³, and ℱ₀ : {O, x, y, z}; 4D, etc.

Task space (or Cartesian space): space (ℳ) where the robot postures are expressed.
- in 2D: ℳ = SE(2) ≡ ℝ² × 𝕊¹
  (𝕊¹ is SO(2), i.e. a circle: 𝕊¹ = {(x, y) ∈ ℝ² | x² + y² = 1});
- in 3D: ℳ = SE(3) ≡ ℝ³ × SO(3)
  (SO(3) is the 3D rotation group).

Robot posture vector: ξ ∈ ℳ, corresponding to the robot position and orientation.
- For wheeled mobile robots, often: 2D: ξ = (x, y, θ); 3D: ξ = (x, y, z, θ, ϕ, ψ), etc.

Configuration space (or C-space): space (𝒞) describing uniquely the state of a robot.
- Cartesian product of the state-space of each joint (i.e. moving part);
- Initially used for robotic arms, but also useful for mobile-robot planning.

Robot configuration vector: q ∈ 𝒞 (i.e. generalized coordinates), a vector of independent parameters uniquely specifying the robot's state.

Examples                                               Ω     C-space (𝒞)
Differential drive (i.e. unicycle)                     ℝ²    SE(2)
Car (1 steering angle)                                 ℝ²    SE(2) × SO(1)
Space rover (i.e. rough terrain; 6 steering angles)    ℝ²    SE(2) × SO(6)
Tractor-trailer (steering + trailer angles)            ℝ²    SE(2) × SO(2)
Airplane, submarine, satellite, etc.                   ℝ³    SE(3)
Legged robot (e.g. HRP2: 30 joints)                    ℝ²    SE(2) × SO(30)

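As a small illustration (my addition, not from the slides), a planar posture ξ = (x, y, θ) ∈ SE(2) can be represented as a 3×3 homogeneous transform, which makes composing poses and changing frames a matrix product:

```python
# Minimal sketch: an SE(2) posture (x, y, theta) as a homogeneous transform.
import numpy as np

def se2(x, y, theta):
    """Homogeneous transform associated with a planar posture (x, y, theta)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, x],
                     [s,  c, y],
                     [0,  0, 1]])

def to_posture(T):
    """Recover (x, y, theta) from an SE(2) homogeneous transform."""
    return T[0, 2], T[1, 2], np.arctan2(T[1, 0], T[0, 0])

# Pose of the robot in the world frame, and of a sensor in the robot frame:
T_world_robot = se2(1.0, 2.0, np.pi / 4)
T_robot_sensor = se2(0.2, 0.0, 0.0)
# Composition gives the sensor posture expressed in the world frame:
print(to_posture(T_world_robot @ T_robot_sensor))
```
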

Informal definitions

Degrees of freedom (DoF):
- For a single joint:
  ∘ number of independent directions of motion;
  ∘ knee, elbow: 1; ankle: 2; etc.
- For a manipulator (sequence of joints):
  ∘ sum of the DoF of each joint.
- For a moving vehicle without joints:
  ∘ possible directions of motion;
  ∘ locomotive: 1, unicycle: 3, helicopter: 6…
- For a moving vehicle with joints: sum them all;
  ∘ car: 4, car + trailer: 4, space rover: 9+, etc.

The maneuverability of a mobile robot is the combination of the mobility available based on the sliding constraints, plus the additional freedom contributed by the steering.

Holonomic system: the constraints can be integrated.
- The pose of the robot is not constrained by its velocity;
- The robot can instantaneously move in any direction of the workspace (2D/3D);
- Such robots are omnidirectional:
  ∘ the constraint acts on the robot velocity;
  ∘ the velocity v(t) can be integrated if there exists a trajectory s(t) that depends only on the posture, i.e. s(ξ);
  ↪ equivalently: ds = (∂s/∂x) dx + (∂s/∂y) dy + (∂s/∂θ) dθ.

Non-holonomic system: the constraints cannot be integrated (see the sketch after this list).
- The constraint acts on the robot velocity: v(t) is NOT integrable;
- The posture is constrained by the velocity (e.g. not reachable in all directions);
- There is no trajectory s(t) that depends only on the posture;
- There is a constraint on the velocity directions that the robot can reach;
  ∘ the robot cannot instantaneously move in every direction of the workspace.

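As an illustration (my addition, assuming the unicycle model), the non-holonomic "no side slip" constraint ẋ sin θ − ẏ cos θ = 0 can be checked numerically: forward motion satisfies it, a purely lateral motion does not.

```python
# Sketch of a non-holonomic Pfaffian constraint A(q) q_dot = 0 for a unicycle:
# no side slip means x_dot*sin(theta) - y_dot*cos(theta) = 0.
import numpy as np

def side_slip(q, q_dot):
    """Returns 0 for admissible unicycle velocities, nonzero otherwise."""
    _, _, theta = q
    xd, yd, _ = q_dot
    return xd * np.sin(theta) - yd * np.cos(theta)

q = np.array([0.0, 0.0, np.pi / 3])
forward = np.array([np.cos(q[2]), np.sin(q[2]), 0.0])    # driving straight ahead
sideways = np.array([-np.sin(q[2]), np.cos(q[2]), 0.0])  # pure lateral motion

print(side_slip(q, forward))   # ≈ 0: allowed by the constraint
print(side_slip(q, sideways))  # ≈ -1: forbidden, the wheel would have to slip
```
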
Robot Kinematics

Robot manipulator:
- fixed to the environment (i.e. fixed base).

Mobile robot:
- not fixed to its environment;
- possesses locomotion:
  ∘ the ability to move from one place to another;
  ∘ it depends on the environment (e.g. ground, air, water, etc.);
  ∘ it is hard to imitate nature…

Kinematics objective: description of the mechanical behavior of the robot for design and control.
- Similar to robot-manipulator kinematics;
- However, mobile robots move unbound wrt. their environment:
  ∘ no direct (i.e. instantaneous) way to measure the position,
  ∘ the position must be integrated over time,
  ∘ which leads to inaccuracies in the position (motion) estimate
  ↪ one of the main challenges in mobile robotics.

Kinematic Model

Robot speed ξ̇ as a function of the inputs u (e.g. wheel speed and steering):
- Forward kinematics: ξ̇ = f(q, u)
- Inverse kinematics: u = g(q, ξ̇)
↪ required for motion control and motion planning (a minimal sketch of these two mappings for the unicycle is given below).

Non-holonomic robots:
- the differential equations are not integrable to the final position;
- the measure of the traveled distance is not sufficient to calculate the final position;
- the temporal evolution of the motion must also be known;
- this is in stark contrast to manipulator arms.

Mobile-robot non-holonomic constraints:
∘ in mobile robotics, differential (inverse) kinematics is used;
∘ i.e. a transformation between velocities instead of positions.

To understand the mobile-robot motion (kinematics), the constraints imposed by the locomotion system (e.g. the wheels) need to be analyzed.

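A minimal sketch (my addition, assuming the unicycle model with inputs u = (v, ω) and posture ξ = (x, y, θ)) of the forward mapping ξ̇ = f(q, u) and of one possible inverse g(q, ξ̇):

```python
# Sketch of forward / inverse kinematics for a unicycle-type robot.
# Assumption: inputs u = (v, omega); posture xi = (x, y, theta).
import numpy as np

def forward_kinematics(q, u):
    """xi_dot = f(q, u): map body inputs to a world-frame velocity."""
    theta = q[2]
    v, omega = u
    return np.array([v * np.cos(theta), v * np.sin(theta), omega])

def inverse_kinematics(q, xi_dot):
    """u = g(q, xi_dot): recover (v, omega) from an admissible xi_dot."""
    theta = q[2]
    v = xi_dot[0] * np.cos(theta) + xi_dot[1] * np.sin(theta)  # project on heading
    omega = xi_dot[2]
    return np.array([v, omega])

q = np.array([0.0, 0.0, 0.5])
u = np.array([1.0, 0.2])
print(inverse_kinematics(q, forward_kinematics(q, u)))  # recovers [1.0, 0.2]
```
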

Kinematics: the differential-drive case

Representing the robot within an arbitrary initial frame:
- In 2D, i.e. C-space: 𝒞 = SE(2);
- Reference frame: ℱ₀ : {O, x₀, y₀};
- Robot frame: ℱ_R : {P, x_R, y_R};
- Robot posture: ξ₀ = (x, y, θ)ᵗ.

Mapping between the two frames:
- ξ̇_R = R(θ) ξ̇₀, with R an orthogonal rotation matrix:

  R(θ) = ⎛  cos θ   sin θ   0 ⎞
         ⎜ −sin θ   cos θ   0 ⎟
         ⎝    0       0     1 ⎠

Forward kinematics:
- What are the inputs (i.e. the control variables)? They relate the motors (wheels) to the Cartesian space (position and orientation):
  ∘ φ̇_r(t) and φ̇_l(t): right and left wheel angular velocities;
  ∘ v_r(t) and v_l(t): right and left wheel linear velocities;
  ∘ r: radius of each wheel; l: distance between each wheel and P.
- Robot velocities (in ℱ_R):

  ω(t) = (1/2l) (v_r − v_l) = (r/2l) (φ̇_r − φ̇_l)
  v(t) = (1/2) (v_r + v_l) = (r/2) (φ̇_r + φ̇_l)

- Motion control (using a geometric approach):
  ∘ (posture) kinematic model: ξ̇₀ = C_ξ(q) u

    ⎛ ẋ(t) ⎞   ⎛ cos θ   0 ⎞
    ⎜ ẏ(t) ⎟ = ⎜ sin θ   0 ⎟ ( v(t) )
    ⎝ θ̇(t) ⎠   ⎝   0     1 ⎠ ( ω(t) )

  ∘ (configuration) kinematic model: q̇ = C_q(q) u

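A small numerical sketch (my addition; r, l and the wheel-speed profile are arbitrary example values) of the differential-drive forward kinematics above, integrating ξ̇₀ = C_ξ(q)u with a simple Euler scheme:

```python
# Differential-drive forward kinematics: wheel rates -> (v, omega) -> posture.
# r, l, and the wheel-speed profile below are arbitrary illustration values.
import numpy as np

r, l = 0.05, 0.20                 # wheel radius [m], half-axle length [m]
dt = 0.01                         # integration step [s]
xi = np.array([0.0, 0.0, 0.0])    # posture (x, y, theta) in the world frame

for k in range(1000):
    phi_dot_r, phi_dot_l = 10.0, 8.0                   # wheel angular velocities [rad/s]
    v = (r / 2.0) * (phi_dot_r + phi_dot_l)            # forward speed
    omega = (r / (2.0 * l)) * (phi_dot_r - phi_dot_l)  # yaw rate
    xi_dot = np.array([v * np.cos(xi[2]),              # posture kinematic model
                       v * np.sin(xi[2]),
                       omega])
    xi = xi + xi_dot * dt                              # Euler integration (odometry)

print(xi)   # estimated posture after 10 s
```
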
Wheeled Mobile Robots

Wheels are the most appropriate basic solution for most applications:
- energetically efficient, good balance;
- simple mechanical implementation and easy to control.

Basic wheel types:
a) Standard wheel (2 DoF): rotation around the (actuated) wheel axis and around the contact point (if steered);
b) Castor wheel (3 DoF): rotation around the wheel axis, the castor axis, and the contact point;
c) Swedish wheel (3 DoF): rotation around the (actuated) wheel axis, the roller axes and the contact point;
d) Ball or spherical wheel: suspension technically not solved.

Different arrangements of wheels (2, 3, 4 wheels, …):
- 3 wheels are sufficient to guarantee stability;
- with more than 3 wheels stability is improved, but an appropriate suspension is required (e.g. hyperstatic system);
- bigger wheels allow overcoming higher obstacles, but require more torque;
- most arrangements are non-holonomic, which implies control issues;
- combining actuation and steering on one wheel makes the design complex and adds additional odometry errors;
- the selection of wheels depends on the application.


Wheel kinematic constraints

Idealized rolling wheel:
- If the wheel is free to rotate about its axis (x-axis), the robot exhibits preferential rolling motion in one direction (y-axis) and a certain amount of lateral slip;
- For low velocities, pure rolling is a reasonable wheel model.
- Basic simplifications:
  ∘ the wheel stays in a vertical plane;
  ∘ the wheel is connected to a rigid frame (chassis);
  ∘ single point of contact between wheel and ground;
  ∘ no friction for rotation around the contact point;
  ∘ no slipping, skidding nor sliding;
  ∘ not deformable: constant shape, etc.

A typical wheel cannot achieve any motion in the C-space:
- Rolling without slipping: v_C = V_A + φ̇ ∧ AC = 0;
- Pure rolling: v_C = 0;
- Velocity of the wheel: v = (v_x = 0, v_y = r φ̇, v_z = 0)ᵗ.

Fixed standard wheel case (see the sketch after this list):
- no vertical axis of rotation → no steering;
- the wheel/chassis angle β is fixed;
- Rolling constraint:
  ( sin(α + β)   −cos(α + β)   −l cos β ) R(θ) ξ̇₀ − r φ̇ = 0
- No-sliding constraint:
  ( cos(α + β)   sin(α + β)   l sin β ) R(θ) ξ̇₀ = 0

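A small helper (my addition, directly following the two formulas above; α, β, l, r are the wheel-placement parameters defined in the slides) that evaluates the rolling and no-sliding constraint rows of a fixed standard wheel and checks them on a pure forward motion:

```python
# Constraint rows of a fixed standard wheel, as functions of its placement
# (alpha, beta, l) on the chassis and its radius r. Sketch for illustration only.
import numpy as np

def rotation(theta):
    """R(theta): maps world-frame velocities into the robot frame."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, s, 0.0], [-s, c, 0.0], [0.0, 0.0, 1.0]])

def fixed_wheel_rows(alpha, beta, l):
    rolling = np.array([np.sin(alpha + beta), -np.cos(alpha + beta), -l * np.cos(beta)])
    no_slide = np.array([np.cos(alpha + beta), np.sin(alpha + beta), l * np.sin(beta)])
    return rolling, no_slide

# Check: pure forward motion of a differential-drive left wheel
# (alpha = -pi/2, beta = pi, as in the slides) satisfies both constraints
# when the wheel spins at phi_dot = v / r.
alpha, beta, l, r = -np.pi / 2, np.pi, 0.2, 0.05
theta, v = 0.3, 1.0
xi0_dot = np.array([v * np.cos(theta), v * np.sin(theta), 0.0])
rolling, no_slide = fixed_wheel_rows(alpha, beta, l)
phi_dot = v / r
print(rolling @ rotation(theta) @ xi0_dot - r * phi_dot)  # ≈ 0 (rolling)
print(no_slide @ rotation(theta) @ xi0_dot)               # ≈ 0 (no sliding)
```
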
Steered wheel case:
- Steering: the wheel/chassis angle β(t) is actuated;
- Velocity of the wheel: v = (v_x = 0, v_y = r φ̇, v_z = 0)ᵗ;
- Rolling constraint:
  ( sin(α + β(t))   −cos(α + β(t))   −l cos β(t) ) R(θ) ξ̇₀ − r φ̇ = 0
- No-sliding constraint:
  ( cos(α + β(t))   sin(α + β(t))   l sin β(t) ) R(θ) ξ̇₀ = 0

Castor wheel case:
- off-centered orientable wheel, e.g. free (passive) rotations φ̇ and β̇;
- omnidirectional wheel;
- Rolling constraint:
  ( sin(α + β)   −cos(α + β)   −l cos β ) R(θ) ξ̇₀ − r φ̇ = 0
- No-sliding constraint:
  ( cos(α + β)   sin(α + β)   d + l sin β ) R(θ) ξ̇₀ + d β̇ = 0


Swedish wheel case:
- omnidirectional wheel;
- wheel rotation φ̇ (active);
- rotation of the small rollers φ̇_sw (passive), with r_sw the radius of the small rollers;
- Rolling constraint:
  ( sin(α + β + γ)   −cos(α + β + γ)   −l cos(β + γ) ) R(θ) ξ̇₀ − r φ̇ cos γ = 0
- No-sliding constraint:
  ( cos(α + β + γ)   sin(α + β + γ)   l sin(β + γ) ) R(θ) ξ̇₀ − r φ̇ sin γ − r_sw φ̇_sw = 0

Robot kinematic constraints

Given a robot with m wheels:
- each wheel imposes zero or more constraints on the robot motion;
  ∘ only fixed and steerable standard wheels impose constraints;
  ∘ castor, Swedish and spherical wheels impose no kinematic constraint on the robot chassis: ξ̇₀ can change freely.

What is the maneuverability of a robot with a combination of different wheels?
- Combine the constraints that arise from all the wheels, based on their placement on the robot chassis.

Example
Consider a robot with a total of m = m_f + m_s (fixed + steerable) standard wheels, with φ(t) = (φ_f, φ_s)ᵗ. The kinematic constraints in matrix form:

- Rolling:  J₁(β_s) R(θ) ξ̇₀ − J₂ φ̇ = 0,  with  J₁(β_s) = ( J₁f ; J₁s(β_s) )  and  J₂ = diag(r₁, …, r_n);
- Sliding:  C₁(β_s) R(θ) ξ̇₀ = 0,  with  C₁(β_s) = ( C₁f ; C₁s(β_s) ).

Kinematics: the differential-drive case (wheel constraints)

Wheel placement (cf. the fixed-wheel constraints above):
- left wheel: α_l = −π/2, β_l = π;
- right wheel: α_r = π/2, β_r = 0.

Stacking the rolling and no-sliding constraints of the two wheels (in the robot frame):

  ⎛ 1  0   l ⎞          ⎛ r_l   0  ⎞
  ⎜ 1  0  −l ⎟  ξ̇_R  =  ⎜  0   r_r ⎟ ( φ̇_l )
  ⎜ 0  1   0 ⎟          ⎜  0    0  ⎟ ( φ̇_r )
  ⎝ 0  1   0 ⎠          ⎝  0    0  ⎠

  i.e.  A ξ̇_R = B φ̇

Forward kinematics, ξ̇ = f(φ̇):
- ξ̇_R = A⁺ B φ̇,  then  ξ̇₀ = R⁺(θ) ξ̇_R  (with ⁺ the pseudo-inverse).

Inverse kinematics, φ̇ = g(ξ̇):
- φ̇ = B⁺ A ξ̇_R.

A numerical sketch of these pseudo-inverse computations is given below.

Towards real mobile robots

In the real world:
- there is no planar workspace: rough terrain, obstacles, etc.;
- wheels are deformable (e.g. deflated, worn…): the radius r is not constant;
- wheels can slip, skid and slide: the wheel/ground contact needs to be modeled;
- at high speed, dynamics becomes important!

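A brief numerical check (my addition; the values of r and l are arbitrary) of the pseudo-inverse solutions ξ̇_R = A⁺ B φ̇ and φ̇ = B⁺ A ξ̇_R for the differential drive:

```python
# Differential drive: solve A xi_dot_R = B phi_dot with the pseudo-inverse.
# r (wheel radius) and l (half-axle length) are example values.
import numpy as np

r, l = 0.05, 0.20
A = np.array([[1.0, 0.0,  l],
              [1.0, 0.0, -l],
              [0.0, 1.0, 0.0],
              [0.0, 1.0, 0.0]])
B = np.array([[r, 0.0],
              [0.0, r],
              [0.0, 0.0],
              [0.0, 0.0]])

phi_dot = np.array([10.0, 8.0])              # (phi_dot_l, phi_dot_r) [rad/s]
xi_dot_R = np.linalg.pinv(A) @ B @ phi_dot   # forward kinematics in the robot frame
phi_back = np.linalg.pinv(B) @ A @ xi_dot_R  # inverse kinematics: recover wheel rates
print(xi_dot_R)   # [0.45, 0.0, 0.25]: forward speed, lateral speed (0), yaw rate
print(phi_back)   # [10.0, 8.0]
```
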

Introduction to dynamic modeling

Euler-Newton formulation:

  M(q) q̈ + V(q, q̇) q̇ + F(q̇) + G(q) + τ_d = B(q) τ − Λᵗ(q) λ

- M(q): inertia matrix;
- V(q, q̇): centripetal and Coriolis matrix;
- F(q̇): surface friction matrix;
- G(q): gravitational vector;
- τ_d: vector of bounded unknown disturbances;
- B(q): input matrix, with τ the input vector;
- Λ(q): matrix associated with the kinematic constraints, with λ the constraint multipliers.

Lagrange formalism:

  d/dt (∂ℒ/∂q̇)ᵗ − (∂ℒ/∂q)ᵗ = F − Λᵗ(q) λ

- ℒ = T − U is the Lagrangian function, with
  ∘ T: the kinetic energy of the system,
  ∘ U: the potential energy of the system;
- F: generalized force vector.

Introduction to legged locomotion

Why legged robots?
- They try to imitate nature;
- They can cross obstacles;
- Energetically inefficient: they consume energy even in the absence of movement;
- They require overcoming a lot of control issues… (example platform: StarlETH)

With fewer legs: less stable, require fast control; e.g. 2 legs for humanoid robots…
With more legs: more stable, require more coordination; e.g. 4-6 legs used.

Typical leg structure (at least 3 DoF): hip abduction, hip bending, knee bending.

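As a toy illustration (my addition, not the slides' model), for a planar rigid body driven in its own frame, with simple viscous friction and no constraints or disturbances, the Newton-Euler form reduces to m·v̇ = F and I·ω̇ = τ, which can be integrated together with the posture kinematics:

```python
# Toy planar rigid-body dynamics (assumed parameter values), Euler integration:
# body frame: m * dv/dt = F_u - c * v ;  I * domega/dt = tau_u - c_w * omega.
import numpy as np

m, I = 10.0, 0.5          # mass [kg], yaw inertia [kg m^2]
c, c_w = 2.0, 0.1         # viscous friction coefficients
dt = 0.01
v, omega = 0.0, 0.0
xi = np.array([0.0, 0.0, 0.0])            # posture (x, y, theta)

for _ in range(500):
    F_u, tau_u = 5.0, 0.2                 # applied force [N] and torque [N m]
    v += dt * (F_u - c * v) / m           # dynamics: translation
    omega += dt * (tau_u - c_w * omega) / I   # dynamics: rotation
    xi += dt * np.array([v * np.cos(xi[2]),   # kinematics: posture update
                         v * np.sin(xi[2]),
                         omega])

print(v, omega, xi)
```
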
III. Sensors and Actuators

Perceive the environment

Perception is the organization, identification, and interpretation of sensory information in order to represent and understand the presented information, or the environment.

Sensory perception is the "immediate" perception that the senses provide, like direct information.

Different levels of perception:
- global/local perception;
- self-perception…

Robotic perception refers to the ability to collect, process and format information useful to the robot to act and react in the world around it.

The autonomy of a robot strongly relies on its capability to perceive its environment efficiently:
- to perceive, robots use sensors;
- the robot environment can be:
  ∘ unstructured: indoor, outdoor, road, etc.;
  ∘ static/dynamic, etc.
- e.g. several sensors → redundancy of information;
- sensor choice, data processing, knowledge representation, etc.


What is sensing?

A sensor is a device whose purpose is to detect events or changes in its environment and send the information to another system. It transforms the state of an observed physical quantity (the measurand) into a signal that is used by the system: the measure.
- Physical change (measurand) → transducer → output signal (measure);
- There exists a relationship (i.e. a functional) between the measure and the measurand.

Understanding the physical principle behind sensors enables:
- properly selecting the sensors for a given application;
- properly modeling the sensor system, e.g. resolution, bandwidth, uncertainties (a small sensor-model sketch is given below).

Different sensors

Classifying sensors by:
- type of information (analog, digital…);
- physical principle (resistive, capacitive…);
- amount of information (bandwidth);
- low and high readings (dynamic range, resolution…);
- absolute vs. derivative, digital vs. analog…
- accuracy and precision, etc.

Functional classification:
- How?
  ∘ Passive sensors: measure energy coming from the environment;
  ∘ Active sensors: emit their own energy and measure the reaction.
- What?
  ∘ Proprioceptive sensors (internal): measure the state of the system (robot), e.g. motor speed, wheel load, heading of the robot, battery status, etc.
  ∘ Exteroceptive sensors (external): information from the robot's environment, e.g. distances to objects, intensity of the ambient light, unique features, etc.

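A minimal sensor-model sketch (my addition; the range, resolution and noise values are illustrative assumptions) showing how resolution, dynamic range and measurement uncertainty can be modeled together:

```python
# Toy sensor model: true value -> additive Gaussian noise -> quantization -> clipping.
import numpy as np

rng = np.random.default_rng(0)

def measure(true_value, noise_std=0.02, resolution=0.01, lo=0.0, hi=4.0):
    """Simulate one reading of an idealized range sensor (units: meters)."""
    noisy = true_value + rng.normal(0.0, noise_std)        # measurement uncertainty
    quantized = np.round(noisy / resolution) * resolution  # finite resolution
    return float(np.clip(quantized, lo, hi))               # limited dynamic range

print([measure(1.237) for _ in range(5)])
```
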
Robotic sensing

Robotic peculiarities (e.g. embedded systems):
- data volume and processing;
- power consumption;
- dimensions/mass;
- life span, robustness, cost, etc.

Important design issues:
- What kind of sensors to use? Where to place them?
- What are the sources of error? Noise? Disturbances?
- "What are we measuring?" vs. "What do we really want to know?"

Sensors for Robotics

Example: Wifibot, by Nexter Robotics (https://www.wifibot.com)
- 2 odometers; 4 IR sensors; 1 laser; 1 GPS; 1 IMU; 1 camera.

Example: Pepper, by Softbank Robotics (https://www.softbankrobotics.com)
- 2 US sensors; 6 lasers; 3 bumpers; 2 gyros; 1 HD camera; 3 touch sensors; 4 microphones.


Rotary encoder

Converts an angular position into output signals.
- Use case: wheel/motor encoders
  ∘ measure the position or speed of the wheels or steering (proprioceptive sensors);
  ∘ integrate wheel movements to get an estimate of the position → odometry;
  ∘ typical resolutions: 64 - 2048 increments per revolution.
- Working principle:
  ∘ Incremental encoders (low cost): a regular encoder counts the number of transitions but cannot tell the direction of motion; a quadrature encoder uses two sensors in quadrature-phase shift (Figure 5: incremental encoder);
  ∘ Absolute encoders (more expensive): a (Gray) code is used to maintain position information; more complex and expensive (Figure 6: 3-bit Gray-code encoder).
- Pros: simple, high-frequency, cost effective (e.g. incremental type)…
- Cons: limited resolution, drift, wheel sliding/skidding…

Heading Sensors

Heading sensors determine the robot's orientation and inclination wrt. a given reference.
- proprioceptive types: e.g. gyroscope, accelerometer;
- exteroceptive types: e.g. compass, inclinometer.

They allow (with velocity information) integrating the movement into a position estimate. This procedure is called deduced reckoning (ship navigation).

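A small odometry sketch (my addition; the ticks-per-revolution, wheel radius and axle length are assumed example values) turning incremental encoder ticks into a pose estimate; injecting small tick errors into it makes the drift problem mentioned above easy to reproduce:

```python
# Differential-drive odometry from incremental encoder ticks (example values).
import numpy as np

TICKS_PER_REV = 1024
r, L = 0.05, 0.40            # wheel radius [m], axle length between the wheels [m]
x, y, theta = 0.0, 0.0, 0.0  # pose estimate

def update(dl_ticks, dr_ticks, x, y, theta):
    """Integrate one odometry step from left/right encoder tick increments."""
    dl = 2 * np.pi * r * dl_ticks / TICKS_PER_REV   # left wheel travel [m]
    dr = 2 * np.pi * r * dr_ticks / TICKS_PER_REV   # right wheel travel [m]
    ds = (dr + dl) / 2.0                            # chassis travel
    dtheta = (dr - dl) / L                          # heading change
    x += ds * np.cos(theta + dtheta / 2.0)          # midpoint integration
    y += ds * np.sin(theta + dtheta / 2.0)
    return x, y, theta + dtheta

for _ in range(200):                 # e.g. 200 control periods with constant ticks
    x, y, theta = update(12, 10, x, y, theta)
print(x, y, theta)
```
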
Heading Sensors: exteroceptive types

Compass: provides the direction relative to the geographic cardinal directions.
- absolute measure of the heading (e.g. wrt. the north);
- large variety of solutions: mechanical, magnetic-field measurement (e.g. Hall-effect), gyrocompass…
- Drawbacks:
  ∘ weakness of the Earth field (~30 μT);
  ∘ disturbed by magnetic objects or other sources;
  ∘ bandwidth limitations;
  ∘ not suitable in indoor environments for absolute orientation measurements.

Inclinometer: measures the angle of slope, elevation or depression of an object wrt. gravity's direction.
- common solutions: accelerometer, liquid capacitive, electrolytic, gas bubble in liquid, and pendulum;
- disturbed by inertia, temperature, vibration…

Heading Sensors: Gyroscope

Gyroscope: provides an absolute measure of the heading of a mobile system wrt. a fixed frame.
Main categories:
∘ Mechanical gyroscopes: standard gyro (angle) or rate gyro (speed); very expensive, difficult to miniaturize…
∘ Optical gyroscopes: rate gyro (speed); very expensive, difficult to miniaturize…
∘ MEMS vibrating-structure gyroscopes (Coriolis force): low cost; coarser precision (but sufficient in robotics).


Accelerometer

Accelerometers measure all external forces acting upon them, including gravity:
∘ to obtain the inertial acceleration (due to motion alone), gravity must be subtracted;
∘ conversely, the device's output will be zero during free fall!

An accelerometer acts like a spring-mass-damper system:

  f_applied = f_inertia + f_damping + f_spring = m ẍ + c ẋ + k x

∘ at steady state: a_applied = k x / m.

It measures only the linear acceleration along a single axis:
∘ omnidirectional accelerometer: 3 accelerometers along 3 orthogonal directions.

Main characteristics:
∘ bandwidth: up to 50 kHz; accelerations up to 50 g;
∘ disturbed by temperature, vibration…

Common applications:
∘ dynamic acceleration, static acceleration (inclinometer),
∘ airbag sensors (±35 g), control of video games (Wii), smartphones, etc.

Inertial Measurement Unit (IMU)

IMU (Inertial Measurement Unit): a device that uses measurement systems (e.g. gyroscopes and accelerometers) to estimate the relative position (x, y, z), orientation (roll, pitch, yaw), velocity, and acceleration of a moving object wrt. an inertial frame.

Figure 7: Basic principle. The rate gyroscope is integrated (from initial values) into the orientation; the accelerometer output is transformed from the local to the navigation frame, gravity is subtracted, and the result is integrated twice into velocity and posture.

- Initial values of velocity, position and orientation must be known;
- Integration ⇔ strongly subject to drift!
- An external reference (e.g. GPS, vision, etc.) is needed to correct drift errors.

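A planar dead-reckoning sketch (my addition; the sampling rate, bias and signal values are invented, and gravity is assumed already compensated since motion is horizontal) of the IMU principle above: integrate the gyro rate into heading, rotate the measured acceleration into the navigation frame, then integrate twice. Even a small uncorrected accelerometer bias makes the position estimate drift, which is why an external reference is needed.

```python
# Planar IMU dead reckoning (illustrative values): gyro -> heading,
# accelerometer (body frame) -> navigation frame -> velocity -> position.
import numpy as np

dt = 0.01
theta = 0.0                         # heading [rad]
vel = np.zeros(2)                   # velocity in the navigation frame [m/s]
pos = np.zeros(2)                   # position in the navigation frame [m]
acc_bias = np.array([0.05, 0.0])    # small uncorrected accelerometer bias [m/s^2]

for _ in range(1000):               # 10 s of measurements
    gyro = 0.1                      # yaw-rate measurement [rad/s]
    acc_body = np.array([0.2, 0.0]) + acc_bias   # measured specific force (body frame)
    theta += gyro * dt                            # integrate orientation
    c, s = np.cos(theta), np.sin(theta)
    acc_nav = np.array([c * acc_body[0] - s * acc_body[1],
                        s * acc_body[0] + c * acc_body[1]])  # body -> navigation
    vel += acc_nav * dt                           # integrate acceleration
    pos += vel * dt                               # integrate velocity

print(theta, vel, pos)   # the bias has already corrupted the position estimate
```
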
Range sensors

Rangefinder: measures the distance from the observer to a target, in a process called ranging.

Principle: measure the distance traveled by a sound or electromagnetic wave, basically given by

  d = c · t

- propagation speed of sound: c ≈ 0.3 m/ms;
- propagation speed of electromagnetic signals: c ≈ 0.3 m/ns.

The quality of time-of-flight (ToF) range sensors mainly depends on:
- uncertainties about the exact time of arrival of the reflected signal;
- inaccuracies in the ToF measurement (laser range sensors);
- the opening angle of the transmitted beam (US range sensors);
- interaction with the target (surface, specular reflections, etc.);
- variation of the propagation speed c;
- motion of the mobile robot/target (if not at standstill)…

Different groups: proximity sensors (sonar/US, IR, etc.), laser/LIDAR rangefinders, ToF cameras, etc.
Different methods: direct pulse measurement, wave modulation, hybrid methods…

Range sensors: Proximity Sensors

Ultrasonic sensor:
- Basic principle: emit an ultrasonic pulse wave (20 kHz to >2 MHz);
- Main characteristics:
  ∘ sensitivity to air density: c = √(γRT/M), with γ the heat capacity ratio (air: γ = 1.4), R the gas constant (8.314 J/(mol·K)), M the molar mass of the gas (air: M = 0.028 kg/mol), and T the temperature;
  ∘ the sound beam propagates in a cone of about ±20°;
  ∘ precision influenced by the angle to the object;
  ∘ d_US < d_real: proximity/obstacle detection.

Light (IR) sensors:
- Basic principle: a collimated beam (e.g. focused IR, laser, etc.) is transmitted toward the target;
- Main characteristics:
  ∘ sensitivity to ambient conditions (e.g. temperature, light), specular surfaces, reflections…
  ∘ short distance (<2 m);
  ∘ simple, compact, low cost…

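A tiny sketch (my addition) of the relation d = c·t for an echo-type ultrasonic sensor, where the pulse travels to the target and back (hence the factor 1/2), with the speed of sound corrected for temperature using c = √(γRT/M):

```python
# Ultrasonic time-of-flight ranging with temperature-corrected speed of sound.
import math

def speed_of_sound(T_kelvin, gamma=1.4, R=8.314, M=0.028):
    """c = sqrt(gamma * R * T / M), with the air constants from the slides."""
    return math.sqrt(gamma * R * T_kelvin / M)

def echo_distance(t_flight, T_kelvin=293.15):
    """Pulse-echo sensor: the wave travels out and back, hence the factor 1/2."""
    return speed_of_sound(T_kelvin) * t_flight / 2.0

print(speed_of_sound(293.15))   # ≈ 349 m/s at 20 °C with these rounded constants
print(echo_distance(0.006))     # ≈ 1.05 m for a 6 ms echo
```
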

Range sensors: Laser/LIDAR

LASER, acronym for "Light Amplification by Stimulated Emission of Radiation": a device that emits light through a process of optical amplification.

LIDAR, acronym for "LIght Detection And Ranging": a method that measures the distance to a target by illuminating the target with laser light and measuring the reflected light with a sensor.

Operating principles:
- pulsed laser (today the standard);
- phase-shift measurement.

Main characteristics:
- A mechanical mechanism with a rotating mirror sweeps the beam: 2D/3D measurements;
  ∘ limited angular range: e.g. 100°, 180°, 270° → blind spot!
  ∘ cumbersome, fragile, expensive…
- Good stability/precision, long range (up to 10 m, 100 m…).

Geolocation

Geolocation: characterization of the real-world geographic location of an object.

Satellite navigation: a system that uses satellites to provide autonomous geo-spatial positioning. A satellite navigation system with global coverage is termed a global navigation satellite system (GNSS).

Operational GNSS, as of 2018:
∘ United States' Global Positioning System (GPS): 31 satellites, accuracy 5 m;
∘ Russia's GLONASS: 24 satellites, accuracy 7.4-4.5 m.
Scheduled in 2020:
∘ China's BeiDou Navigation Satellite System (BDS): 23 (35) satellites, accuracy 10 m (0.1 m);
∘ European Union's Galileo: 22 (28) satellites, accuracy 1 m (0.01 m).

(Figure: orbit altitudes and speeds of the GNSS constellations (GPS, GLONASS, Galileo, BeiDou) compared with the ISS, Hubble, Iridium and geostationary orbits.)

Geolocation: Satellite navigation principle

Working principle: a positioning beacon system (i.e. trilateration):
- each satellite sends signals: its orbital location (ephemeris) + time;
- the receiver computes its location through trilateration and time correction;
- the signals are received with a delay, giving the pseudorange ρ (i.e. the "pseudo-distance") between satellite and receiver:

  ρ = (t_r − t_s) · c
  ⇝ ρ ≈ ||r_s(t_s) − r_r(t_r)|| + (ω_e / c) (x_s y_r − y_s x_r)

  (ω_e: Earth rotation angular velocity)

- At least 3 satellites are required for the position calculation;
- In practice, at least 4 are used to also resolve the receiver clock synchronization.

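A simplified trilateration sketch (my addition; the satellite geometry is invented and the Earth-rotation correction term is ignored) that estimates the receiver position and clock bias from four pseudoranges with Gauss-Newton iterations:

```python
# Simplified GNSS fix: estimate (x, y, z, c*dt) from 4 synthetic pseudoranges.
# Satellite positions and ranges are invented; Earth-rotation term ignored.
import numpy as np

sats = np.array([[15e6,  5e6, 20e6],
                 [-10e6, 12e6, 21e6],
                 [ 8e6, -14e6, 19e6],
                 [-5e6,  -6e6, 23e6]])               # satellite positions [m]
true_pos, true_bias = np.array([1e6, 2e6, 0.5e6]), 3e4
rho = np.linalg.norm(sats - true_pos, axis=1) + true_bias   # pseudoranges [m]

est = np.zeros(4)                                    # [x, y, z, c*dt] initial guess
for _ in range(10):                                  # Gauss-Newton iterations
    ranges = np.linalg.norm(sats - est[:3], axis=1)
    residual = rho - (ranges + est[3])
    J = np.hstack([-(sats - est[:3]) / ranges[:, None],   # d(range)/d(position)
                   np.ones((len(sats), 1))])              # d(rho)/d(clock bias)
    est += np.linalg.lstsq(J, residual, rcond=None)[0]

print(est)   # ≈ [1e6, 2e6, 5e5, 3e4]
```
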

Geolocation: Satellite navigation

Main characteristics:
- frequency: 5 Hz;
- nominal accuracy: 1-5 m;
  ∘ error sources: ephemeris data errors, tropospheric delays, unmodeled ionosphere delays, multipath…
- higher accuracy: GNSS enhancement:
  ∘ satellite-based augmentation systems (SBAS): additional satellite-broadcast messages + reference stations, e.g. WAAS (North America), EGNOS (EU), GAGAN (India), SNAS (China), etc.;
  ∘ DGPS: uses a static receiver at a known exact position;
  ∘ A-GPS: uses a stationary GPS receiver + an (A-GPS) data server;
- only for outdoor applications!
  ∘ satellites/signals must be accessible…

Other sensors…

Miscellaneous sensors can be used:
- touch/tactile sensors (e.g. H/R interactions);
- hearing/microphone sensors (e.g. H/R interactions);
- force/torque sensors (e.g. manipulation);
- proximity (i.e. capacitive) sensors (e.g. micro/nano-robotics);
- barometric/pressure sensors (e.g. aerial/underwater robots);
- gas/odor/chemical sensors, temperature sensors…
- and also vision systems:
  ∘ camera, webcam, microscope, etc.;
  ∘ (very) rich information;
  ∘ specific image-processing methods…

Act/interact with the environment

A robot must be able to interact physically with the environment in which it is operating.

Effector: a device that makes an impact/influence on the environment, i.e. legs, wheels, arms, fingers, etc.

Actuator: a component that transforms energy into a physical phenomenon which changes the behavior or state of a system;
- the component by which a system acts upon an environment;
- it enables the effector to perform actions;
- (common) examples:
  ∘ mechanical: motors, hydraulic cylinders, etc.;
  ∘ thermal: heating resistors, etc.;
  ∘ light: lighting, LEDs, screens…

Different actuators

Classifying actuators by type:
- Hydraulics/pneumatics:
  ∘ based on fluid/air pressure: the pressure changes and the actuator moves;
  ∘ powerful and precise, but large and dangerous.
- Chemically reactive materials:
  ∘ respond to chemical reactions.
- (Electric) motors (most common):
  ∘ affordable and simple;
  ∘ use electric current (simple to control);
  ∘ well suited for wheels.

Functional classification:
- All actuators need a power source:
  ∘ Active: power consumption;
  ∘ Passive: no power consumption; uses potential energy to interact with the environment.


Motors

Motor: a system designed to convert one form of energy (e.g. electrical) into mechanical energy.

Electric motors are the most common source of torque for mobility and/or manipulation in robotics:
- spinning at some speed Ω (or n), with some amount of torque T;
- transducer: i · v = T · Ω.

Main characteristics:
- easy to control: accurate servo control;
- excellent efficiency;
- from mW to MW;
- mainly rotating, but linear ones are also available;
  ∘ common velocities: 1000-10000 rpm;
- several types (DC, brushless, AC synchronous/asynchronous, etc.);
- main issue: the autonomous power source (recharging)…

DC Motors

Direct Current (DC) motors:
- simple, inexpensive, easy to find and use;
- need DC electrical power to run:
  ∘ a constant voltage in the proper range;
- variety of sizes and packaging:
  ∘ low voltage means low power (smaller motors), high voltage means high power (wear and tear occur faster);
- Brushed motors: provide the electric current to the rotor with brushes + commutators;
  ∘ the brushes wear down and require replacement;
- Brushless DC motors: synchronous motors powered by DC current;
  ∘ more expensive, but more reliable…

Electromechanical model (see the sketch below):
- Speed: E_b = K_b(Φ) Ω
  ∘ E_b: induced or counter-EMF; K_b: counter-EMF constant;
- Torque: T_m = K_i(Φ) I_a (e.g. K_i = K_b);
- Combined equations of motion:

  L_wind. (di/dt) + R_wind. i + K_b Ω = V_m
  J_rot. (dΩ/dt) + k_frict. Ω = K_i i

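A small simulation sketch (my addition; the electrical and mechanical parameter values are arbitrary but plausible) integrating the two combined equations of motion above for a constant supply voltage V_m:

```python
# DC motor model: L di/dt + R i + Kb*Omega = Vm ;  J dOmega/dt + kf*Omega = Ki*i.
# Parameter values are arbitrary illustration values.
Lw, R = 0.5e-3, 1.2       # winding inductance [H] and resistance [ohm]
Kb = Ki = 0.05            # back-EMF / torque constants [V s/rad], [N m/A]
J, kf = 1e-4, 1e-5        # rotor inertia [kg m^2], viscous friction [N m s/rad]
Vm = 12.0                 # supply voltage [V]

i, Omega = 0.0, 0.0
dt = 1e-4
for _ in range(20000):    # simulate 2 s with explicit Euler steps
    di = (Vm - R * i - Kb * Omega) / Lw
    dOmega = (Ki * i - kf * Omega) / J
    i += di * dt
    Omega += dOmega * dt

print(i, Omega)   # steady-state current [A] and speed [rad/s]
```
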
Servo-Motors

Servo-motors: motors that can turn their shaft to a specific position.
- e.g. made from a DC motor by adding:
  ∘ a gear reduction,
  ∘ a position (encoder) sensor,
  ∘ an electronic circuit that tells the motor how much to turn and in which direction.

(Figure: cutaway of a DC servo-motor with gearhead and encoder.)

Motor loading: motors apply torque in response to loading.
- The higher the load on the output:
  ∘ the more the motor will "fight back" with an opposing torque;
  ∘ the more current the motor draws;
- increasing the load further, the motor may stop spinning, i.e. stall.

References

Adams, Martin David (1999). Sensor modelling, design and data processing for autonomous navigation. Vol. 13. World Scientific.
Borenstein, J., H. R. Everett, and L. Feng (1996). "Where am I?" Sensors and methods for mobile robot positioning. Tech. rep. University of Michigan.
Corke, Peter (2017). Robotics, vision and control: fundamental algorithms in MATLAB®. 2nd ed. Vol. 118. Springer. ISBN: 9783319544137.
Craig, J. J. (2018). Introduction to Robotics: Mechanics and Control. Pearson. ISBN: 978-0-13-348979-8.
Everett, H. R. (1995). Sensors for mobile robots. AK Peters/CRC Press. ISBN: 978-1-4398-6348-0.
Lynch, Kevin M. and Frank C. Park (2017). Modern Robotics. Cambridge University Press. ISBN: 9781107156302.
Murphy, R. (2000). Introduction to AI Robotics. Ed. by R. C. Arkin. A Bradford book. MIT Press. ISBN: 9780262133838.


Russell, S. and P. Norvig (2016). Artificial Intelligence: A Modern Approach. Always Learning. Pearson. ISBN: 9781292153964.
Siegwart, Roland, Illah Reza Nourbakhsh, and Davide Scaramuzza (2011). Introduction to autonomous mobile robots. 2nd ed. MIT Press.
