Robot Vision Lecture 3

This document provides an overview of the CS453/CE453 - Robotic Vision course. The course covers topics related to autonomous mobile robotics including localization, path planning, perception, sensors, and uncertainty representation. It discusses different types of sensors used in robotic vision like cameras, laser scanners, wheel encoders, gyroscopes, accelerometers and more. It also covers challenges in robotic perception and classifications of sensors.


CS453/CE453 - Robotic Vision

Robotic Vision
CS453/CE453
Instructor: Ahmar Rashid

Dr. Ahmar Rashid, FCSE, GIKI



Autonomous Mobile Robotics


[Figure: the see-think-act cycle (Roland Siegwart et al., ETH Zurich)]
• See / Perceive: sense the environment
• Think / Localize: estimate position on the basis of the environment
• Think / Path planning: find the best way for the given mission
• Act / Move: execute the plan by generating wheel actions



Perception: Definition

Davide Scaramuzza et al. ETH Zurich




Perception: Challenges
• Dealing with real-world situations
• Reasoning about a situation
• Cognitive systems have to interpret situations
based on uncertain and only partially available
information
• They need ways to learn
functional and contextual information
(semantics / understanding)

Davide Scaramuzza et al. ETH Zurich




Sensors
Davide Scaramuzza et al. ETH Zurich
• Tactile sensors or bumpers
– Detection of physical contact, security switches
• GPS
– Global localization and navigation
• Inertial Measurement Unit (IMU)
– Orientation and acceleration of the robot
• Wheel encoders
– Local motion estimation (odometry)
• Laser scanners
– Obstacle avoidance, motion estimation, scene
interpretation (road detection, pedestrians)
• Cameras
– Texture information, motion estimation, scene
interpretation



Classification of sensors
• What:
– Proprioceptive sensors
• measure values internal to the system (robot),
• e.g. motor speed, wheel load, heading of the robot, battery
status
– Exteroceptive sensors
• acquire information from the robot's environment
• e.g. distances to objects, intensity of the ambient light, extraction
of features from the environment

Davide Scaramuzza et al. ETH Zurich




Classification of sensors
• How:
– Passive sensors
• measure energy coming from the environment; strongly
influenced by the environment
– Active sensors
• emit their own energy and measure the reaction
• better performance, but some influence on the environment

Davide Scaramuzza et al. ETH Zurich


Classification of sensors

[Figure: sensor classification table (Davide Scaramuzza et al., ETH Zurich)]

Uncertainty Representation
• Sensing is always related to uncertainties
– How can uncertainty be represented or quantified?
– How does it propagate, i.e., what is the uncertainty of a function of
uncertain values?
• Systematic errors (deterministic)
– They are caused by factors or processes that can in theory be
modeled and, thus, calibrated, (e.g., poor calibration of laser range
finder, the diameters of the robot wheels, the distance between the
wheels, etc.)
• Random errors
– They cannot be predicted using a sophisticated model but can only
be described in probabilistic terms (spurious range finder errors,
black level noise in a camera)
Davide Scaramuzza et al. ETH Zurich
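A quick way to see how random errors propagate through a function is Monte Carlo sampling. The sketch below (illustrative values, not from the slides) pushes Gaussian input noise through a simple range equation and reads off the output mean and spread:

```python
import math
import random

def propagate_uncertainty(f, means, sigmas, n=100_000, seed=0):
    """Monte Carlo propagation: push Gaussian-distributed inputs
    through f and report the sample mean and standard deviation
    of the output."""
    rng = random.Random(seed)
    samples = [f(*[rng.gauss(m, s) for m, s in zip(means, sigmas)])
               for _ in range(n)]
    mean = sum(samples) / n
    var = sum((y - mean) ** 2 for y in samples) / (n - 1)
    return mean, math.sqrt(var)

# Example: range from round-trip time of flight, d = c*t/2, with an
# uncertain time measurement (all numbers are made up).
c = 340.0                           # speed of sound, m/s
t_mean, t_sigma = 0.010, 0.0002     # 10 ms reading, 0.2 ms noise
d_mean, d_std = propagate_uncertainty(lambda t: c * t / 2.0,
                                      [t_mean], [t_sigma])
```

For a linear function like this, the output spread equals c/2 times the input spread; for nonlinear functions the sampled spread is exactly what a closed-form error model would have to approximate.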


Uncertainty Representation
• The density function assigns to each possible value 𝑥 a
probability density 𝑓(𝑥) along the 𝑦-axis
• The area under the curve is 1, indicating the total probability of
having some value
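The normalization property can be checked numerically. The snippet below sums a standard normal density over a wide interval and recovers an area of (almost exactly) 1:

```python
import math

def gaussian_pdf(x, mu=0.0, sigma=1.0):
    """Probability density of a normal distribution at x."""
    return math.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * math.sqrt(2 * math.pi))

# Riemann-sum the density over [-6, 6]; the tails beyond +/-6 sigma
# contribute a negligible amount, so the area is essentially 1.
dx = 0.001
area = sum(gaussian_pdf(-6 + i * dx) * dx for i in range(int(12 / dx)))
```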

Davide Scaramuzza et al. ETH Zurich




Encoder : definition
• An electro-mechanical device that converts the linear or angular position of a shaft to an analog or
digital signal, making it a linear/angular transducer. It can be used
– to count the number of turns of the wheels of a ground mobile robot
– to measure the position of a joint of a robot
– in a computer mouse
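The tick-to-distance conversion behind wheel odometry can be sketched as follows (the wheel radius and encoder resolution are made-up example values):

```python
import math

TICKS_PER_REV = 2048      # encoder resolution (assumed)
WHEEL_RADIUS = 0.05       # wheel radius in meters (assumed)

def wheel_distance(ticks):
    """Convert an encoder tick count to linear distance traveled:
    revolutions times wheel circumference."""
    revolutions = ticks / TICKS_PER_REV
    return revolutions * 2 * math.pi * WHEEL_RADIUS

d = wheel_distance(4096)   # two full wheel revolutions
```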

Davide Scaramuzza et al. ETH Zurich




Heading sensors
• Definition:
– Heading sensors are sensors that determine the robot’s orientation and
inclination with respect to a given reference
• Examples:
– gyroscope, accelerometer (proprioceptive)
– compass, inclinometer (Exteroceptive)
• Together with appropriate velocity information, they allow
integrating the movement into a position estimate.
• This procedure is called deduced reckoning/dead reckoning (often
used for ship navigation)
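Dead reckoning can be sketched as a simple Euler integration of speed and heading-rate readings; the values below are hypothetical:

```python
import math

def dead_reckon(x, y, theta, steps, dt=0.1):
    """Integrate (speed, yaw-rate) readings into a pose estimate.
    Each step is a (v, omega) pair: forward speed [m/s], e.g. from
    wheel odometry, and heading rate [rad/s], e.g. from a gyroscope."""
    for v, omega in steps:
        theta += omega * dt                 # update heading first
        x += v * math.cos(theta) * dt       # then advance along it
        y += v * math.sin(theta) * dt
    return x, y, theta

# Drive straight for 1 s at 1 m/s: the pose advances 1 m along x.
pose = dead_reckon(0.0, 0.0, 0.0, [(1.0, 0.0)] * 10)
```

Note that each step compounds the error of the previous ones, which is exactly why dead reckoning drifts without external corrections.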

Davide Scaramuzza et al. ETH Zurich




Heading sensors
• Heading sensor types:
– Compass: senses the absolute direction of the Earth's
magnetic field
– Gyroscope: senses the relative orientation of the
robot with respect to a given reference

Davide Scaramuzza et al. ETH Zurich




Gyroscopes
• Mechanical gyroscopes
– It relies on the inertial properties of a fast-spinning rotor.
• The property is known as the gyroscopic precession.
– If you try to rotate a fast-spinning wheel around its
vertical axis, you will feel a harsh reaction in the horizontal axis.
• This is due to the angular momentum associated with a spinning wheel and
will keep the axis of the gyroscope inertially stable.
– The reactive torque τ and thus the tracking stability with the inertial frame
are proportional to the spinning speed ω, the precession speed Ω , and the
wheel’s inertia I.
τ = IωΩ
– No torque can be transmitted from the outer pivot to the wheel axis;
however, friction in the axes generates drift. Quality: 0.1° drift in 6 hours
(a high-quality mechanical gyro costs up to $100,000)
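Plugging illustrative numbers into τ = IωΩ (the rotor inertia, spin rate, and precession rate below are made up) gives a feel for the magnitudes involved:

```python
import math

def precession_torque(inertia, spin_rate, precession_rate):
    """Reactive torque of a spinning rotor: tau = I * omega * Omega."""
    return inertia * spin_rate * precession_rate

# Hypothetical rotor: I = 1e-4 kg*m^2, spinning at 20,000 rpm,
# being precessed at 1 deg/s.
omega = 20_000 * 2 * math.pi / 60    # spin rate in rad/s
Omega = math.radians(1.0)            # precession rate in rad/s
tau = precession_torque(1e-4, omega, Omega)   # torque in N*m
```

The torque grows linearly with the spin speed ω, which is why fast-spinning rotors hold their axis so stiffly.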

Davide Scaramuzza et al. ETH Zurich




Gyroscopes
• Optical Gyroscopes
– Uses two monochromatic laser beams traveling in an
optical fiber in two opposite directions
– They work on the principle that the speed of light is
constant; a geometric change can therefore cause light to
take a varying amount of time to reach its destination
– One laser beam is sent traveling clockwise through the
fiber while the other travels counterclockwise
– Because the laser traveling in the direction of rotation
has a slightly longer path, it will have a lower frequency
– The difference in frequency ∆f of the two beams is
proportional to the angular velocity Ω of the cylinder

[Figure labels: light source, detector]

Davide Scaramuzza et al. ETH Zurich




Accelerometer
• Measures all external forces acting upon
it (including gravity)
• Acts like a spring-damper system
• To obtain inertial acceleration (due to
motion alone), gravity must be subtracted
[Figure: pendular accelerometer, https://en.wikipedia.org/wiki/File:Pendular_accel.svg]

Davide Scaramuzza et al. ETH Zurich




MEMS accelerometer
• Micro Electro-Mechanical Systems (MEMS)
• A spring-like structure connects the device to a seismic mass
vibrating in a capacitive divider that converts the displacement
into an electric signal
• Can be multi-directional
• Can sense up to 50 g
• Applications
– Dynamic acceleration
– Static acceleration (inclinometer)
– Airbag sensors (± 35 g)
– Control of video games (e.g., Nintendo Wii)

[Figure: MEMS accelerometer]

Davide Scaramuzza et al. ETH Zurich




Inertial Measurement Unit (IMU)
• It uses gyroscopes and accelerometers to estimate the relative
position (x, y, z), orientation (roll, pitch, yaw), velocity, and
acceleration of a moving vehicle with respect to an inertial frame

• In order to estimate the motion, the gravity vector must be
subtracted and the initial velocity has to be known
• After long periods of operation, drifts occurs: need
external reference to cancel it

v = v0 + ∫(a − g) dt
x = x0 + ∫ v dt
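The two integrals above can be approximated numerically. A minimal Euler-integration sketch, assuming a single vertical axis and noise-free readings (sample values are made up):

```python
G = 9.81   # m/s^2, gravity magnitude (assumed aligned with the axis)

def integrate_imu(a_readings, dt, v0=0.0, x0=0.0):
    """Integrate accelerometer samples into velocity and position:
    subtract gravity, then integrate twice (Euler steps)."""
    v, x = v0, x0
    for a in a_readings:
        v += (a - G) * dt     # v = v0 + integral of (a - g)
        x += v * dt           # x = x0 + integral of v
    return v, x

# An accelerometer at rest reads +9.81 (it measures gravity too),
# so after subtracting g the estimated motion stays zero.
v, x = integrate_imu([9.81] * 100, dt=0.01)
```

Any constant bias left after the gravity subtraction integrates into a linearly growing velocity error and a quadratically growing position error, which is the drift mentioned above.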

Davide Scaramuzza et al. ETH Zurich




Range Sensors
• Sonar
• Laser range finder
• Time of flight camera
• Structured light

Davide Scaramuzza et al. ETH Zurich




GPS
• Every satellite transmits its position and time
• Receiver measures time difference of satellite signals
• Calculate position by intersecting distances (pseudoranges)
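The intersection of ranges can be illustrated in 2-D: subtracting one circle equation from the others leaves a small linear system. This is only a sketch with three exact ranges; a real GPS receiver solves the analogous 3-D problem with the clock bias as a fourth unknown:

```python
import math

def trilaterate_2d(beacons, ranges):
    """Solve for (x, y) given three beacon positions and measured
    distances: subtracting circle 1's equation from circles 2 and 3
    cancels the quadratic terms, leaving two linear equations."""
    (x1, y1), (x2, y2), (x3, y3) = beacons
    r1, r2, r3 = ranges
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1          # Cramer's rule for the 2x2 system
    return (c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det

# Beacons at known positions; ranges measured to the point (2, 1).
p = trilaterate_2d([(0, 0), (10, 0), (0, 10)],
                   [math.hypot(2, 1), math.hypot(8, 1), math.hypot(2, 9)])
```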

Jürgen Sturm :Autonomous Navigation for Flying Robots.




GPS
• 24+ satellites, 12-hour orbits, 20,190 km altitude
• 6 orbital planes, 4+ satellites per orbit, 60° apart
• Each satellite transmits its orbital location (almanac) + time
• 50 bits/s; one 1500-bit frame takes 30 s, the full message 12.5 minutes

Jürgen Sturm :Autonomous Navigation for Flying Robots.




Range Sensors
• Long-range distance measurement, thus called range sensors
• Range information:
– key element for localization and environment modeling
• Ultrasonic sensors as well as laser range sensors make use of
propagation speed of sound or electromagnetic waves respectively.
• The traveled distance of a sound or electromagnetic wave is given by

d = c · t

where
d = distance traveled (usually round trip)
c = speed of wave propagation
t = time of flight
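For an echo-based sensor the measured time covers the round trip, so the target distance is d = c·t/2. A small sketch with example echo times:

```python
SPEED_OF_SOUND = 340.0      # m/s in air (value used in the slides)
SPEED_OF_LIGHT = 3.0e8      # m/s

def range_from_tof(t, c):
    """Distance to target from a round-trip time of flight:
    the pulse travels out and back, so divide by two."""
    return c * t / 2.0

d_sonar = range_from_tof(0.020, SPEED_OF_SOUND)   # 20 ms echo
d_laser = range_from_tof(20e-9, SPEED_OF_LIGHT)   # 20 ns echo
```

Both example echoes correspond to targets a few meters away, which already shows the timing precision a laser system must achieve compared with an ultrasonic one.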

Davide Scaramuzza et al. ETH Zurich




Ultrasound Range Sensors
• Emit signal to determine distance along a ray
– Make use of propagation speed of ultrasound
– Traveled distance is given by the speed of sound (c ≈ 340 m/s)

Jürgen Sturm :Autonomous Navigation for Flying Robots.




Range Sensors
• It is important to point out:
– Propagation speed of sound: 0.3 m/ms
– Propagation speed of electromagnetic signals: 0.3 m/ns,
i.e., electromagnetic signals travel one million times faster
– A distance of 3 meters corresponds to
• 10 ms for an ultrasonic system
• only 10 ns for a laser range sensor
• Measuring time of flight with electromagnetic signals is therefore not an
easy task; laser range sensors are expensive and delicate

Davide Scaramuzza et al. ETH Zurich




Range Sensors
• The quality of time-of-flight range sensors mainly depends
on:
– Inaccuracies in the time of flight measurement (laser range
sensors)
– Opening angle of transmitted beam (especially ultrasonic range
sensors)
– Interaction with the target (surface, specular reflections)
– Variation of propagation speed (sound)
– Speed of mobile robot and target (if not at standstill)

Davide Scaramuzza et al. ETH Zurich




Structured light
• Eliminates the correspondence problem by projecting structured
light on the scene
• Project slits of light, or emit collimated light (possibly laser) by means of a
rotating mirror
• Light perceived by camera
• Range to an illuminated point can then be determined from simple
geometry

Davide Scaramuzza et al. ETH Zurich


Structured light

[Figure: structured-light setups (Davide Scaramuzza et al., ETH Zurich)]

Structured light

d = x / (1/tan α + 1/tan β)
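Evaluating the triangulation formula for a hypothetical setup (baseline and angles below are made-up example values):

```python
import math

def structured_light_range(x, alpha, beta):
    """Triangulated distance d = x / (1/tan(alpha) + 1/tan(beta)),
    where x is the baseline between emitter and camera and
    alpha, beta are the angles to the illuminated point (radians)."""
    return x / (1.0 / math.tan(alpha) + 1.0 / math.tan(beta))

# Hypothetical setup: 0.5 m baseline, both angles at 60 degrees.
d = structured_light_range(0.5, math.radians(60), math.radians(60))
```

Shallower angles make the denominator larger, so for a fixed baseline the measurable range shrinks as the target moves off to the side.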

Davide Scaramuzza et al. ETH Zurich




Structured light: Kinect Sensor

• Major components
– IR Laser Projector
– IR Camera
– VGA (RGB) Camera
– Microphone Array
– Motorized Tilt

[Figure labels: RGB Camera, IR Camera, IR Laser Projector]

Davide Scaramuzza et al. ETH Zurich

