Robot vision and obstacle avoidance

Muhammad Salman
Latrobe University
Email: [email protected]
Abstract: Robots now play an important part in day-to-day life and assist in all sectors. A robot, being an electromechanical device, faces many problems; among the most basic are object identification and obstacle avoidance. This project is a study of colour-based object identification and obstacle avoidance, intended to take the shape of face identification and target acquisition in its advanced stages. The tasks in this project are performed on a small robot, the e-puck: Matlab is used for the image processing, and the epic2 software is used to control the e-puck. The e-puck camera scans a small controlled environment by making a full 360° rotation and taking a picture approximately every 15°. The target picture is matched using image processing techniques in Matlab; once the target is identified, the e-puck approaches it. The e-puck is equipped with eight proximity sensors that help it scan its environment, and with their help it approaches the target in the presence of obstacles. A brief introduction to the e-puck, colour spaces, image detection techniques, obstacle avoidance techniques, and the use of the epic2 software is also part of this report. The report also includes the experimental arrangement, with flow charts explaining the tasks step by step; a discussion at the end concludes the report.
I. Introduction to the E-puck
The e-puck is a robot with a simple mechanical design used for research and teaching. It was created by Dr Francesco Mondada and was later adopted by GCTronic S.r.l. (Switzerland). The e-puck is equipped with a dsPIC microcontroller (a 16-bit PIC microcontroller with a DSP core). Details of the e-puck's sensors and actuators are given in Table 1 below. The name "e-puck" combines "e" for electronics with "puck", because its shape resembles the hard rubber puck used in ice hockey. The devices used in this project are described below.
Stepper motor: an electric motor that divides a full rotation into a number of discrete steps. The stepper motors used in the e-puck have 1000 steps per revolution. Two stepper motors drive the e-puck's fixed wheels and can turn at up to one rotation per second; the two motors are controlled independently.

LED (light emitting diode): a device that converts electrical energy into light. LEDs are mainly used on the e-puck to represent its state or to illuminate the environment.

IR (infra-red) sensors: an IR sensor emits infrared light and measures the reflected light incident on it. The use of eight IR sensors makes the e-puck aware of its environment and of nearby obstacles.
Table 1: Features and technical information [1]

Size, weight: 70 mm diameter, 55 mm height, 150 g
Battery autonomy: 5 Wh Li-Ion rechargeable and removable battery providing about 3 hours of autonomy
Processor: dsPIC 30F6014A @ 60 MHz (~15 MIPS), 16-bit microcontroller with DSP core
Memory: RAM: 8 KB; Flash: 144 KB
Motors: 2 stepper motors with a 50:1 reduction gear, resolution 0.13 mm
Speed: max. 15 cm/s
Mechanical structure: transparent plastic body supporting PCBs, battery and motors
IR sensors: 8 infra-red sensors measuring ambient light and proximity of objects up to 6 cm
Camera: VGA colour camera with a resolution of 640x480 (typical use: 52x39 or 480x1)
Microphones: 3 omni-directional microphones for sound localization
Accelerometer: 3D accelerometer along the X, Y and Z axes
LEDs: 8 independent red LEDs on the ring, green LEDs in the body, 1 strong red LED at the front
Speaker: on-board speaker capable of WAV and tone sound playback
Switch: 16-position rotating switch on the top of the robot
PC connection: standard serial port up to 115 kbps
Wireless: Bluetooth for robot-computer and robot-robot wireless communication
Remote control: infra-red receiver for standard remote control commands
Programming: C programming with the free GNU GCC compiler; a graphical IDE (integrated development environment) is provided in Webots

II. Literature Review
A. Colour space
Colour is the identification of the visible spectrum of electromagnetic radiation by the HVS (human visual system). The choice of colour space in image processing is crucial, as image quality varies between colour spaces. In [2] the author classifies colour spaces into three categories:

- HVS-based colour spaces: the RGB (red, green, blue) colour space and the phenomenal colour spaces are examples.
- Application-specific colour spaces: CMY(K) (cyan, magenta, yellow, black) is used for paper printing.
- CIE colour spaces: the CIEXYZ colour space was formulated by the CIE (International Commission on Illumination).
RGB colour space: the photoreceptors in the human eye are sensitive to three colours, red, green and blue, which correspond to long (L), medium (M) and short (S) wavelengths. This L, M, S wavelength concept is used by most image processing devices. The visible light spectrum extends from 300 nm to 830 nm. The mathematical representation of R, G and B is
R = \int_{300}^{830} S(\lambda)\,\bar{R}(\lambda)\,d\lambda    (1)

G = \int_{300}^{830} S(\lambda)\,\bar{G}(\lambda)\,d\lambda    (2)

B = \int_{300}^{830} S(\lambda)\,\bar{B}(\lambda)\,d\lambda    (3)
where S(\lambda) represents the light spectrum, and \bar{R}(\lambda), \bar{G}(\lambda) and \bar{B}(\lambda) are the photo-sensor sensitivity functions. The RGB space is a cube with red (R), green (G) and blue (B) on its three axes; black lies at the corner (0,0,0) and white at (255,255,255).
Phenomenal colour space: this colour space describes colours using three attributes: hue, saturation and brightness.

- Hue: the attribute of colour perception according to which an area appears similar to one of the colours red, green, yellow or blue, or to a combination of them.
- Saturation: the colourfulness of an area judged relative to its brightness.
- Brightness: the attribute of colour perception according to which an area under observation appears to show more or less light.
HSV (hue, saturation, value) and HSI (hue, saturation, intensity) are cylindrical-coordinate representations of the points of the RGB colour space: H (hue) is the angle around the central vertical axis of the cylinder, S (saturation) is the distance from the central axis, and the height corresponds to the third component. Both RGB and HSV are device dependent; moreover, hue has a discontinuity around 360° [1].
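For reference, the standard conversion from RGB (normalized to [0, 1]) to HSV makes this cylindrical structure explicit; these are the textbook formulas, not something specific to this project:

V = \max(R, G, B), \qquad C = V - \min(R, G, B)

S = \begin{cases} C / V, & V \neq 0 \\ 0, & V = 0 \end{cases}

H = 60^\circ \times \begin{cases} \big((G - B)/C\big) \bmod 6, & V = R \\ (B - R)/C + 2, & V = G \\ (R - G)/C + 4, & V = B \end{cases}

The modulo in the hue term is exactly the wrap-around at 360° mentioned above.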
B. E-puck based projects
Jianfan Wang used e-pucks to create communication among multiple e-pucks through their IR sensors; the robots form up and pass through a narrow obstacle together [3]. Another very interesting project is bee-inspired foraging on the e-puck: three e-pucks are used, black (home location), blue (food finder) and red (food advertiser); the blue robot searches for the infrared signals of the red robot and, after finding it, takes a straight path back to the black robot [4]. One project closely related to mine is an e-puck colour-vision-based search-and-clean operation, in which the e-puck searches for red objects in an arena and carries each red object it finds back to a collection point; this project was made by SCentRo, the Sheffield Centre for Robotics [5].
C. Colour detection techniques
Objects of different colours can be identified by various image processing methods; edge-based detection is one of them. Researchers have reported that, nine times out of ten, it makes no difference to edge detection whether a colour space or a grey-scale space is used [6]. Anil et al. in [6] present edge detection in four steps: smoothing to suppress noise; separating the red, green and blue channels of the image and applying directional operators to enhance edges; setting an automatic threshold by normalizing the pixel intensities and using only the high-intensity pixel values as the threshold; and edge thinning to remove unwanted edge information.
Dutta et al. in [7] present an edge detection technique based on the same four principles, but with a different choice of filters for noise suppression and edge thinning; their thresholding method selects the maximum colour difference for each pixel and then takes the average of all those maxima. Niu and Li in [8] use the HSV colour space for edge detection: hue, saturation and value edges are computed from the colour properties, direction information measures are computed after orthogonalization of hue, and the direction information is used to obtain the threshold value for detecting edges.
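As a minimal illustration of the four-step channel-wise pipeline of [6] (smoothing, per-channel directional operators, automatic thresholding, thinning), the Matlab sketch below uses standard Image Processing Toolbox functions; the input file name, the Gaussian width and the 0.3 threshold are our own illustrative choices, not the values used in the cited papers.

% Illustrative channel-wise colour edge detection (a sketch of the
% four-step idea in [6]; filter and threshold choices are ours).
rgb = im2double(imread('scene.jpg'));    % assumed input image
edges = false(size(rgb,1), size(rgb,2));
for c = 1:3
    chan = imgaussfilt(rgb(:,:,c), 1);   % 1) smooth to suppress noise
    [gx, gy] = imgradientxy(chan);       % 2) directional operators per channel
    gmag = hypot(gx, gy);
    gmag = gmag ./ max(gmag(:));         % 3) normalize intensities...
    edges = edges | (gmag > 0.3);        %    ...then keep only high values
end
edges = bwmorph(edges, 'thin', Inf);     % 4) edge thinning
imshow(edges);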
D. Obstacle avoidance techniques
A very basic problem mobile robots encounter during motion is obstacle avoidance. Of the many methods in the literature, a few are presented and compared by Zohaib et al. in [9]: the bug algorithms, the artificial potential field, the vector field histogram, follow-the-gap, and a new hybrid navigation algorithm.

In the artificial potential field method, the robot and the obstacles carry like charges and the target carries the opposite charge; the resultant force on the robot is the vector sum of all the repulsive forces and the attractive force, each force being a function of distance. Being a goal-oriented algorithm, it has difficulty overcoming symmetric and U-shaped obstacles: the artificial potential field stops working when the sum of two repulsive forces equals the attractive force of the target.

Figure 1: Symmetric obstacles in the potential field method. F_rep1 and F_rep2 are the vectors representing the forces of repulsion and F_attr is the vector representing the force of attraction [9].
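To make the force balance concrete, the following Matlab sketch computes one artificial potential field update; the gain values and the inverse-square repulsion law are illustrative assumptions, not parameters taken from [9].

% One artificial potential field step (gains are arbitrary).
robot = [0.0; 0.0];                   % robot position (m)
goal  = [1.0; 0.5];                   % target position (m)
obst  = [0.4 0.6; 0.3 0.2];           % obstacle positions, one per column (m)
k_att = 1.0;  k_rep = 0.02;           % assumed gains

F = k_att * (goal - robot);           % attractive force toward the goal
for i = 1:size(obst, 2)
    d = robot - obst(:, i);           % vector pointing away from obstacle i
    F = F + k_rep * d / norm(d)^3;    % repulsion grows as distance shrinks
end
% If the repulsive terms cancel the attraction, F ~ [0;0]: this is the
% local minimum on symmetric obstacles described in the text.
heading = atan2(F(2), F(1));          % steer along the resultant force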
The vector field histogram is a three-stage process: in the first stage a 2-D histogram grid indicates the presence of obstacles; the 2-D histogram is then reduced to a 1-D polar histogram; and the polar histogram is used to choose the robot's direction and velocity based on the sectors of low polar obstacle density [10].
The bug algorithm has three variants, as explained in [9]. In the Bug-1 algorithm, when the robot encounters an obstacle it moves around it until it has completed a full circle; during this motion it records the boundary point with the minimum distance to the destination, and it then generates a new path to the destination from that previously computed leaving point [12].

The Bug-2 algorithm calculates the slope of the line between the source and the goal. The robot follows this line until interrupted by an obstacle; it then moves around the obstacle until it regains the line [12].

The adaptive bug algorithm combines two behaviours [12]:

- the robot keeps calculating the minimum distance between its position and the goal throughout its motion;
- it follows the boundary of the obstacle.

The robot stops following the obstacle when it finds the minimum distance to the next obstacle or to the goal.
The follow-the-gap algorithm calculates the gaps between obstacles. If a calculated gap is wide enough for the robot to pass through, the robot follows the gap angle to pass between the obstacles.
The new hybrid navigation algorithm, as explained in [9], is based on two layers, deliberative and reactive. The deliberative layer plans the path on the basis of prior information given as a binary grid map, which marks cells as occupied where there are obstacles and as free for obstacle-free and unknown places. The reactive layer takes the path from the deliberative layer but is not bound to follow it: it controls the motion of the robot and decides how to avoid obstacles based on what the sensors perceive.
E. Epic2
The epic2 software is a toolbox for interfacing Matlab with the e-puck; the following set of commands and their effects are explained by the developer in [11]. The class ePicKernel contains all the variables related to e-puck control; an instance is created by calling epic=ePicKernel;. Before using the e-puck, the connect function must be called to connect to it. All interaction with the e-puck is centralized in the single command epic=update(epic);. To activate the reading of a sensor, the command epic=activate(epic, propName); must be called, where propName is the name of the sensor to activate; to deactivate it, the command is epic=deactivate(epic, propName);. Information from the e-puck can be retrieved with [val,up]=get(epic, propName);. Table 2 gives the complete set of propName values and their use.

To get an image from the camera, the command [epic, mode, sizexy]=updateImage(epic); is used in Matlab. To compute the odometry of the robot, [epic]=updateOdometry(epic); is used, and to get the position of the e-puck, [pos,up]=get(epic,'odom'); is used.
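Putting these pieces together, a minimal session might look like the sketch below; the COM port, the exact name of the connection function ('connect' here, versus 'connection' in Table 4) and the propName casing are assumptions to check against the ePic2 documentation [11].

% A possible epic2 session (a sketch; port name, connect call and
% propName casing should be verified against [11]).
epic = ePicKernel;                 % create the control class
epic = connect(epic, 'COM5');      % open the Bluetooth serial link (port is machine specific)
epic = activate(epic, 'proxy');    % enable the proximity sensors (propName from Table 2)
epic = update(epic);               % one update exchanges all activated data with the robot
[prox, up] = get(epic, 'proxy');   % read the eight proximity values
epic = deactivate(epic, 'proxy');  % stop reading the sensor when done

Note that a single update(epic) refreshes every activated sensor at once, which is what the text means by all interaction being centralized in one command.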

Table 2: propName values and their Matlab commands

propName          Data                Supported commands
Accel             Accelerometer       get, activate, deactivate
Proxy             Proximity sensors   get, activate, deactivate
Light             Light sensors       get, activate, deactivate
Micro             Microphones         get, activate, deactivate
Floor             Floor sensors       get, activate, deactivate
Speed             Motor speeds        get, activate, deactivate
Pos               Encoders            get, activate, deactivate
odom              Odometry position   get, activate, deactivate
image             Camera image        get, activate, deactivate
custom            Custom command      get, activate, deactivate
connectionState   Connection status   get

Table 3: Parameters and their properties

propName   Arguments                   Description
Speed      [right_motor, left_motor]   Change the motor speeds
ledOn      [led_number]                Turn on the LED with that number
ledOff     [led_number]                Turn off the LED with that number
Odom       [x y theta]                 Set the current position used by the odometry
camMode    [mode]                      Set the camera mode: 0 for grey scale, 1 for colour
camSize    [width height]              Set the width and height of the camera image
camZoom    [zoom]                      Set the zoom factor (1, 4 or 8)
The function to reset the internal variables of the odometry is epic=reset(epic,'odom');. The properties of the e-puck are controlled by the command epic=set(epic,varargin);, where varargin is a set of parameters beginning with the property to modify, followed by a vector of arguments. Table 3 lists all such property names, their arguments and their descriptions.
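As a usage illustration of set with the Table 3 parameters, the following hedged sketch drives the robot forward for a second and blinks an LED; the speed value and LED number are arbitrary, and the exact propName casing should be checked against [11].

% Illustrative use of epic=set(epic,...) with Table 3 parameters.
epic = set(epic, 'speed', [200 200]);  % both wheels forward (arbitrary value)
epic = set(epic, 'ledOn', 1);          % switch on LED number 1
epic = update(epic);                   % push the commands to the robot
pause(1);                              % let the robot drive for a second
epic = set(epic, 'speed', [0 0]);      % stop the motors
epic = set(epic, 'ledOff', 1);         % switch the LED off again
epic = update(epic);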
III. Implementation
A. Colour detection
Matlab is used to perform the image processing, and the epic2 software is used to control all the sensors and actuators of the e-puck. After studying different colour detection techniques, a simple model for colour detection was devised; its distinct features and operation are shown step by step in the flow chart below.

Figure 2: Colour detection flow chart
Initializing the e-puck and establishing the connection
In order to communicate with the e-puck, a Bluetooth class 1 USB adapter and a Bluetooth driver are used. The connection between the e-puck and the computer is established using the epic2 software running in Matlab. Table 4 gives the commands and their functionality.
Table 4: Initializing the e-puck

Matlab command                  Description
epic=ePicKernel;                Creates a class instance containing all variables to control the e-puck
epic=connection(epic,COMXX);    Connects the e-puck on the given serial port
An image is obtained from the camera using the set of commands given in Table 5.

Table 5: Image file generation

Matlab command                            Description
epic=updateDef(epic,'image',1);           Sets the image capturing frequency
[epic, mode, sizexy]=updateImage(epic);   Retrieves the large image data from the robot
[picture,up]=get(epic,'image');           Gets the image from the class

The image obtained from the e-puck is small and rotated 90° out of phase, so it is resized and reshaped before any further image processing. The image is read in the RGB (red, green, blue) colour space, but to perform colour detection I converted it into the HSV (hue, saturation, value) colour space. The syntax of the Matlab command is hsvimage=rgb2hsv(rgbimage) [13]; its output is an m-by-n-by-3 image array whose three planes are hue, saturation and value. Three binary images, one per plane, are produced by thresholding the HSV values for red in such a way that any value below the upper threshold and at or above the lower threshold is considered target.
I found a small amount of noise in the form of red pixels outside the actual object. Since this may cause false detections, such small noise regions are removed from the images; the Matlab function bwareaopen removes small objects from binary images [13]. The outputs of the colour detection stage are the area of the red object in the image, the average intensity of the red pixels, and the position of the centroid of the red object; these values are computed using the regionprops function in Matlab [13].
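The whole detection stage can be summarized in the following Matlab sketch; the HSV thresholds, the 50-pixel noise limit, and the use of the value plane as the intensity image for regionprops are illustrative assumptions rather than the tuned values used in the experiments.

% Sketch of the red-object detection stage (thresholds are assumptions).
hsvimage = rgb2hsv(rgbimage);            % rgbimage: resized camera image
h = hsvimage(:,:,1);
s = hsvimage(:,:,2);
v = hsvimage(:,:,3);
% Red hue wraps around 0, so accept both ends of the hue circle.
mask = (h < 0.05 | h > 0.95) & (s > 0.5) & (v > 0.3);
mask = bwareaopen(mask, 50);             % remove small noise regions
stats = regionprops(mask, v, 'Area', 'Centroid', 'MeanIntensity');
if ~isempty(stats)
    [area, idx]   = max([stats.Area]);   % keep the largest red region
    centroid      = stats(idx).Centroid; % [x y] pixel coordinates
    meanIntensity = stats(idx).MeanIntensity;
end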
B. Target identification and movement
The search algorithm used in this project is: scan the environment, focus on the target, and move towards the target.

The e-puck is a two-wheel robot capable of rotating 360° in place about its own axis. The first criterion is to detect any presence of red colour; the area and average intensity computed by the colour detection stage are used as the reference. If red is present but the object's centre is too far to the left or right, further refinement is performed by rotating the e-puck according to the value of the offset.
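A minimal sketch of this centring refinement is given below; centroid comes from the detection stage, imageWidth is the width of the camera image, and the gain, dead-band and wheel-order convention (Table 3's [right_motor, left_motor]) are illustrative assumptions.

% Rotate toward the target based on the centroid's horizontal offset
% (gain and tolerance are illustrative, not calibrated values).
offset = centroid(1) - imageWidth/2;   % pixels right (+) or left (-) of centre
if abs(offset) > 2                     % dead-band: already well centred
    turn = round(0.5 * offset);        % rotation speed proportional to offset
    epic = set(epic, 'speed', [-turn turn]);  % spin in place toward the target
else
    epic = set(epic, 'speed', [0 0]);  % centred: stop rotating
end
epic = update(epic);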


Figure 3: Move towards target
Once the target is acquired, the e-puck must move towards it and stop in front of it. The e-puck is equipped with sensors that help it scan nearby objects, and with their help I control its movement: the e-puck keeps moving forward until it finds an obstacle in its way, and since it is facing the target, it stops at the target.

C. Obstacle avoidance
In the presence of an obstacle that hides the target, the search algorithm is: search, then change position. Every time the e-puck completes a 360° search without finding the target, it moves off at an angle of approximately 45° until it is stopped by an obstacle.

Figure 4: Obstacle avoidance
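A hedged sketch of this search-and-relocate loop is shown below; turnDegrees, detectRed and driveUntilObstacle are hypothetical helper functions wrapping the epic2 calls shown earlier, and the step angles are approximate.

% Sketch of the scan / relocate search loop. The three helpers are
% hypothetical wrappers around epic2 calls; 24 x 15 degrees = one turn.
found = false;
while ~found
    for k = 1:24
        epic = turnDegrees(epic, 15);                % rotate in place ~15 degrees
        [epic, found, centroid] = detectRed(epic);   % take a picture, run detection
        if found
            break;                                   % target spotted: leave the scan
        end
    end
    if ~found
        epic = turnDegrees(epic, 45);                % change heading by ~45 degrees
        epic = driveUntilObstacle(epic);             % move until the IR sensors trigger
    end
end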
IV. Experiment
The figures in this section show the experimental equipment used for the verification of the proposed algorithm.

Figure 5: E-puck used in the experiment

Figure 6: Target object of red colour

Figure 7: Experimental ground for the robot eye sensitive to red colour

Figure 8: Experimental ground for obstacle avoidance
V. Discussion
Initially I used the RGB colour space for the image processing, but I found that RGB results vary greatly with lighting conditions. To overcome this problem a white balance technique was also tried, but it suppresses the colour details. I then chose to work with the HSV colour space, which I found quite useful for real-time image processing. The experiments were performed many times to improve the output of the image processing algorithm and to make that output usable as a metric for identification.

The proposed image processing technique works well under different lighting conditions. For obstacle avoidance, keeping in mind the complexity of the different algorithms studied, the devised method is the simplest of all; I found it not particularly robust, but better than the bug algorithm. One major constraint of the proposed algorithm is that it only works in a controlled environment.

Further improvements are required to overcome the near-far effect in this technique.

Figure 9: Near-far effect

The e-puck's move and stop behaviour is based on the values of the four front IR sensors. If any of these sensors detects something in the way after the e-puck has already found the red object, as shown in Figure 9, the e-puck stops moving. An improvement can be made to this problem by introducing the distance to the target.
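One hedged way to introduce such a distance estimate is the pinhole-camera relation, under which the apparent area of the target falls off with the square of its distance, so a single calibration picture gives a usable scale; refArea and refDist below are hypothetical calibration values.

% Rough distance-to-target estimate from apparent area (pinhole model:
% area ~ 1/distance^2). refArea and refDist are hypothetical values
% taken from one calibration picture at a known distance.
refArea = 1200;                            % target area in pixels at refDist
refDist = 0.20;                            % calibration distance in metres
distEst = refDist * sqrt(refArea / area);  % area from the detection stage
if distEst > 0.10
    % The target is still far away, so a front IR hit must be an
    % obstacle rather than the target: keep avoiding instead of stopping.
end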
Acknowledgment
I am sincerely grateful to my supervisor, Dr. Denis Dang, and my co-supervisor, Andrew Martchenko, for their enthusiastic support and continuous involvement in my project: providing new ideas, holding group discussions, supplying resources, providing tutorials, and working on different ideas to finalize the project.
I would like to thank our laboratory manager, Mr. Peter Stewart, for his overall support and for providing me with good conditions for doing the project.
I would also like to thank my friend Mohsin ud din Pathan for helping me out.
Finally, I thank my family members for their cooperation while I worked on this project at home.
References
[1] http://en.wikibooks.org/wiki/Cyberbotics_Robot_Curriculum/E-puck_and_Webots
[2] M. Tkalcic and J. F. Tasic, "Colour spaces: perceptual, historical and applicational background," EUROCON 2003: Computer as a Tool, The IEEE Region 8, vol. 1, pp. 304-308, September 2003.
[3] http://www.youtube.com/watch?v=IzDET9j74tE
[4] http://www.youtube.com/user/SwarmLabMaastricht
[5] http://www.youtube.com/watch?v=GS4P62phwyQ
[6] A. Kamboj, K. Grewal and R. Mittal, "Colour edge detection in RGB colour space using automatic threshold detection," IJITEE, ISSN 2278-3075, vol. 1, no. 3, August 2012.
[7] S. Dutta and B. B. Chaudhuri, "A colour edge detection algorithm in RGB colour space," 2009 International Conference on Advances in Recent Technologies in Communication and Computing, pp. 337-340, 2009.
[8] L. Niu and W. Li, "Colour edge detection based on direction information measure," 6th World Congress on Intelligent Control and Automation, pp. 9533-9536, IEEE Press, China, 2006.
[9] Zohaib et al., "Control strategies for mobile robot with obstacle avoidance," 2013. Available at http://arxiv.org/ftp/arxiv/papers/1306/1306.1144.pdf
[10] J. Borenstein and Y. Koren, "The vector field histogram: fast obstacle avoidance for mobile robots," IEEE Transactions on Robotics and Automation, vol. 7, no. 3, June 1991.
[11] J. Hubert and Y. Weibel, "ePic2 v2.1 documentation," Laboratory of Intelligent Systems, Ecole Polytechnique Fédérale de Lausanne, Switzerland, November 5, 2008.
[12] A. Yufka and O. Parlaktuna, "Performance comparison of bug algorithms for mobile robots," Proceedings of the 5th International Advanced Technologies Symposium, Karabuk, Turkey, May 2009.
[13] http://www.mathworks.com.au/support/
