Robot Based 3D Scanning and Recognition of Workpieces: International Master's Thesis
Robin Reicher

© Robin Reicher, 2011
Acknowledgements
Firstly, I would like to thank my supervisor Prof. Ivan Kalaykov for his involvement throughout my master's thesis; his ideas and advice have been very useful. Dr. Boyko Iliev helped me understand the robotic manipulator, while Dr. Dimitar Dimitrov gave me the theoretical background needed to control it, and I am very thankful for their advice and understanding. Lastly, a quick thank you to my classmates, who have helped on many occasions with everything from math to encouragement; I very much appreciated your assistance.
Contents

1 Introduction
    1.1 Background
    1.2 Goals
    1.3 Thesis Outline
2 Methodology Related to the Project
3 System and Implementation
4 Experiments and Results
5 Conclusion
    5.1 Summary
    5.2 Future Improvements
A C/C++
B RAPID
C Matlab
D Manual
List of Figures

3.1 The general structure of the first two parts, recognition and scanning.
3.2 The IRB 140 in the standing position that we are working with.
3.3 Taking consecutive pictures while moving the camera.
3.4 The measuring process.
3.5 Movement of the scanner around a box workpiece.
3.6 Normals.
3.7 Different styles of splitting polygons.

List of Tables

3.1 Example of the database used.
3.2 The surfaceInfo matrix, used for inspection.
4.2 Result of classifying 7 objects using a support vector machine.
Chapter 1
Introduction
1.1 Background
Nearly every industry has quality-inspection procedures for its products, and these may differ depending on the product. Some companies use conventional manpower on an assembly line or in a dedicated quality-control booth, while others strive to use state-of-the-art inspection techniques. Much effort has been made to develop reliable, cost-efficient, precise solutions. A major reason companies do not get a contract renewed is a lack of product quality, especially for work awarded by government agencies¹, which makes quality control a vital step of the production process.
In this project we will look at different ways to solve the quality inspection
problem. A methodology for solving this type of problem will be proposed to-
gether with a demonstration of the implemented system.
We have chosen to use a micro-graph system to detect workpiece surface defects. This allows us to find not only irregularities in the workpiece shape but also defects on the surface itself. By keeping the correct distance and angle to the surface, reflections on the workpiece surface are kept to a minimum, significantly simplifying the inspection of chromium and other highly reflective surfaces.
The entire surface of the workpiece is to be inspected autonomously; it will therefore be held by a robotic manipulator, which sweeps the piece in front of a stationary camera. Micro-graph inspection has a very limited depth of field, so the distance between camera and workpiece needs to be held with millimeter precision. If the distance is not kept constant for the duration of the inspection, images become blurry and the surface could be mistaken for being smoother than it really is. In order to move the workpiece in front of the camera in an intelligent, efficient and precise way, some prior knowledge about the workpiece surface profile is needed.
We want to be able to inspect an unknown free-form workpiece without any prior knowledge about it.
¹ Quality Assurance Series, http://www.opm.gov/fedclass/gs1910.pdf
Scanning workpieces that have already been scanned is not desirable; therefore a third step, next to scanning and inspection, is necessary. The recognition step is performed before the two other steps and begins by capturing a series of pictures of the workpiece. With the help of these pictures, a classifier determines whether a piece has been scanned before or not. To make this feasible, a simple database structure holds the data needed for training the classifier. This database also holds all surface profiles, normals, images and other information about the workpiece.
Figure 1.2: Project general structure and how it is divided between the two
authors.
This thesis deals with the recognition and the scanning. The proposed methodology for these two steps is explained, together with an implementation of a working system. Inspection is investigated by Syed ZilleHussnain in his master's thesis "Robot-based Micrography of free-form Workpieces for industrial inspection" [1], which will be completed around the end of 2011.
1.2 Goals

The goal of this thesis is to implement a system capable of recognizing and scanning free-form workpieces without any prior knowledge about them. The goal is, however, not a product ready for industrial use. The project spans various fields, such as machine learning, image processing and mechatronics. The aim is both to find a good concept for a methodology and to implement and test the system.
1.3 Thesis Outline

Chapter 2 reviews methodology related to the project.
Chapter 3 goes through our idea, both conceptually and as it was implemented in C/C++, MATLAB and RAPID.
Chapter 4 presents the results we found when experimenting.
Chapter 5 contains our conclusions and some ideas for future improvements.
Chapter 2
Methodology Related to the Project
2.1 Scanning
A virtual representation of an object is, in many fields, very practical or even crucial. Therefore, many different ways of measuring an object's dimensions and shape have been proposed.
The Exascan 3-D scanner has been used in several projects before. Du Zhiqiang et al. scanned a complex Buddha sculpture in the article "Detail-preservation…".
Figure 2.2: Setup used in Rahayem and Kjellander's "Quadric segmentation and fitting of data captured by a laser profile scanner mounted on an industrial robot".
2.3 Summary

No papers could be found covering the entire process we describe in this project. An abundance of papers exists that addresses either recognition, scanning or inspection, but a sound methodology and implementation of a system that addresses all three problems does not exist today.
CMMs are better suited for inspection than for scanning, since they have problems with unknown objects as well as with scanning entire surfaces. A 3-D scanner suits our purpose better and has been tested in previous experiments with positive results [3].
The Exascanner is a good choice for this kind of project because of its focus on the 3-D model rather than on the scanning itself. Surprisingly, there are hardly any papers using this scanner; this thesis will hopefully contribute to filling that gap.
A turntable will not be used, as the physical workspace in front of the robot is too small to fit a turntable with a reference board. Also, the robot we will be using is flexible enough to scan the entire object from all sides, except the bottom.
Chapter 3

System and Implementation

The first step in our project is taking a few pictures consecutively. While taking them, the robotic manipulator moves the camera in a 90° arc around the workpiece; the motion starts from a top-down view of the workpiece and moves along the side until the manipulator base is reached. The focus of the camera remains steady on the workpiece. These pictures are temporarily saved to be used for calculating the dimensions.
For every picture, the histogram of oriented gradients [8] is calculated in order to make classification easier. More about this in section 3.3.
Figure 3.1: The general structure of the first two parts, recognition and scanning.
The pictures taken by the camera mounted on the manipulator (the photo-set) are used to calculate the dimensions of the workpiece. This estimation is rough but sufficient for the next task, which is to create the manipulator motion path for the scanning. This path has the shape of a spiral laid on top of a dome and is described in detail in section 3.5.
To be able to recognize the workpiece the next time, the classifier is retrained
with the new set of graphical primitives.
3.2 Hardware

Three different pieces of hardware are used: a camera, a robotic manipulator, and the previously mentioned Exascan 3-D scanner. The first two are available in the Örebro University Center for Applied Autonomous Sensor Systems (AASS) lab.
The camera chosen for recognition is the Marlin F033C FireWire camera by Allied Vision¹. This is a small camera made for an industrial environment and should suit our purposes perfectly. 640 × 480 monochrome pictures are used in this project, together with a 12.5 mm television lens.
The most important component of our system is the Exascan 3-D device, which is to produce the surface profile of the unknown workpiece. Unfortunately, the Exascanner did not arrive at the lab in time. This sadly makes the implementation incomplete, but since our concept is modular, no other part has been affected. For test purposes in the project, instead of doing an actual scan, a surface profile is provided as a .stl file created in advance (see section 3.5).
1 http://www.covistech.com/uploads/AVT_Datasheets_Product_Images_App_Notes/MARLIN_F_033B_C.pdf
Figure 3.2: The IRB 140 in the standing position that we are working with.
For computer algorithms, there are mainly five different challenges when dealing with this type of classification, all of which need to be addressed either by preprocessing the image and/or by training an appropriate classifier. One of them is knowing how the workpiece looks in 3-D space, and thus being able to rotate it to see whether it might fit. Modern vision classification algorithms have not found any good general solution to this, but combinations of 3-D models and machine learning are often used.
Classifying a workpiece based on vision alone is a task that has kept researchers busy for decades. The human brain is very good at this and can overcome most problems computer algorithms have. One of the biggest advantages of human classification is the ability to look at the bigger picture and to use prior knowledge about how things in general are. When we see a round workpiece lying on a table, we presume that it is a ball and not the round tip of a long pole. When we do not recognize something, we might pick up the workpiece and turn it a few times to find a feature we recognize; if the workpiece is too big, we simply walk around it.
This is exactly how we solve problem number five, rotation around axes other than the one between camera and workpiece.
A few pictures are taken consecutively in an almost 90° arc motion, to get many pictures of the same workpiece from different angles. This way, we can train the classifier to be more robust, so that it recognizes a workpiece even when it is rotated a few degrees.
In the special case of this thesis, all problems except scaling need to be addressed. This is because the distance between the workpiece and the camera will always be the same.
After much consideration and some basic testing, I decided that the classification part should be performed using the SVM technique. An SVM is trained for each object; this SVM determines whether a provided picture characterises the workpiece. Since we have more than one picture for each classification, the number of classifications is the number of pictures times the number of objects, plus one for the empty scene. To determine whether a classification was successful, a certainty variable is used. It is calculated by summing up the best object's classification results and dividing by the number of pictures taken. A value of one means the classifier recognized each image it was given; we will call this 100 % certainty.
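As a sketch, the certainty computation could look like this in MATLAB, assuming a matrix Pred of binary per-picture SVM outputs with one row per object and one column per picture (the variable names are illustrative; compare ClassifyObject.m in Appendix C):

    % Pred(o, n) = 1 if the SVM of object o accepted picture n, else 0
    perObject = sum(Pred, 2) / size(Pred, 2);  % fraction of pictures matched per object
    [certainty, id] = max(perObject);          % best object and its certainty
    if certainty < 0.8                         % below the required certainty:
        id = 0;                                % treat the workpiece as unknown
    end

A certainty of 1 then corresponds to the 100 % case described above.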
The measuring is made using the Marlin camera mounted on the robotic manipulator, and is done in a series of steps. First, a 10×10 median filter is used, together with an equally large Gaussian filter with sigma 2.0, to remove noise and most shadows (fig. 3.4). To get a binary image, we then apply a Canny edge detector with a threshold of 0.2. The last step is to fit all points into a rectangular box, which is done by a MATLAB function called minboundrect². minboundrect creates a box containing the workpiece, represented by four 2-D coordinates. These coordinates are used to calculate the dimensions of the piece in millimeters. This is definitely not a precise number: when measuring from above, the pixels-per-cm ratio used to translate pixels into millimeters is calculated from the table surface, so all workpieces higher than a lying piece of paper will be slightly miscalculated.
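As a rough sketch, the measuring chain could be expressed in MATLAB like this (minboundrect is the File Exchange function cited above; the image variable names are illustrative):

    Im = medfilt2(ImRaw, [10 10]);            % 10x10 median filter
    G  = fspecial('gaussian', [10 10], 2.0);  % Gaussian filter, sigma 2.0
    Im = imfilter(Im, G, 'symmetric');
    BW = edge(Im, 'canny', 0.2);              % binary edge image
    [yIdx, xIdx] = find(BW);                  % coordinates of all edge pixels
    [rx, ry] = minboundrect(xIdx, yIdx);      % 4 corners of the minimal bounding box
    % the corner coordinates are then scaled by a fixed pixels-per-cm ratio

The full version, including the side view used for the height, is GetDimensions.m in Appendix C.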
The same is true when measuring from the side in order to get the height; the workpiece is treated like a flat piece of paper standing up with its broad side against the camera. If we were to underestimate the workpiece size, this could have consequences, mainly hitting the object with the scanner. But since we always overestimate the width, depth and height of the workpiece, the rough estimation is acceptable.
2 http://www.mathworks.com/matlabcentral/fileexchange/13624-minimal-bounding-rectangle
(a) Raw image. (b) Image smoothed with median and Gaussian filters.
The height is measured in more or less the same way; the only difference is that the camera is closer to the table when the photo used for measuring is taken.
3.5 Scanning

The Exascan is designed to be as user-friendly as possible and produces a whole 3-D mesh in .stl format. To get around the problem of the absent scanner, we have replaced this step by simply providing a .stl file created in a CAD program. It is then possible to run experiments on the whole system, even though this is somewhat of a best-case scenario in which the scanner works flawlessly.
The motion of the scanner during the scan is important for achieving a high-quality scan. Several different ideas were discussed. One of the first was simply moving the scanner in a number of 180° arcs over the object. This would scan the object from above much more than from the sides, creating a scan of uneven quality, and the motions would also be unnecessarily jerky.
The motion path chosen for the scanner has the shape of a dome with a spiral wrapped around it (Fig. 3.5), where the scan starts at the top of the dome. This shape has two advantages.

• Regularity - the entire trajectory is executed without any interruptions or sharp motions, which is what the hand-held scanner is designed for.

• Homogeneity - no part of a generic object is scanned more than any other part.
r = a + bθ    (3.1)

z = b √(1 − (x/a)² − (y/a)²)    (3.2)
The trajectory and dome are flexible and can easily be adjusted to fit the scanner characteristics. Equations 3.1 and 3.2 are combined to create the shape; the complete implementation can be seen in Sec. C.
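A condensed sketch of how the two equations turn into trajectory points (the implementation in Appendix C, DomeSpiral.m, grows the spiral radius as (t/maxS)^(1/4) rather than linearly, and adds the minimum height and the extra bottom circle):

    a = radius; b = height;   % dome radius and height
    n = 0;
    for theta = 0:step:thetaMax
        n = n + 1;
        % spiral radius, eq. 3.1 with a = 0 and b chosen so the spiral
        % reaches the dome radius at thetaMax
        r = a * theta / thetaMax;
        P(n,1) = r * cos(theta);
        P(n,2) = r * sin(theta);
        P(n,3) = b * sqrt(1 - (P(n,1)/a)^2 - (P(n,2)/a)^2);  % dome height, eq. 3.2
    end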
The dome is twice the height and diameter of the measured workpiece, to ensure that the scanner does not hit the piece. There is also a minimum height of 200 mm; the dome's lower border ends at this level regardless of how high the workpiece is. If the workpiece is under 200 mm high, the trajectory becomes a flat spiral instead, to ensure safety. There are several ways to make the scan tighter.
The second technique is to make the facet size smaller than or equal to the viewing range. To achieve this, facets that are too big are split repeatedly until the distance between the center and the corners of the facet is smaller than the camera's viewing range. Several types of splits were tried out. The first one tested (Fig. 3.7a) divided each facet into four smaller ones of the same shape. This looked like a nice idea, but turned out to make some triangles unnecessarily small and narrow. The idea we finally settled on was to simply split the triangle in half (Fig. 3.7b): the longest side of the triangle is found and split at its midpoint. This way we get triangles that are as equilateral as possible, thus simplifying the inspection.
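A sketch of the chosen split for a single triangle T, given as three 3-D vertices in rows (compare SplitIt3.m in Appendix C; the names are illustrative):

    e = [norm(T(1,:) - T(2,:)), ...              % lengths of the three edges
         norm(T(2,:) - T(3,:)), ...
         norm(T(1,:) - T(3,:))];
    pairs = [1 2; 2 3; 1 3];                     % vertex pairs forming each edge
    [~, k] = max(e);                             % index of the longest edge
    mid = (T(pairs(k,1),:) + T(pairs(k,2),:)) / 2;  % its midpoint
    opp = setdiff(1:3, pairs(k,:));                 % the opposite vertex
    T1 = [T(pairs(k,1),:); mid; T(opp,:)];       % first half-triangle
    T2 = [T(pairs(k,2),:); mid; T(opp,:)];       % second half-triangle

Both halves are fed back into the loop until they are small enough for the camera's viewing range.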
(b) Splitting triangle polygons into two new ones, creating triangles that are as equilateral as possible.
Table 3.1: Example of the database used. N is the number of pictures taken for
classification.
x       y       z
x1      y1      z1
x2      y2      z2
x3      y3      z3
n1_x    n1_y    n1_z
n2_x    n2_y    n2_z

Table 3.2: The surfaceInfo matrix, used for inspection. (x1, y1, z1) through (x3, y3, z3) are the three vertices making up the face of the triangle, n1 is the centerpoint and n2 the point 1 mm away from it, along the normal.
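As a sketch, one such 5×3 entry could be built from a triangle's vertices V (3×3, one vertex per row) and its unit normal nv (the names are illustrative):

    c = mean(V, 1);                  % n1: centerpoint of the facet
    p = c + 1.0 * nv;                % n2: point 1 mm along the unit normal
    surfaceInfo(:,:,k) = [V; c; p];  % rows: three vertices, then n1 and n2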
Chapter 4
Experiments and Results
4.1 Overview
The lack of a scanner made the experiments a bit more fragmented than initially planned. The scanner has been replaced by a cardboard box of roughly the same size, and the 3-D mesh that should have been created by the scanner is replaced with a 3-D model created in advance in a CAD program.
The recognition part is working, but should definitely have more descriptors for empty or invalid scenes; right now the system has only been tested with a limited number of examples.
The reason we used a camera for some of the tasks was to get a fast system.
The height estimator works but is very rough. With a real scanner, there would also be a "reference board" under the workpiece that the scanner needs. This board would need to be subtracted from the images before any measuring is made, which could make the measuring process more inaccurate.
There are many ways to get around this; one is to attach the robot to the ceiling and put the workpieces underneath it (more about this in section 5.2). A simpler solution is to raise the minimum boundary for how low the dome can go. As can be seen in figure 3.2, the workspace is widest at a height of 352 mm. If all domes' lower borders were raised to this height, the domes could be bigger.
When first working with the robot, it was presumed that a few domes would be predefined and that new workpieces would be enclosed inside the smallest of the predefined domes. While experimenting with domes, we realized that this was not needed, and fully dynamic domes are now generated, unique for each object. There are no limits to these domes: if the user puts a workpiece that is too big in front of the robot, it will be measured, a dome will be created for it, and then the process will fail while trying to execute the trajectory. No hard boundary has been added, for two reasons. The first is that this implementation should work on any robotic manipulator with as few changes as possible. The second is that the limits of this robot's reach are blurry; they depend not only on the robot's position but also on the orientation of the TCP and especially on what tool is being used.
A transparent reference board could be used to put the workpieces on; the object would then be scanned from above and underneath. An algorithm to remove mesh artifacts created by the transparent reference board might be needed, together with a very dexterous robotic manipulator.
The results from the SVM are very different. The SVM's classification is binary: either it is the object or it is not. As we can see in table 4.2, the SVM classified the objects much more accurately. The only object not classified correctly was the battery. Besides that, there were only three incorrect classifications. Just like with the ANN, we did not spend very much time configuring the SVM, and did not investigate it further.
Table 4.2: Result of classifying 7 objects using a support vector machine. The optimal result would have been a diagonal line (gray) of ones only.
4.5 Main Experiment

The first object to scan was the battery (Fig. 4.3). It was classified as the empty scene with 100 % certainty, since the empty scene was the only thing the support vector machine (SVM) classifier had been trained for. The classifier retrains all the old networks and adds a new one for every object scanned. This means that the first few scans will most probably be bad, but as the SVM gets more "bad examples" from other objects, it will get better. We disregarded that the object was classified as the empty scene and manually changed the "certainty" parameter to 0 %, making the program scan the battery.
When looking at the pictures taken of the battery, we realized they were too dark, so we added an industrial light (left in Fig. 4.3) to brighten them up. We took new pictures and retrained the SVM for the empty scene and the battery.
The next object to be scanned was a small lamp; it was also classified as the empty scene and had to be added manually.
Object number four was a small circuit board. The best classification was 70 % for the empty scene; since we required at least 80 %, the object was defined as unknown and was properly trained for and added to the database.
The small lamp was put in the classification area again, in roughly the same position as last time, and was classified as the small lamp with 80 % certainty.
Our next object was a small black cog. As we can see in figure 4.4, the part of the program responsible for measuring objects failed. The combination of a brighter scene than usual and a very dark workpiece made the program misjudge the size of the cog. This had worked very well on every object up until now (Fig. 4.5). The cog was also recognized as the empty scene, probably because of its dark color.
A small rubber was the second-to-last object; it too was classified as the empty scene. The size of the objects probably matters: smaller objects have features that are harder to spot, and therefore they are harder to classify.
The last object was a lamp of the same shape as the previous lamp, but about double the size. The best classification was 30 %, for the small cog.
After all the objects had been scanned, we tried to classify all of the objects again. When the objects were put back in the same pose as initially trained for, we had about a 70-80 % success rate. The position of the object had little influence on the results. The rotation mattered much more, which was unexpected, since the HOG descriptors should be rotation-invariant. The reason for this is probably that we take pictures from 10 different angles, making the object look very different when rotated.
To test the surface-modification part of the code, a lot of meshes were tried. If the meshes were big and complex and the camera view range was set to a low value, a couple of minutes of processing were needed. Three of these tests can be seen in figures 4.6a-4.6b.
(a) A flat spiral is created for objects with height < 200 mm.
(a) ONE.stl before surface preparation. (b) ONE.stl after surface preparation with
camera maximum view-range of 1 mm.
(a) TWO.stl before surface preparation. (b) TWO.stl after surface preparation with
camera maximum view-range of 0.01 mm.
(a) THREE.stl before surface preparation. (b) THREE.stl after surface preparation
with camera maximum view-range of 2 mm.
Chapter 5
Conclusion
The main goal of this thesis was to create and test a feasible methodology for scanning and recognizing workpieces. This has been achieved, and the surface profile produced, which is needed for the surface analysis, is suitable for the application in mind.
5.1 Summary

Main contributions:

• Generate a suitable robot motion path to enable scanning by a 3-D scanner.

• Define an algorithm for splitting polygons into manageable pieces for computing the surface normals needed for the surface micro scan.

• Design a database prototype that can hold the necessary data for inspecting free-form objects.
The structure of the code is made to be robust and flexible; every block in the flow-chart diagram (fig. 1.2) in section 3.1 is replaceable.
The robot motion created for the scanner works as intended, which was one of the more important objectives of this thesis. The code should work on all ABB robots using RAPID and automatically scales the domes to fit the workpiece size. The domes will probably need to be fine-tuned further when tested with an actual 3-D scanner, to take its real dimensions into account. The code is well commented, and it is easy to modify a few parameters in order to adjust the spiral-dome, as sketched below.
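For reference, the spiral-dome is generated by DomeSpiral (Sec. C), and its arguments are the natural tuning knobs (the comments are inferred from the code in the appendix):

    % height, radius: dome size in mm, derived from the measured workpiece
    % dist:           controls how quickly the spiral winds around the dome
    % step:           angular step between consecutive trajectory points
    P = DomeSpiral(height, radius, dist, step);

As used in PhaseOne.m, the call is DomeSpiral(dz, max(dx, dy), 1, pi/3).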
5.2 Future Improvements

• A bigger robot, to get a bigger workspace, or attaching the IRB 400 to the ceiling (fig. 5.1). If mounted properly at the correct height, the workspace will be not only bigger but also more homogeneous around the workpiece.

• A deformable spiral-shaped dome in the xy-plane. If the dome fit the object better, this would improve the scan quality. This requires a more advanced measuring system, together with some changes to the function "CreateSpiralDome" (Sec. C).

• 3-D object recognition using the 3-D scanner together with 2-D images. This would literally add another dimension to the recognition, making it much more reliable.

• Create a more robust database structure, designed for more objects and with better error handling.

• Unite the different parts of the system better by rewriting the code in C/C++ only. This would also make the recognition part faster and easier to change.
Appendix A

C/C++

The code written in C/C++ controls the IRB 140. The first file, abb_advanced.cpp, is a short piece of code that tells the robot to execute the commands on the FTP server.
abb_advanced.cpp
#include <iostream>
#include <stdio.h>
#include <string.h>
#include <time.h>
#include <unistd.h>   // for sleep()
#include "sockABB.hh"

using namespace std;

sockABB abbClient;    // socket client (declaration not shown in the original listing)

int main() {
    // run code on server
    char runstuff[256] = "[0,0,0,0,0,0]5";

    // connect with a server
    abbClient.connectSock();
    abbClient.writeSock(runstuff);
    sleep(1);

    // close the connection with the server
    abbClient.closeSock();
}
sockABB.cpp
# i n c l u d e "sockABB . hh"
sockABB : : ~ sockABB ( ) {
i f ( s o c k f d >=0) {
c l o s e ( sockfd ) ;
}
};
v o i d sockABB : : e r r o r S o c k ( c o n s t c h a r * msg ) {
p e r r o r ( msg ) ;
exit (0);
}
v o i d sockABB : : c o n n e c t S o c k ( ) {
i n t portno = 1300;
s t r u c t sockaddr_in serv_addr ;
struct hostent * server ;
57
b z e r o ( ( c h a r * ) &s e r v _ a d d r , s i z e o f ( s e r v _ a d d r ) ) ;
s e r v _ a d d r . s i n _ f a m i l y = AF_INET ;
bcopy ( ( c h a r * ) s e r v e r −>h_addr ,
( c h a r *)& s e r v _ a d d r . s i n _ a d d r . s_addr ,
s e r v e r −>h _ l e n g t h ) ;
s e r v _ a d d r . s i n _ p o r t = h t o n s ( portno ) ;
i f ( c o n n e c t ( sockfd , ( s t r u c t sockaddr *)&s e r v _ a d d r ,
s i z e o f ( serv_addr ) ) < 0) {
e r r o r S o c k ( "ERROR sockABBconnect : c o n n e c t i o n " ) ;
}
c o u t << " sockABBconnect : Connection t o i r b 1 4 0 e s t a b l i s h e d . "
<< e n d l ;
} e l s e i f ( sockfd < 0) {
e r r o r S o c k ( "ERROR sockABBconnect : opening s o c k e t : " ) ;
s o c k f d = −1;
}
}
/ * * \ b r i e f Write t o ABB s o c k e t .
*/
v o i d sockABB : : w r i t e S o c k ( c h a r * pos ) {
i f ( s o c k f d >= 0 ) {
bzero ( buffer , 2 5 6 ) ;
s t r c p y ( b u f f e r , pos ) ;
n = w r i t e ( sockfd , b u f f e r , s t r l e n ( b u f f e r ) ) ;
/* i f ( verbose ) {
c o u t << "sockABBwrite : command : " << pos << e n d l ;
} */
i f ( n < 0) {
e r r o r S o c k ( "ERROR sockABBwrite : " ) ;
}
} else {
c o u t << "ERROR sockABBwrite : can ’ t w r i t e t o i n v a l i d d e s c r i p t o r "
<< e n d l ;
58 APPENDIX A. C/C++
}
}
msgOut−> a s s i g n ( b u f f e r O u t ) ;
/* i f ( verbose ) {
c o u t << "sockABBread : message : " << * msgOut << e n d l ;
} */
} else {
c o u t << "ERROR sockABBread : can ’ t r e a d from i n v a l i d d e s c r i p t o r "
<< e n d l ;
}
}
v o i d sockABB : : c l o s e S o c k ( ) {
c h a r closeMsg [ 2 5 6 ] = " [ C l o s e . ] 0 " ;
int n;
s t r i n g tmp ;
/* i f ( verbose ) {
c o u t << " sockABBclose : C l o s i n g s o c k e t . . . " << e n d l ;
} */
n = w r i t e ( sockfd , closeMsg , s t r l e n ( closeMsg ) ) ;
i f ( n < 0) {
e r r o r S o c k ( "ERROR c l o s i n g t h e s e r v e r ( w r i t i n g t o s o c k e t ) " ) ;
}
/ / t h i s −>readSock(&tmp ) ;
c l o s e ( sockfd ) ;
c o u t << " sockABBclose : Connection t o i r b 1 4 0 c l o s e d . " << e n d l ;
}
w r i t e S o c k ( pos ) ;
readSock ( msgOut ) ;
/ / s t r c p y ( msgOut , msg . c _ s t r ( ) ) ;
/* i f ( verbose ) {
c o u t << "rwSock : Read / Write done . " << e n d l ;
} */
}
}
Appendix B
RAPID
All RAPID code is generated by MATLAB functions and takes several parameters. The code shown below is therefore an example, showing how this generated code could look.
Arc Trajectory
%%%
VERSION: 1.5
LANGUAGE: ENGLISH
%%%

MODULE mod_testcode
    PROC movee()
        CONST robtarget StartPoint :=
            [[515.000000, 0.000000, 200.000000], [0.707107, 0.000000, 0.707107, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget CirclePoint :=
            [[515.000000, 70.710678, 170.710678], [0.693520, -0.137950, 0.693520, -0.137950],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget TargetPoint :=
            [[515.000000, 100.000000, 100.000000], [0.653281, -0.270598, 0.653281, -0.270598],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST jointtarget INIT :=
            [[0.000000, 0.000000, 0.000000, 0.000000, 0.000000, 0.000000],
             [0, 9E9, 9E9, 9E9, 9E9, 9E9]];

        ConfL \Off;
        MoveJ StartPoint, v1000 \T:=2, z10, tool0;
        MoveC CirclePoint, TargetPoint, v1000 \T:=5, z10, tool0;
        MoveAbsJ INIT, v200, fine, tool0;
    ENDPROC
ENDMODULE
Scanner Trajectory
%%%
VERSION: 1.5
LANGUAGE: ENGLISH
%%%

MODULE mod_testcode
    PROC movee()
        CONST robtarget Target100 :=
            [[515.000000, 0.000000, 240.000000], [0.707107, 0.000000, 0.707107, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target101 :=
            [[538.917714, 41.426695, 235.126678], [0.668129, -0.061350, 0.738970, -0.061350],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target102 :=
            [[486.556885, 49.264920, 232.897311], [0.744288, -0.073176, 0.659792, -0.073176],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target103 :=
            [[452.045038, 0.000000, 231.078411], [0.794625, -0.000000, 0.607100, -0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target104 :=
            [[481.175245, -58.586194, 229.457885], [0.750363, 0.087483, 0.649345, 0.087483],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target105 :=
            [[550.765324, -61.947358, 227.952435], [0.645406, 0.092742, 0.752495, 0.092742],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target106 :=
            [[589.866489, -0.000000, 226.517945], [0.585741, 0.000000, 0.810498, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target107 :=
            [[553.903992, 67.383691, 225.126616], [0.638787, -0.101419, 0.755896, -0.101419],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target108 :=
            [[474.775361, 69.671119, 223.758497], [0.757315, -0.105155, 0.635892, -0.105155],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target109 :=
            [[432.146610, 0.000000, 222.397556], [0.821312, -0.000000, 0.570479, -0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target110 :=
            [[472.467622, -73.668239, 221.029455], [0.759791, 0.111856, 0.630630, 0.111856],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target111 :=
            [[558.557990, -75.444652, 219.639984], [0.628187, 0.114930, 0.760896, 0.114930],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target112 :=
            [[604.031762, -0.000000, 218.213601], [0.557777, 0.000000, 0.829991, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target113 :=
            [[560.415646, 78.662207, 216.731653], [0.623536, -0.120717, 0.762927, -0.120717],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target114 :=
            [[468.735096, 80.133165, 215.169625], [0.763879, -0.123497, 0.621277, -0.123497],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target115 :=
            [[420.860374, 0.000000, 213.492106], [0.837696, -0.000000, 0.546137, -0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target116 :=
            [[467.164573, -82.853390, 211.641908], [0.765723, 0.129017, 0.616747, 0.129017],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target117 :=
            [[563.565950, -84.118693, 209.511205], [0.614374, 0.131879, 0.766655, 0.131879],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target118 :=
            [[613.529841, -0.000000, 206.833686], [0.533822, 0.000000, 0.845597, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target119 :=
            [[615.000000, -0.000000, 200.000000], [0.525731, 0.000000, 0.850651, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target120 :=
            [[565.000000, 86.602540, 200.000000], [0.606961, -0.140694, 0.769421, -0.140694],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target121 :=
            [[465.000000, 86.602540, 200.000000], [0.769421, -0.140694, 0.606961, -0.140694],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target122 :=
            [[415.000000, 0.000000, 200.000000], [0.850651, -0.000000, 0.525731, -0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target123 :=
            [[465.000000, -86.602540, 200.000000], [0.769421, 0.140694, 0.606961, 0.140694],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target124 :=
            [[565.000000, -86.602540, 200.000000], [0.606961, 0.140694, 0.769421, 0.140694],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST robtarget Target125 :=
            [[615.000000, -0.000000, 200.000000], [0.525731, 0.000000, 0.850651, 0.000000],
             [0, 0, 0, 0], [0, 9E9, 9E9, 9E9, 9E9, 9E9]];
        CONST jointtarget INIT :=
            [[0.000000, 0.000000, 0.000000, 0.000000, 0.000000, 0.000000],
             [0, 9E9, 9E9, 9E9, 9E9, 9E9]];

        ConfL \Off;
        ! (the MoveJ/MoveL instructions through the targets are not shown
        !  in this listing)
    ENDPROC
ENDMODULE
Appendix C
Matlab
OnlyMain.m
clc;
clear;

% (the call to PhaseOne, which produces Found, Tag, dx, dy and dz,
%  is not shown in this listing)
if ~Found
    PhaseTwo(Tag, dx, dy, dz);
end
PhaseOne.m
function [Found Tag dx dy dz] = PhaseOne()
% Get pictures
arc_t = 10;  % time it takes for the arc
n_pic = 10;  % number of pictures to take (minimum 1)
CreateArcTrajectory(350, 100, arc_t);
system('./ftp_abb.sh');
system('./RunStuff');
pause(0.5);
for n = 1:1:n_pic,
    [Im(:,:,n) dt] = TakePicture();
    pause((arc_t / n_pic) - dt);
end

% make descriptors
for n = 1:1:length(Im(1,1,:)),
    H(:,n) = HOG(Im(:,:,n));
end

% get dimensions
[dx, dy, dz] = GetDimensions(Im(:,:,1), Im(:,:,n_pic), 0.2, 'plot');  % Top

id = ClassifyObject(H);
% if not found:
if (id == 0)
    fprintf('New object\n');
    Name = input('Name: ', 's');
    Tag = input('Tag: ', 's');
    Found = 0;

    P = DomeSpiral(dz, max(dx, dy), 1, pi/3);
    P(:,1) = P(:,1) + 515;
    plot3(P(:,1), P(:,2), P(:,3))
    Q = GetQuadToC(P);
    Speed = 100;
    Zone = 20;
    % Maximum radius of Camera
    CamMax = 0.5;
    % (parts of this function are not shown in this listing)
    figure;
    patch(x, y, z, 'r');
    axis equal;
    load DB.mat;
    num = length(DB);
    SurfaceInfo = zeros(5, 3, length(x(1,:)));
end
figure;
patch(x, y, z, 'g');
axis equal;
end
CreateArcTrajectory.m
function [] = CreateArcTrajectory(radius, safety, varargin)
% Creates an arc ending at height <safety> and having radius <radius>.
% Optional parameters are time of movement and zone of movement.
% There is no output, but a trajectory.MOD file will be created in the
% working directory. The time is, however, always +2 seconds, to get to
% the starting position of the arc.
optargin = size(varargin, 2);
if optargin == 0
    Time = 5;
    Zone = 10;
elseif optargin == 1
    Time = varargin{1};
    Zone = 10;
elseif optargin == 2
    Time = varargin{1};
    Zone = varargin{2};
end

Rot = makehgtform('xrotate', -pi/4);

% open the file with write permission
fid = fopen('trajectory.MOD', 'w');

% Initial position
fprintf(fid, ['CONST jointtarget INIT := [[0.000000, 0.000000, ' ...
    '0.000000, 0.000000, 0.000000, 0.000000], ' ...
    '[0, 9E9, 9E9, 9E9, 9E9, 9E9]];']);
% (generation of the arc targets themselves is not shown in this listing)

fclose(fid);
% view the contents of the file
type trajectory.MOD
end
GetQuadToC.m
function [Q] = GetQuadToC(pos)
[x y] = size(pos);
if y > 3
    pos(:,4) = [];  % removes potential scale-thingy
end
for i = 1:x,
    ROT = makehgtform('yrotate', pi/2);  % initial robot rotation matrix
    % Transform to quaternions
    q1 = (1/2) * sqrt(ROT(1,1) + ROT(2,2) + ROT(3,3) + 1);
    q2 = (1/2) * (sign(ROT(3,2) - ROT(2,3)) * sqrt(ROT(1,1) - ROT(2,2) - ROT(3,3) + 1));
    q3 = (1/2) * (sign(ROT(1,3) - ROT(3,1)) * sqrt(ROT(2,2) - ROT(1,1) - ROT(3,3) + 1));
    q4 = (1/2) * (sign(ROT(2,1) - ROT(1,2)) * sqrt(ROT(3,3) - ROT(1,1) - ROT(2,2) + 1));
    Q(i,1:4) = [q1 q2 q3 q4];
    Q(i,1:4) = Q(i,1:4) / norm(Q(i,1:4));
end
end
TakePicture.m
function [Picture Time] = TakePicture()
tic;
grab_gray_image();  % (the assignment to Picture is not shown in this listing)
Time = toc;
end
FixSurface2.m
function [OutX OutY OutZ OutN] = FixSurface2(InX, InY, InZ, InN, CamMax)
T_N = length(InX(1,:));
Triangles = zeros(3, 4, T_N);
Distances = zeros(1, 3);
Check_Triangles = 0;
OutN = zeros(3, 2);
for t = 1:T_N,
    Triangles(:,:,t) = [InX(:,t) InY(:,t) InZ(:,t) InN(:,t)];
end
% (the main loop that checks facet sizes and calls the split is only
%  partially shown in this listing)
        Triangles(:,:,T_N+1:T_N+2) = T;
        continue;
    end
    Check_Triangles = Check_Triangles + 1;
end
end
SplitIt3.m
function [Tout] = SplitIt3(T)
SidesLength = zeros(1, 3);
Lines = zeros(1, 3);
Lines(1,:) = T(1,1:3) - T(2,1:3);
Lines(2,:) = T(2,1:3) - T(3,1:3);
Lines(3,:) = T(1,1:3) - T(3,1:3);
SidesLength(1) = sqrt(sum(Lines(1,:).^2));
SidesLength(2) = sqrt(sum(Lines(2,:).^2));
SidesLength(3) = sqrt(sum(Lines(3,:).^2));
% (selection of the longest side, its SplitPoint and the index
%  oppositePoint is not shown in this listing)
otherTwo = [1 2 3];
otherTwo(otherTwo == oppositePoint) = [];
oppositePoint = T(oppositePoint, 1:3);
%% Creating the TWO triangles
x = [T(otherTwo(1),1)  T(otherTwo(2),1);
     SplitPoint(1)     SplitPoint(1);
     oppositePoint(1)  oppositePoint(1)];
y = [T(otherTwo(1),2)  T(otherTwo(2),2);
     SplitPoint(2)     SplitPoint(2);
     oppositePoint(2)  oppositePoint(2)];
z = [T(otherTwo(1),3)  T(otherTwo(2),3);
     SplitPoint(3)     SplitPoint(3);
     oppositePoint(3)  oppositePoint(3)];
for t = 1:2,
    Tout(:,1,t) = x(:,t);
    Tout(:,2,t) = y(:,t);
    Tout(:,3,t) = z(:,t);
    Tout(:,4,t) = T(:,4);
end
end
GetDimensions.m
function [Dx Dy Dz] = GetDimensions(ImOldTOP, ImOldSIDE, s, varargin)
optargin = size(varargin, 2);
[Sx, Sy] = size(ImOldTOP);

% X and Y
ImTOP = medfilt2(ImOldTOP, [10 10]);
G = fspecial('gaussian', [10 10], 2);
ImTOP = imfilter(ImOldTOP, G, 'symmetric');
ImTOP = edge(ImTOP, 'canny', s);  % also good since it speeds up

% Z
% (the corresponding filtering of ImOldSIDE into ImSIDE is not shown
%  in this listing)
xVT = [];
yVT = [];
xVS = [];
yVS = [];
% Below part possible to make faster?
for x = 1:Sx,
    for y = 1:Sy,
        if ImTOP(x,y) > 0
            xVT(length(xVT) + 1) = x;
            yVT(length(yVT) + 1) = y;
        end
        if ImSIDE(x,y) > 0
            xVS(length(xVS) + 1) = x;
            yVS(length(yVS) + 1) = y;
        end
    end
end
if length(xVT) == 0 || length(xVS) == 0
    fprintf('No object found to measure\n');
    return;
end

% Pixel/cm ratio (TOP) (specific to the initial height)
PcRT = 0.31;
% Pixel/cm ratio (SIDE) (specific to the initial height)
PcRS = 0.08;

% (computation of the bounding boxes and of Dx, Dy, Dz is not shown
%  in this listing)
[Dx Dy Dz]
if size(varargin, 2) && strcmp('plot', varargin(1))
    plot(ryT, rxT, 'r', 'LineWidth', 4);
end
end
ClassifyObject.m
function [id] = ClassifyObject(H)
PicNum = length(H(1,:));
% Do we have a network?
if exist('DB.mat')  % We have!
    load 'DB.mat';
    DBL = length(DB);
    Pred = zeros(1, 10);
    for o = 1:length(DB),
        for n = 1:length(H(1,:)),
            Pred(o,n) = svmclassify(DB(o).SVM, H(:,n)');
        end
    end
    % (the certainty check and the selection of id are only partially
    %  shown in this listing)
    DB(DBL).Name = [];
    DB(DBL).Tag = [];
    DB(DBL).Hog = H;

    TrainSet = zeros(81, PicNum * DBL);
    for t = 0:DBL-1,
        TrainSet(:, (PicNum*t)+1:(PicNum*t)+PicNum) = DB(t+1).Hog;
    end
    % Retrain / train networks
    for n = 0:DBL-1,
        Targets = zeros(1, PicNum * DBL);
        Targets((n*PicNum)+1:(n*PicNum)+PicNum) = ones(1, PicNum);
        DB(n+1).SVM = svmtrain(TrainSet', Targets);
    end
    save('DB.mat', 'DB');
else
    TrainSet = H;
    Targets = ones(1, PicNum);
    DB(1).SVM = svmtrain(TrainSet', Targets');
    DB(1).Name = [];
    DB(1).Tag = [];
    DB(1).Hog = H;
    save('DB.mat', 'DB');
    id = 0;
end
end
AddToDB.m
function [error] = AddToDB(Name, Tag, Hog, Img)
cd /home/regen/Dropbox/matlab/Real2
% Loads old database if there exists one
if exist('DB.mat')
    load('DB.mat');
    n = length(DB);
else
    DB = [];
    save('DB.mat', 'DB');
    n = 0;
end
pause(3);
% check to see if the object isn't already in the DB
for m = 1:n,
    if strcmp(Name, DB(m).Name) || strcmp(Tag, DB(m).Tag)
        fprintf('Already in the Database, aborting.\n');
        error = 1;
        return;
    end
end
DB(n).id = n;
DB(n).Name = Name;
DB(n).Tag = Tag;
DB(n).Mesh = ['mesh/' Tag 'MESH.stl'];
fprintf('Successfully added object to Database\n');
error = 0;
end
DomeSpiral.m
function [P] = DomeSpiral(height, radius, dist, step)
% DomeSpiral
% minimum distance to table
minH = 200;
% Controlling size of sphere-spiral
if radius > MaxR  % (MaxR is defined elsewhere; not shown in this listing)
    radius = MaxR;
end
% initial values
P = [0 0 max(minH, height)];
% Number of rings
maxS = 20;
% Creating the spiral-shaped dome
for t = 0:step:maxS-step,
    n = length(P(:,1)) + 1;
    % Cartesian coords
    % the ^(1/4) needs to be investigated...
    P(n,1) = (t/maxS)^(1/4) * radius * cos(t/dist);
    P(n,2) = (t/maxS)^(1/4) * radius * sin(t/dist);
    if height > minH
        P(n,3) = (height - minH) * sqrt(1 - (P(n,1)/radius)^2 ...
            - (P(n,2)/radius)^2) + minH;
    else
        P(n,3) = minH;
    end
end
% Creating the extra circle around the bottom.
tmp = t/dist;  % just to make the circle start at the place
               % the spiral ended at
for o = t:step:2*pi + tmp,
    n = length(P(:,1)) + 1;
    P(n,1) = radius * cos(o);
    P(n,2) = radius * sin(o);
    P(n,3) = minH;
end
P(1,:) = [];
end
Appendix D
Manual
References