Midterm Implementation

The document outlines the 10-step process used to implement a midterm robotics project. The robot is designed to follow a black line, circumnavigate an obstacle when it encounters one, and stop on a green square. It uses a light sensor to follow the line, a touch sensor to detect the obstacle, and an ultrasonic sensor to navigate around it. The design was tested and refined, with the ultrasonic sensor ultimately placed toward the front of the robot for more accurate readings. Team members contributed to building, coding, and debugging the robot.


Midterm Project Implementation

1. Clarify Objectives
The robot must be designed to follow the black line, proceed clockwise to the designated end point (the green square), and stop on the green square when it reaches it. The robot must also successfully circumnavigate the solid object that blocks the black line at some point on the track, and after navigating around this obstacle it must reacquire the line and proceed to the green square.

2. Establish User Requirements


The user sets the robot down on the course so that the downward-pointing light sensor is on the white space just to the right of the black line, then presses the Run button. No further interaction is needed.

3. Identify Constraints
We were constrained to use one code file and the components supplied in the Lego kit, and the coding had to be done in the BricxCC Command Center (the NQC programming language).

4. Establish Functions
The robot must be able to follow the black line throughout the course. It must detect the obstacle about halfway through the course, circumnavigate it a total of two times, then reacquire the black line and continue following it. It must detect the green square, stop on it, and play a tone. The robot must also complete the whole course at least once.

5. Establish Design Specifications

1 Lego light sensor
1 Lego ultrasonic sensor
1 Lego touch sensor
1 assembled Lego tribot

The robot must have a means of following the black line all the way around the track using the light sensor. It must also be able to detect the obstacle and navigate around it; this is done using the touch sensor (to detect the obstacle) and the ultrasonic sensor (to navigate around it). At the end, the robot must also detect the green square and stop when it reaches it; this is done with the light sensor again, but with a different threshold range. The motors are used to turn the robot left and right and to drive it forward, and the built-in speaker is used to play the sounds.
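As a rough sketch of how these specifications map onto NQC code, the light-sensor logic might look like the fragment below. This is illustrative only, not our final code: the real linefollow task uses dynamic thresholds, and the steering here is simplified to full on/off motor commands.

```c
// Simplified NQC sketch of the light-sensor logic (illustrative, not the
// team's final code): readings at or below TOO_DARK mean the black line,
// readings at or above TOO_BRIGHT mean white space, and a mid-range
// reading is treated here as the green square.
#define TOO_DARK   26
#define TOO_BRIGHT 64
#define EYE        SENSOR_2

task linefollow()
{
    while (true)
    {
        if (EYE <= TOO_DARK)            // on the line: steer back right
        {
            OnFwd(OUT_A);
            Off(OUT_C);
        }
        else if (EYE >= TOO_BRIGHT)     // on white space: steer left
        {
            OnFwd(OUT_C);
            Off(OUT_A);
        }
        else                            // in-between reading: green square
        {
            Off(OUT_A + OUT_C);         // stop on the square
            PlayTone(440, 100);         // play a tone (duration in 1/100 s)
            Wait(100);                  // let the tone finish
            break;                      // exit the loop; the task ends
        }
    }
}
```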

6. Generate Alternatives
When first starting the project, most of the team thought that two touch sensors should be used: one to detect the bump on the front of the bot, and another on the side of the bot for navigating around the obstacle. Most of the team favored this because they had used touch sensors in hw4, but that soon changed to our current bot, which uses an ultrasonic sensor. This construction was chosen because with an ultrasonic sensor the robot does not have to repeatedly bump into the obstacle in order to navigate around it; it can use distance readings instead. Another downside to using two touch sensors is that if the ports were mixed up when writing the code, the robot's behavior would be completely distorted and it would not perform as intended.
When first starting out with the ultrasonic sensor, it was mounted horizontally to the ground, but because the object to be avoided was cylindrical, it made more sense to mount the sensor perpendicular to the ground so that both of its measuring points would take readings on the same plane, i.e., the same part of the obstacle.
Lastly, when the robot would detect the object and then turn (in order to pick up readings), the ultrasonic sensor had to be moved toward the front of the bot so that it would pick up readings more accurately and not miss any, again because the object is curved. After all these changes were applied, the robot reached its current state, which allowed it to run the whole track successfully.

7. Model/Analyze Design
See Midterm Flowchart

Tasks

main(): Sets the sensors and the ports they are in, starts the robot off by telling it to move forward, and starts the task linefollow first and then the task bump.

linefollow(): Allows the robot to follow the line by using dynamic thresholds and straddling the line from the right side. It also has an exception (else if) for when the robot encounters the green square, stopping the robot when it picks up a reading in between that range of thresholds.

wallfollow(): Allows the robot to move around the obstacle using the ultrasonic sensor, straddling it using the threshold NEAR.

countimes(): Sets up a variable and adds one to it every time the robot passes over the black line while going around the object. When the variable equals five (which means it went around the object two times), this task stops and the task linefollow starts again.

bump(): Causes the robot to proceed to the tasks wallfollow and countimes, starting those tasks when the touch sensor is activated.
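The control flow described in the table above can be sketched in NQC roughly as follows. The task bodies are illustrative stubs, not our actual code; the sensor math and turning details are omitted, and main comes last so that every task is defined before it is started.

```c
// Hedged sketch of how the five tasks might be wired together in NQC.
// Task names follow the table above; bodies are stubs.

int ANDY;                        // line-crossing counter used by countimes

task linefollow()
{
    while (true)
    {
        // ...steer using the TOO_DARK / TOO_BRIGHT thresholds (step 5)...
    }
}

task wallfollow()
{
    while (true)
    {
        // ...straddle the obstacle using the NEAR distance threshold...
    }
}

task countimes()
{
    ANDY = 0;
    while (ANDY < 5)
    {
        // ...add 1 to ANDY each time the light sensor crosses the line...
    }
    stop wallfollow;             // two laps done: resume line following
    start linefollow;
}

task bump()
{
    until (SENSOR_1 == 1);       // wait for the touch sensor to be pressed
    stop linefollow;
    start wallfollow;
    start countimes;
}

task main()
{
    SetSensor(SENSOR_1, SENSOR_TOUCH);   // bump detection
    SetSensor(SENSOR_2, SENSOR_LIGHT);   // line following / green square
    OnFwd(OUT_A + OUT_C);                // start moving forward
    start linefollow;
    start bump;
}
```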

Preprocessor

#define NEAR 8: Sets the constant NEAR to a value of 8, used in the task wallfollow to know when the robot is too close to or too far from the obstacle.

#define TOO_DARK 26: Sets the constant TOO_DARK to a value of 26, used in the task linefollow to straddle the line and know when the robot is on the black line and should follow it.

#define TOO_BRIGHT 64: Sets the constant TOO_BRIGHT to a value of 64, used in the task linefollow so that the robot knows when it is on white space and needs to refind the line.

#define LEFT OUT_C: Names motor C as LEFT so it can be used anywhere in the code where OUT_C would appear.

#define RIGHT OUT_A: Names motor A as RIGHT so it can be used anywhere in the code where OUT_A would appear.

#define EYE SENSOR_2: Refers to the sensor in port two as EYE instead of SENSOR_2, to make it clearer that we are using the light sensor.

Variables

int ANDY: Initially set to zero; a value of one is added to it every time the robot picks up the black line while circumnavigating the obstacle, and it is later used to indicate when to start linefollow again (ANDY == 5).
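A counter like this needs to increment only once per crossing, not once per sensor reading. One way to do that (a sketch reusing the EYE and TOO_DARK names from the preprocessor table; not necessarily how our code does it) is to wait for the sensor to leave the line again before the next count:

```c
// Illustrative NQC fragment (would live inside the countimes task):
// count each black-line crossing exactly once by waiting for the line
// to be left again before counting the next crossing.
ANDY = 0;
while (ANDY < 5)
{
    until (EYE <= TOO_DARK);    // wait until the black line is seen
    ANDY = ANDY + 1;            // count this crossing once
    until (EYE > TOO_DARK);     // wait until the robot is off the line
}
```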

8. Test and Evaluate Design


See Testing Table

9. Refine and Optimize Design


When we first started, we had a nearly perfect design with our ultrasonic sensor. The only things we changed later were the placement of the ultrasonic sensor, which was moved toward the front of the robot to obtain more accurate readings from the obstacle, and its orientation, which was changed from horizontal to perpendicular to the floor, again to get more accurate readings of how far the robot was from the obstacle.
See Testing Table

10. Document Design

See photos

Team Member Contribution


Caleb Cummings: Did almost all of the building of the robot and its extensions (ultrasonic, touch, and light sensors), and helped with writing and debugging the code.
Andrea Dumitrescu: Did all of the write-up/10-step implementation, and helped with writing and debugging the code.
Grace Crowe: Helped with adding the extensions to the robot (ultrasonic sensor and touch sensor), and helped with writing and debugging the code.
