Robotics
ROBOT SENSORS
A robot sensor is used to measure the condition of the robot and its surrounding environment. Sensors pass electronic
signals to the robot so it can execute the desired tasks; a robot needs suitable sensors to help it control itself.
Types of Sensors
• Light sensors
• Sound sensors
• Temperature Sensor
• Touch Sensor
• Proximity Sensors
• IR Sensor
• Ultrasonic Sensor
• Pressure Sensor
• Level Sensors
• Smoke and Gas Sensors
Light Sensors:
Light sensors detect light and generate a voltage difference. Two types of light sensors are
commonly used in robots: photoresistors and photovoltaic cells.
• Photoresistors change their resistance with light intensity: more light on the sensor results in
less resistance, and vice versa. They are very budget-friendly and can be implemented in robots
easily.
• Photovoltaic cells convert the energy of solar radiation into electrical energy. They are
used in manufacturing solar-powered robots.
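As a minimal sketch of how a photoresistor is typically read in practice, the snippet below assumes the photoresistor (LDR) sits in a voltage divider with a fixed resistor and a 10-bit ADC; the supply voltage, resistor value, and ADC resolution are illustrative assumptions, not values from the text.

```python
VCC = 5.0          # supply voltage (V), assumed
R_FIXED = 10_000   # fixed divider resistor (ohms), assumed
ADC_MAX = 1023     # 10-bit ADC full scale

def ldr_resistance(adc_reading: int) -> float:
    """Return the LDR resistance implied by the divider voltage.

    The divider output is measured across the fixed resistor, so
    Vout = VCC * R_FIXED / (R_FIXED + R_LDR); solve for R_LDR.
    """
    v_out = VCC * adc_reading / ADC_MAX
    if v_out <= 0:
        return float("inf")  # total darkness -> very high resistance
    return R_FIXED * (VCC - v_out) / v_out

# Brighter light -> lower LDR resistance -> higher divider voltage -> higher ADC reading.
assert ldr_resistance(900) < ldr_resistance(100)
```

This reflects the "more light, less resistance" behavior described above: a bright reading of 900 implies roughly 1.4 kΩ, while a dark reading of 100 implies over 90 kΩ.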
Sound Sensors:
This sensor recognizes a sound and converts it into an electrical signal. It is used in simple robots that
can navigate with the help of sound. How about a robot turning right on a single clap and left on two claps?
Sound sensors are not as easy to implement as light sensors, however: the voltage difference created by a
sound is minimal and must be amplified to produce a measurable change.
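The clap-navigation idea above can be sketched as a simple pulse counter: count the amplified sound pulses that arrive within a short time window and map the count to a turn command. The window length and the command names are illustrative assumptions.

```python
def classify_claps(pulse_times, window=1.0):
    """Count sound pulses within `window` seconds of the first pulse
    and map the count to a turn command (assumed mapping)."""
    if not pulse_times:
        return "none"
    first = pulse_times[0]
    count = sum(1 for t in pulse_times if t - first <= window)
    if count == 1:
        return "turn_right"   # one clap -> turn right
    if count == 2:
        return "turn_left"    # two claps -> turn left
    return "ignore"

assert classify_claps([0.0]) == "turn_right"
assert classify_claps([0.0, 0.4]) == "turn_left"
```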
Temperature Sensors:
Temperature sensors are widely used in robots working in extreme weather conditions, like a
desert or an ice glacier. These sensors help robots adapt to the ambient temperature: tiny
sensor ICs produce a voltage difference proportional to the temperature change. Temperature
sensors are now used extensively.
Contact sensors
Contact sensors require physical contact to function; the contact acts as a trigger for the robot to respond. Contact
sensors are used in limit switches, button switches, and tactile bumper switches, and are widely used for avoiding
obstacles: when such a sensor switch touches an obstacle, it commands the robot to perform a task such as turning,
reversing, or simply stopping.
Capacitive sensors are made to react to human touch; a simple example is the touch screen of a smartphone.
Touch sensors are classified by the type of touch they detect, such as capacitance touch switches, resistance touch
switches, and piezo touch switches.
Proximity Sensors:
Proximity sensors can detect the presence of an object within a predefined distance without any physical contact, typically using
electromagnetic fields or reflected beams of light or sound. A wide range of proximity sensors is available in the market. Let us learn about the popular ones.
• IR Transceivers:
An LED in these sensors transmits an infrared beam; if the beam is interrupted by an obstacle, the reflected light
is captured by the receiver. These sensors can also be used to measure distances.
• Ultrasonic:
They create sound waves of high frequency. An echo confirms the presence of an obstacle.
• Photoresistor:
Widely used as light sensors, photoresistors can also be used as proximity sensors: the amount of
light reaching the sensor changes when an obstacle comes into close proximity.
IR Sensor
Small photo chips containing a photocell that emit and detect infrared light are called IR
sensors. IR sensors are generally used for designing remote-control
technology.
IR sensors can also detect obstacles in the path of a robotic vehicle and thereby control the
direction of the robotic vehicle. There are different types of sensors that can be used for
detecting infrared light.
Based on the commands received by the IR receiver interfaced to the microcontroller at the
receiver end, the microcontroller generates the appropriate signals to drive the motors,
steering the robotic vehicle forward, backward, left, or right.
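The receiver-side logic can be sketched as a lookup table mapping each decoded IR command to a pair of motor drive signals. The command names and the drive truth table are illustrative assumptions, not from a specific IR protocol.

```python
# +1 = forward, -1 = reverse, 0 = stop, for (left motor, right motor)
DRIVE_TABLE = {
    "FORWARD":  (+1, +1),
    "BACKWARD": (-1, -1),
    "LEFT":     (-1, +1),   # spin left: left wheel back, right wheel forward
    "RIGHT":    (+1, -1),   # spin right: the mirror image
    "STOP":     (0, 0),
}

def drive_signals(ir_command: str):
    """Return motor drive signals for a decoded IR command."""
    return DRIVE_TABLE.get(ir_command, (0, 0))  # unknown commands stop the robot

assert drive_signals("LEFT") == (-1, +1)
assert drive_signals("???") == (0, 0)
```

Defaulting unknown commands to a stop is a common safety choice for a remote-controlled vehicle.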
Ultrasonic Sensor
• A transducer that works on a principle similar to sonar or radar, estimating attributes of the
target by interpreting the returned echoes, is called an ultrasonic sensor or transceiver.
• Active ultrasonic sensors generate high-frequency sound waves and receive them back as an
echo. The time interval between transmitting the pulse and receiving the echo is used to
determine the distance to an object, so ultrasonic sensors can be used for
measuring the distance of an object.
• The waves transmitted from the ultrasonic transmitter are reflected back to the ultrasonic receiver
by the object; the distance is calculated from the round-trip time and the velocity of sound.
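The echo-timing calculation described above can be written out directly: the distance is half the round-trip time multiplied by the speed of sound (the wave travels out and back).

```python
SPEED_OF_SOUND = 343.0  # m/s in air at about 20 degrees C

def echo_distance_m(round_trip_s: float) -> float:
    """Distance to the object from the transmit-to-echo time interval.

    The wave travels out and back, so divide the round trip by two.
    """
    return SPEED_OF_SOUND * round_trip_s / 2.0

# A 10 ms round trip corresponds to about 1.7 m.
assert abs(echo_distance_m(0.010) - 1.715) < 1e-9
```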
Distance Sensors:
Many proximity sensors are also used as distance sensors, commonly
referred to as range sensors. IR and ultrasonic sensors are great assets for calculating distances
accurately.
Pressure Sensors:
They are widely used to quantify pressure.
A tactile sensor is a robot sensor that measures force and pressure through touch. It
is used to determine the grip strength of a robot arm and the pressure it requires to hold an object.
Temperature Sensor
• A simple temperature sensor circuit can be used to switch a load on
or off at a specific temperature detected by the
temperature sensor.
• Such a circuit is designed to control the temperature of a device
according to the requirements of industrial applications.
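The on/off switching described above can be sketched as a simple thermostat. The setpoint and the small hysteresis band (which prevents rapid toggling near the threshold) are assumptions for illustration, not values from the text.

```python
def thermostat(temp_c: float, load_on: bool,
               setpoint_c: float = 30.0, hysteresis_c: float = 1.0) -> bool:
    """Return the new load state given the current temperature reading."""
    if temp_c >= setpoint_c + hysteresis_c:
        return True          # too hot: switch the load (e.g. a fan) on
    if temp_c <= setpoint_c - hysteresis_c:
        return False         # cool enough: switch it off
    return load_on           # inside the band: keep the previous state

assert thermostat(32.0, False) is True
assert thermostat(28.0, True) is False
assert thermostat(30.5, True) is True   # within the band: state unchanged
```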
Classification of Robotic sensors
Uses of Sensors in Robotics
• Safety monitoring
• Interlocks in work cell control
• Part inspection for quality control
• Determining position and related information about objects in the robot cell
Machine Vision System:
Machine vision consists of the acquisition of image data, followed by the processing and
interpretation of these data by computer for some industrial application. Machine vision
is a growing technology, with its principal applications in automated inspection and robot
guidance.
The operation of a machine vision system can be divided into the following three
functions:
(1) image acquisition and digitization,
(2) image processing and analysis
(3) interpretation
These functions and their relationships are illustrated schematically in Figure
Image acquisition and digitization is accomplished using a digital camera and a digitizing
system to store the image data for subsequent analysis.
The camera is focused on the subject of interest, and an image is obtained by dividing
the viewing area into a matrix of discrete picture elements (called pixels), in which each
element has a value that is proportional to the light intensity of that portion of the
scene. The intensity value for each pixel is converted into its equivalent digital value by
an ADC (analog-to-digital converter). The operation of viewing a scene consisting of a
simple object that contrasts substantially with its background, and dividing the scene
into a corresponding matrix of picture elements, is depicted in Figure
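The digitization step described above can be sketched as follows: each pixel's analog light intensity is quantized by an ADC into a discrete gray level. The 8-bit depth and 0–5 V range are illustrative assumptions.

```python
def adc_quantize(voltage: float, v_max: float = 5.0, bits: int = 8) -> int:
    """Convert an analog intensity voltage to its digital gray level."""
    levels = 2 ** bits
    code = int(voltage / v_max * (levels - 1))
    return max(0, min(levels - 1, code))   # clamp to the valid range

# Digitize a tiny 2x3 "scene" of analog pixel voltages into an image matrix.
scene = [[0.0, 2.5, 5.0],
         [5.0, 2.5, 0.0]]
image = [[adc_quantize(v) for v in row] for row in scene]
assert image == [[0, 127, 255], [255, 127, 0]]
```

Each element of `image` is a digital value proportional to the light intensity of that portion of the scene, exactly as the text describes.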
Illumination. Another important aspect of machine vision is illumination. The scene viewed by the vision
camera must be well illuminated, and the illumination must be constant over time. This almost always requires
that special lighting be installed for a machine vision application rather than relying on ambient light in the
facility. Five categories of lighting can be distinguished for machine vision applications, as depicted in Figure
22.11: (a) front lighting, (b) back lighting, (c) side lighting, (d) structured lighting, and (e) strobe lighting.
Image Processing and Analysis
The second function in the operation of a machine vision system is image processing and
analysis.
One category of techniques in image processing and analysis, called segmentation, is
intended to define and separate regions of interest within the image.
Template matching refers to various methods that attempt to compare one or more features of an
image with the corresponding features of a model or template stored in computer memory.
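A minimal sketch of template matching: slide a small template over the image and score each position by the sum of squared differences (SSD); the lowest score marks the best match. This pure-Python version is illustrative only; real systems use optimized library routines.

```python
def ssd(a, b):
    """Sum of squared differences between two equal-size 2D windows."""
    return sum((x - y) ** 2 for ra, rb in zip(a, b) for x, y in zip(ra, rb))

def best_match(image, template):
    """Return (row, col) of the top-left corner where the template fits best."""
    th, tw = len(template), len(template[0])
    best, best_pos = None, None
    for r in range(len(image) - th + 1):
        for c in range(len(image[0]) - tw + 1):
            window = [row[c:c + tw] for row in image[r:r + th]]
            score = ssd(window, template)
            if best is None or score < best:
                best, best_pos = score, (r, c)
    return best_pos

image = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
template = [[9, 8],
            [7, 9]]
assert best_match(image, template) == (1, 1)
```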
Feature weighting is a technique in which several features (e.g., area, length, and perimeter) are
combined into a single measure by assigning a weight to each feature according to its relative
importance in identifying the object. The score of the object in the image is compared with the score
of an ideal object residing in computer memory to achieve proper identification
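The feature-weighting idea can be sketched as a weighted sum compared against the stored ideal score. The particular features, weights, and tolerance below are illustrative assumptions.

```python
# Assumed relative importance of each feature in identifying the object.
WEIGHTS = {"area": 0.5, "length": 0.3, "perimeter": 0.2}

def weighted_score(features: dict) -> float:
    """Combine several features into a single weighted measure."""
    return sum(WEIGHTS[name] * value for name, value in features.items())

def identify(measured: dict, ideal: dict, tolerance: float = 0.05) -> bool:
    """Accept the object if its score is within `tolerance` (relative)
    of the ideal object's score stored in memory."""
    s_meas, s_ideal = weighted_score(measured), weighted_score(ideal)
    return abs(s_meas - s_ideal) <= tolerance * s_ideal

ideal = {"area": 100.0, "length": 20.0, "perimeter": 45.0}
part = {"area": 101.0, "length": 19.8, "perimeter": 45.5}
assert identify(part, ideal)
assert not identify({"area": 60.0, "length": 12.0, "perimeter": 30.0}, ideal)
```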
Machine Vision Applications
• The reason for interpreting the image is to accomplish some application.
Machine vision applications in manufacturing divide into three categories:
(1) inspection,
(2) identification, and
(3) visual guidance and control.
Typical industrial inspection tasks include the following:
• Dimensional measurement.
• Dimensional gaging.
• Verification of the presence of components.
• Verification of geometrical features of an object (hole location and number of
holes)
• MOTION INTERPOLATION
• 1. Joint interpolation
• 2. Straight line interpolation
• 3. Circular interpolation
• OPERATING MODES
• 1. Monitor mode
• 2. Run mode
• 3. Edit mode
• The monitor mode is used to accomplish overall supervisory control
of the system. It is sometimes referred to as the supervisory mode. In
this mode, the user can define locations in space using the teach
pendant, set the speed control for the robot, store programs, transfer
programs from storage back into control memory, or move back and
forth between the modes of operation such as edit or run.
• The run mode is used for executing a robot program. In other words,
the robot is performing the sequence of instructions in the program
during the run mode. When testing a new program in the run mode,
the user can typically employ debugging procedures built into the
language to help in developing a correct program.
• The edit mode provides an instruction set that allows the user to
write new programs or to edit existing programs. Although the
operation of the editing mode differs from one language system to
another, the kinds of editing operations that can be performed
include the writing of new lines of instructions in sequence, deleting
or making changes to existing instructions, and inserting new lines in
a program.
• VAL is an example of a robot language that is processed by an
interpreter. A compiler is a program in the operating system that
passes through the entire source program and pretranslates all of the
instructions into machine level code that can be read and executed by
the robot controller. MCL is an example of a robot language that is
processed by a compiler. Compiled programs usually result in faster
execution times. On the other hand, a source program that is
processed by an interpreter can be edited more readily since
recompilation of the entire program is not required.
• MOTION COMMANDS
• MOVE A1
• This causes the end of the arm (end effector) to move from its
present position to the previously defined point named A1; A1
defines the position and orientation of the end effector. This
MOVE statement generally causes the arm to move with a joint-
interpolated motion. There are variations on the MOVE statement.
For example, the VAL II language provides for a straight line move
with the statement:
• MOVES A1
• MOVE A1 VIA A2
• This command tells the robot to move its arm to point A1, but to pass
through via point A2 in making the move.
• A related move sequence involves an approach to a point and
departure from the point. The situation arises in many material-
handling applications, in which it is necessary for the gripper to be
moved to some intermediate location above the part before
proceeding to it for the pickup. This is what is called an approach, and
the robot languages permit this motion sequence to be done in
several different ways. We will use VAL II to illustrate. Suppose the
robot's task is to pick up a part from a container. We assume that the
gripper is initially open. The following sequence might be used:
• APPRO A1, 50
• MOVES A1
• DEPART 50
• The APPRO command causes the end effector to be moved to the
vicinity of point A1, but offset from the point along the tool z axis in
the negative direction (above the part) by a distance of 50 mm. From
this location the end effector is moved straight to the point A1 and
closes its gripper around the part. The DEPART statement causes the
robot to move away from the pickup point along the tool z axis to a
distance of 50 mm. The provision is available in VAL II for the APPRO
and DEPART statements to be performed using straight line
interpolation rather than joint interpolation. These commands are
APPROS and DEPARTS, respectively.
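The geometry behind APPRO and DEPART can be sketched directly: offset the target point along the tool z axis by the clearance distance. The coordinates and the downward-pointing tool axis below are illustrative assumptions.

```python
def offset_along_tool_z(point, tool_z, distance_mm):
    """Return `point` shifted by `distance_mm` opposite the tool z axis
    (i.e. 'above' the part, as APPRO A1, 50 does)."""
    return tuple(p - distance_mm * a for p, a in zip(point, tool_z))

A1 = (300.0, 150.0, 20.0)      # pickup point (x, y, z) in mm, assumed
tool_z = (0.0, 0.0, -1.0)      # unit vector: gripper pointing straight down

approach = offset_along_tool_z(A1, tool_z, 50.0)
assert approach == (300.0, 150.0, 70.0)   # 50 mm above the part
```

The robot would move to `approach` first (APPRO A1, 50), then straight down to `A1` (MOVES A1), then back up by the same offset (DEPART 50).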
• In addition to absolute moves to a defined point in the workspace,
incremental moves are also available to the programmer. The
following examples from AML illustrate the possibilities:
• DMOVE (1, 10)
• This is a delta (incremental) move: joint 1 is moved 10 degrees from
its current position.
• MOVE ARM2 TO A1
• The robot is instructed to move its arm number 2 from the current
position to point A1.
• SPEED Control
• The SPEED command is used to define the velocity with which the
robot's arm is moved. When the SPEED command is given in the
monitor mode (preparatory to executing a program), it indicates some
absolute measure of velocity available for the robot. This might be
specified as
• SPEED 60 IPS
• which indicates that the speed of the end effector during program
execution shall be 60 in./sec unless it is altered to some other value
during the program. If no units are given, the speed command usually
indicates some value relative to the robot designer's concept of
"normal" speed. For instance,
• SPEED 75
• Gripper commands control the opening and closing of the gripper
fingers. For example, the statement
• CLOSE 40 MM
• when applied to a gripper that has servocontrol over the width of the
finger opening, would close the gripper to an opening of 40 mm
(1.575 in.). With force sensing, the gripping force can be specified, as in
• CLOSE 3.0 LB
• CENTER
• The CENTER statement allows the robot to center its arm around the
object rather than causing the object to be moved by the gripper
closure.
• OPERATE TOOL (SPEED = 125 RPM)
• OPERATE TOOL (TORQUE = 5 IN LB)
• OPERATE TOOL (TIME = 10 SEC)
• An analog output can also be set by specifying a value in the SIGNAL
command; for example, a value of 4.5 would provide an output of 4.5
units (probably volts) within the allowable range of the output signal.
• The relevant commands are as follows:
SIGNAL 5, ON Robot turns on the device
WAIT 15, ON Device signals back that it is on
.
.
.
SIGNAL 5, OFF Robot turns off the device
WAIT 15, OFF Device signals back that it is off
The WAIT statement can be used for analog signals as well as binary digital
signals in the same manner as the SIGNAL command.
• The variables could be defined as follows
DEFINE MOTOR1 = OUTPORT 5
DEFINE SENSR3 = INPORT 15
• This would permit the preceding input/output statements to be written in
the following way:
SIGNAL MOTOR1, ON
WAIT SENSR3, ON
.
.
.
SIGNAL MOTOR1, OFF
WAIT SENSR3, OFF
It is also possible to define an analog signal, either input or output, as a
variable that is used during program execution.
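The SIGNAL/WAIT handshake above can be simulated in a few lines: SIGNAL sets an output port, and WAIT polls an input port until it reaches the expected state. The dict standing in for real I/O ports and the timeout are assumptions for illustration.

```python
import time

ports = {"OUTPORT_5": False, "INPORT_15": False}   # simulated I/O lines

def signal(port: str, state: bool):
    """SIGNAL: drive an output line to the given state."""
    ports[port] = state

def wait(port: str, state: bool, timeout_s: float = 1.0):
    """WAIT: poll the input line until it matches `state` (with a safety timeout)."""
    deadline = time.monotonic() + timeout_s
    while ports[port] != state:
        if time.monotonic() > deadline:
            raise TimeoutError(f"{port} never reached {state}")
        time.sleep(0.001)

signal("OUTPORT_5", True)          # SIGNAL MOTOR1, ON
ports["INPORT_15"] = True          # the device acknowledges (simulated)
wait("INPORT_15", True)            # WAIT SENSR3, ON returns immediately
```

A real controller would read and write hardware registers instead of a dict, but the blocking-until-acknowledged pattern is the same.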
• COMPUTATIONS AND OPERATIONS
• EQ Equal to
• NE Not equal to
• GT Greater than
• GE Greater than or equal to
• LT Less than
• LE Less than or equal to
• Example:
• Location variables: