Robotics Exam Pass
Potentiometers
Thermocouples and thermistors
Strain gauge
Load cell
Infrared sensors
LVDT
Pyrometers
Piezoelectric devices
Pressure Transducers
Vision and voice sensors.
There are generally two categories of sensors used in robotics: those used for internal purposes and those used for external purposes. Internal sensors monitor and control the various joints of the robot; they form a feedback control loop with the robot controller. Examples of internal sensors include potentiometers and optical encoders, while tachometers of various types can be deployed to control the speed of the robot arm. External sensors are external to the robot itself and are used when we wish to coordinate the operation of the robot with other pieces of equipment in the robotic work cell. External sensors can be relatively simple devices, such as limit switches that determine whether a part has been positioned properly, or whether a part is ready to be picked up from an unloading bay. To perform its tasks satisfactorily, a robot also needs sensing capabilities such as vision and hand-eye coordination, touch, and hearing; accordingly, the sensors used in robotics can be grouped into the categories outlined below.
Robotic Sensors
A number of advanced sensor technologies may also be used; these are outlined in Table 1.
Table 1. Sensor types used in robotics
Tactile sensors
Used to determine whether contact is made between the sensor and another object. Two types: touch sensors, which simply indicate when contact is made, and force sensors, which indicate the magnitude of the contact force with the object.
Proximity sensors
Used to determine how close an object is to the sensor. Also called range sensors.
Optical sensors
Photocells and other photometric devices used to detect the presence or absence of objects. Often used in conjunction with proximity sensors.
Machine vision
Used in robotics for inspection, parts identification, guidance, and other uses.
Others
A miscellaneous category of sensors, including devices for measuring temperature, fluid pressure, fluid flow, electrical voltage, current, and other physical properties.
2.7.2 Range sensors
Ranging sensors require no physical contact with the object being detected. They allow a robot to see an obstacle without actually having to come into contact with it. This can prevent possible entanglement, allow for better obstacle avoidance (compared with touch-feedback methods), and possibly allow software to distinguish between obstacles of different shapes and sizes. Several methods allow a sensor to detect obstacles from a distance. Light-based ranging sensors use multiple methods for detecting obstacles and determining range. The simplest method uses the intensity of the light reflected from an obstacle to estimate distance; however, this can be significantly affected by the color/reflectivity of the obstacle and by external light sources. A more common method is to project a beam of light at an angle and place a strip of detectors spaced away from the emitter; Sharp infrared range sensors, for example, use this triangulation method. This method is less affected by the color/reflectivity of the object and by ambient light.
LIDAR, a more advanced method of range detection, uses a laser that is swept across the sensor's field of view. The reflected laser light is usually analyzed in one of two ways. Units with longer ranges often determine distance by measuring the time it takes for the laser pulse to return to the sensor, which requires extremely fast timing circuitry. Another method uses phase-shift detection to determine range by analyzing the incoming light and comparing it to a reference signal.
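As a concrete illustration of phase-shift detection, the short Python sketch below converts a measured phase shift into a range using the standard relation d = c * dphi / (4 * pi * f_mod) for amplitude-modulated laser light. The function name and the 10 MHz modulation frequency in the example are illustrative assumptions, not values from any particular LIDAR unit.

import math

def phase_shift_range(phase_shift_rad, mod_freq_hz, c=3.0e8):
    # Range from the phase shift between the emitted modulation and the
    # received signal compared against the reference:
    #   d = c * dphi / (4 * pi * f_mod)
    # Only ranges below the ambiguity distance c / (2 * f_mod) are unique.
    return c * phase_shift_rad / (4.0 * math.pi * mod_freq_hz)

# Example (assumed values): a pi/2 phase shift at 10 MHz modulation -> 3.75 m
print(phase_shift_range(math.pi / 2, 10e6))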
Working principle:
Triangulation: use the triangle formed by the travelling path of the signal to calculate the distance (a small triangulation example is sketched below).
Time-of-flight: use the time of flight of the signal to measure the distance.
Typical range sensors: infrared range sensors (triangulation), ultrasonic sensors (time-of-flight), laser range sensors (triangulation).
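To illustrate the triangulation principle used by Sharp-type infrared range sensors mentioned above, the sketch below applies the similar-triangles relation d = f * b / x, where b is the emitter-detector baseline, f the focal length of the detector lens, and x the offset of the reflected spot on the detector strip. The geometry values in the example are made up for illustration.

def triangulation_distance(baseline_m, focal_length_m, spot_offset_m):
    # Similar triangles: the farther the obstacle, the smaller the offset x
    # of the reflected spot on the detector strip, so d = f * b / x.
    return focal_length_m * baseline_m / spot_offset_m

# Example (assumed geometry): b = 20 mm, f = 10 mm, spot offset = 0.5 mm
print(triangulation_distance(0.020, 0.010, 0.0005))   # -> 0.4 m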
Ultrasonic Sensors
Ultrasonic sensors emit a quick burst of ultrasound (around 50 kHz; human hearing spans roughly 20 Hz to 20 kHz), measure the elapsed time until the receiver detects an echo, and from this determine how far away the nearest object is.
D = v * t
where D is the round-trip distance, v is the speed of propagation (about 340 m/s in air), and t is the elapsed time; the distance to the object is therefore D/2 (a worked calculation is sketched below).
Applications include distance measurement and mapping: rotating proximity scans map the proximity of objects surrounding the robot.
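A minimal sketch of the time-of-flight calculation above: the echo time gives the round-trip distance D = v * t, so the one-way distance is D/2. The temperature correction for the speed of sound (roughly 331.3 + 0.606 * T m/s) anticipates the limitation noted below; the function name and example reading are illustrative.

def ultrasonic_distance(elapsed_time_s, temperature_c=20.0):
    # D = v * t is the round-trip distance; the object is at D / 2.
    v = 331.3 + 0.606 * temperature_c   # speed of sound in air [m/s]
    round_trip = v * elapsed_time_s
    return round_trip / 2.0

# Example (assumed reading): a 5.88 ms echo at 20 degrees C is about 1.0 m
print(ultrasonic_distance(5.88e-3))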
Limitations of Ultrasonic Sensors
• Background noises: If there are other ultrasonic sources, the sensor may detect signals emitted by another source.
• The speed of sound varies with air temperature and pressure – a 16 degree centigrade temperature change can cause a 30cm
error at 10m!
• Cross-talk problem: If a robot has more than one ultrasonic sensor and their measurement ranges intersect, a sensor may receive signals emitted by the others.
• Poor surface reflection: Surface materials absorb ultrasonic waves.
• Surface orientation affects the reflection of ultrasonic signals.
2.7.3 Tactile sensor
Tactile sensors provide the robot with the capability to respond to contact forces between itself and other objects within its work volume. Tactile sensors can be divided into two types:
1. Touch sensors
2. Stress sensors
Touch sensors are used simply to indicate whether contact has been made with an object; a simple microswitch can serve the purpose of a touch sensor. Stress sensors are used to measure the magnitude of the contact force, and strain gauge devices are typically employed in such force-measuring sensors, enabling the use of robots with tactile sensing capabilities in tasks that require controlled contact.
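Since strain gauges are the usual transducer in force-measuring (stress) sensors, the sketch below shows one common way a reading could be converted to a force: the small-signal quarter-bridge relation V_out / V_ex ~= GF * strain / 4 gives the strain, and F = E * strain * A gives the force for a uniaxially loaded elastic member. All numeric defaults (gauge factor, Young's modulus, cross-section) are illustrative assumptions, not values from the source.

def quarter_bridge_force(v_out, v_excitation, gauge_factor=2.0,
                         youngs_modulus_pa=200e9, cross_section_m2=1e-4):
    # Strain from the quarter-bridge small-signal approximation,
    # then force from Hooke's law for a uniaxially loaded member.
    strain = 4.0 * v_out / (gauge_factor * v_excitation)
    return youngs_modulus_pa * strain * cross_section_m2

# Example (assumed values): 1 mV output on a 10 V bridge -> strain 2e-4 -> 4 kN
print(quarter_bridge_force(1e-3, 10.0))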
Trajectory Planning
Trajectory planning for robots involves determining the path and control commands that a robot will follow to move from its initial position to a target position, while considering time, velocity, acceleration, and dynamics constraints. The goal is to create a smooth,
target position, while considering time, velocity, acceleration, and dynamics constraints. The goal is to create a smooth,
efficient, and feasible trajectory that ensures the robot’s safety, precision, and optimal performance. Various methods are used
for trajectory planning, including polynomial interpolation (e.g., cubic splines), piecewise linear paths, and velocity profiles like
trapezoidal or cubic velocity profiles. These profiles define the motion parameters (position, velocity, acceleration) over time,
ensuring smooth transitions and minimizing jerk. Constraints such as workspace boundaries, joint limits, and environmental
factors are incorporated to avoid collisions and maintain stability. Trajectory planning is essential for applications in industrial
automation, robotics, and autonomous vehicles, where precision and time efficiency are critical.
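As a concrete example of the polynomial-interpolation approach mentioned above, the sketch below generates a cubic point-to-point trajectory for a single joint with zero velocity at both ends; the function name, joint values, and sampling are illustrative assumptions.

import numpy as np

def cubic_trajectory(q0, qf, T, n=50):
    # Cubic polynomial q(t) = a0 + a1*t + a2*t^2 + a3*t^3 with boundary
    # conditions q(0) = q0, q(T) = qf and zero velocity at both ends.
    a0, a1 = q0, 0.0
    a2 = 3.0 * (qf - q0) / T**2
    a3 = -2.0 * (qf - q0) / T**3

    t = np.linspace(0.0, T, n)
    q = a0 + a1 * t + a2 * t**2 + a3 * t**3   # position
    qd = a1 + 2 * a2 * t + 3 * a3 * t**2      # velocity
    qdd = 2 * a2 + 6 * a3 * t                 # acceleration
    return t, q, qd, qdd

# Example (assumed move): rotate a joint from 10 deg to 80 deg in 2 s
t, q, qd, qdd = cubic_trajectory(10.0, 80.0, 2.0)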
Shrink and Swell Operators
Shrink and swell operators are fundamental morphological operations used in robotic vision and image processing to manipulate and analyze object shapes in binary images. These operators are based on set theory and work by applying structuring elements to an image.
Shrink (Erosion): The shrink operator, or erosion, reduces the size of bright regions (foreground) in a binary image. It
erodes the boundaries of objects, making them smaller by removing pixels at the object’s boundary. This operation is
useful for eliminating small noise or gaps between objects.
Swell (Dilation): The swell operator, or dilation, increases the size of bright regions by adding pixels to the boundaries
of objects. It can be used to connect nearby objects, fill holes, or expand object regions for better visibility.
Both operators are applied iteratively to achieve desired effects. When used together, shrink and swell can refine object
detection, enhance feature extraction, and improve the robustness of robotic vision systems in tasks like object recognition,
segmentation, and path planning.
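A minimal sketch of shrink (erosion) and swell (dilation) on a binary image, using a 3x3 structuring element and plain NumPy; the function names and the noise-removal example are illustrative, not a specific library API.

import numpy as np

def shrink(img, iterations=1):
    # Erosion: a pixel stays 1 only if its whole 3x3 neighbourhood is 1.
    out = img.astype(bool)
    h, w = img.shape
    for _ in range(iterations):
        padded = np.pad(out, 1, mode="constant", constant_values=False)
        new = np.ones_like(out)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                new &= padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        out = new
    return out

def swell(img, iterations=1):
    # Dilation: a pixel becomes 1 if any pixel in its 3x3 neighbourhood is 1.
    out = img.astype(bool)
    h, w = img.shape
    for _ in range(iterations):
        padded = np.pad(out, 1, mode="constant", constant_values=False)
        new = np.zeros_like(out)
        for dy in (-1, 0, 1):
            for dx in (-1, 0, 1):
                new |= padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w]
        out = new
    return out

# Example (assumed image): shrink-then-swell (an "opening") removes
# isolated noise pixels while roughly preserving larger objects.
img = np.zeros((8, 8), dtype=bool)
img[2:6, 2:6] = True      # a 4x4 object
img[0, 7] = True          # a single noise pixel
cleaned = swell(shrink(img))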
Implementation of Industry 4.0 and Industrial Robots in Manufacturing Processes
The fourth industrial revolution, known as Industry 4.0, is transforming manufacturing processes through the integration of
advanced technologies. Emerging in 2011, Industry 4.0 encompasses robotics, automation, the Internet of Things (IoT), 3D
printing, smart sensors, and Radio Frequency Identification (RFID). These technologies enable intelligent automation, self-
configuration, self-diagnosis, and problem-solving capabilities, revolutionizing traditional manufacturing.
Historically, industrial revolutions have progressed from steam engines to electricity, IT systems, and now to cyber-physical
systems (CPS) and cloud computing. Industry 4.0 aims to create a flexible, interconnected production environment where
virtual and physical systems collaborate seamlessly. Key principles include interoperability, decentralization, virtualization, real-
time capability, service orientation, and modularity. These principles facilitate the integration of CPS, people, and production
processes, enabling real-time monitoring and decision-making.
The transition to Industry 4.0 involves significant challenges, such as changing business paradigms, legal issues, resource planning, security concerns, and standardization. Successful implementation requires collaboration among all stakeholders, including manufacturers, managers, designers, and end-users, and it places particular emphasis on innovation, efficiency, agility, and risk management.
Industry 4.0 transforms traditional manufacturing into smart factories, enhancing productivity, quality, and competitiveness. It
underscores the need for a multidisciplinary approach and continuous education to adapt to the rapidly evolving technological
landscape. The implementation strategy of Industry 4.0 is to enable the adjustment of industrial production to complete
intelligent automation, which means introducing self-automation, self-configuration, self-diagnosis, problem-solving, and
intelligent decision-making.
In conclusion, Industry 4.0 is already present in all industrial branches, from production to the sale of finished products. By
introducing technologies such as Cloud Computing, Robotics & Automation, Intelligent Sensors, 3D printers, and RFID, we
witness changes in processes and technologies as well as the organization of manufacturing and sales. The complete
application of Industry 4.0 leads us towards smart factories, allowing companies to remain competitive on the global market.
Smart factories are not isolated from other social changes such as the development of businesses, science, and education,
which require changes in all segments, since world knowledge is now said to double every one to two years. The success or failure of implementing Industry 4.0 lies in the hands of all participants in the chain, from the manufacturer to the end user.