
A

TECHNICAL SEMINAR

ON

HAPTIC TECHNOLOGY

Submitted in Partial fulfillment of the requirements for the award of the degree of

BACHELOR OF TECHNOLOGY

IN

COMPUTER SCIENCE AND ENGINEERING

Submitted By

SUNKA SAI KUMAR

(16R91A0557)

Under the esteemed Guidance of

Mrs. Y. SHIVASREE

ASST. PROFESSOR

Department of Computer Science and Engineering

TEEGALA KRISHNA REDDY ENGINEERING COLLEGE


(Affiliated to JNTUH, Hyderabad, Approved by AICTE, Accredited by NBA)
Medbowli, Meerpet, Saroornagar, Hyderabad – 500097.

2016-2020
CERTIFICATE
This is to certify that the technical seminar entitled “HAPTIC
TECHNOLOGY” submitted by Mr. S. SAI KUMAR, bearing Roll No.
16R91A0557, in partial fulfilment of the requirements for the award of the
degree of Bachelor of Technology in Computer Science and Engineering, is a
record of bonafide work carried out by him under my guidance. The results of
the investigation enclosed in this report have been verified and found satisfactory.
The results embodied in this report have not been submitted to any other
University or Institute for the award of any other Degree or Diploma.

Internal Guide Head of Department

A. YADAGIRI Dr. Ch. V.PHANI KRISHNA

Asst. Professor Professor

……………………………….. ……………………………..

Principal

Dr. K. M. V. MADAN KUMAR

…………………………
ACKNOWLEDGEMENT

The satisfaction and euphoria that accompanies the successful completion of


any task would be incomplete without the mention of the people who made it
possible and whose encouragement and guidance have crowned our efforts with
success.

I extend my deep sense of gratitude to Dr. K. M. V. Madan Kumar Garu,
Principal, Teegala Krishna Reddy Engineering College, Meerpet, for permitting me
to undertake this seminar.

I am also indebted to Dr. Ch. V. Phani Krishna, Professor & Head of the
Department, Computer Science and Engineering, Teegala Krishna Reddy Engineering
College, Meerpet, for the support and guidance throughout my seminar.

I am indebted to my guide Mrs. Y. SHIVASREE, Asst. Professor, Computer
Science and Engineering, Teegala Krishna Reddy Engineering College, Meerpet, for
her support and guidance throughout my seminar.

Finally, I express thanks to one and all who have helped me in successfully
completing this seminar. Further, I would like to thank my family and friends
for their moral support and encouragement.

Submitted By:

S.SAI KUMAR

(16R91A0557)
DECLARATION BY THE CANDIDATE

I SUNKA SAI KUMAR bearing Roll no 16R91A0557, hereby declare that the
technical seminar entitled “HAPTIC TECHNOLOGY” is done under the guidance
of Mrs. Y. Shivasree, Assistant Professor, Department of Computer Science and
Engineering.

This is a record of bonafide work carried out by me in Teegala Krishna Reddy


Engineering College and the results embodied in this seminar have not been
reproduced or copied from any source. The results embodied in this seminar report
have not been submitted to any other university or institute for the award of any other
degree or diploma.

Submitted by:

S. SAI KUMAR(16R91A0557)
CONTENT

S. No   Name of the Topic

LIST OF FIGURES

ABSTRACT

1. INTRODUCTION
1.1 Haptic Devices
1.2 Phantom
1.3 Cyber Grasp
1.4 Haptic Rendering
1.5 Contact Detection

2. APPLICATIONS OF HAPTIC TECHNOLOGY

3. SURGICAL AND MEDICAL TRAINING

4. DEFORMABLE OBJECTS
4.1 What is ‘Haptics’
4.1.1 History of Haptics
4.2 Basic System Configuration

5. HAPTIC DEVICES
5.1 Virtual Reality / Telerobotics Based Devices
5.1.1 Exoskeletons and Stationary Devices
5.1.2 Gloves and Wearable Devices
5.1.3 Point Sources and Specific Task Devices
5.2 Feedback Devices
5.2.1 Force Feedback Devices

6. INTERFACING DEVICES
6.1 Phantom
6.2 Cyber Glove
6.3 Principle of Haptic Interface
6.4 System Architecture for Haptic Rendering

7. APPLICATIONS
7.1 Graphical User Interfaces
7.2 Surgical Simulation and Medical Training

8. LIMITATIONS

9. CONCLUSION

10. REFERENCES
LIST OF FIGURES

Figure No.   Figure Name

1.1.1 Phantom
1.1.2 Cyber Grasp
3.1.1 Human Haptic
3.1.2 Articulated Tools
3.1.3 Laparoscopic
4.2.1 Virtual Reality
6.1.1 Contact Display Design
6.2.1 Cyberglove
6.3.1 Haptic Interface
6.4.1 System Architecture
7.1.1 Graphical User Interface
7.2.1 Medical Training
7.3.1 Military Training
ABSTRACT
Engineering finds a wide range of applications in every field, and the
medical field is no exception. One of the technologies that aids surgeons in
performing even the most complicated surgeries successfully is Virtual Reality.

Even though virtual reality is employed to carry out operations, the surgeon’s
attention remains one of the most important parameters: any mistake may lead to a
dangerous end. One may therefore think of a technology that reduces the burden on
the surgeon by providing a more efficient interaction than VR. This dream has become
reality by means of a technology called “HAPTIC TECHNOLOGY”.

Haptics is the “science of applying tactile sensation to human interaction
with computers”. In this paper we discuss the basic concepts behind haptics, along
with haptic devices and how these devices interact to produce the sense of touch and
force-feedback mechanisms. The implementation of these mechanisms by means of
haptic rendering and contact detection is also discussed.

We mainly focus on the ‘Application of Haptic Technology in Surgical
Simulation and Medical Training’. Further, we explain the storage and retrieval of
haptic data while working with haptic devices, and illustrate the necessity of haptic
data compression.
HAPTIC TECHNOLOGY

1. INTRODUCTION
Haptic is a term derived from the Greek word haptesthai, which means ‘to
touch’. Haptics is defined as the “science of applying tactile sensation to human
interaction with computers”. It enables manual interaction with real, virtual and
remote environments. Haptics permits users to sense (“feel”) and manipulate three-
dimensional virtual objects with respect to such features as shape, weight, surface
texture, and temperature. A haptic device is one that involves physical contact
between the computer and the user. By using haptic devices, the user can not only
feed information to the computer but can also receive information from the computer
in the form of a felt sensation on some part of the body. This is referred to as a haptic
interface.

In this paper we explain the basic concepts of ‘Haptic Technology and its
Application in Surgical Simulation and Medical Training’.

1.1 Haptic Devices:

Force feedback is the area of haptics that deals with devices that interact
with the muscles and tendons to give the human a sensation of a force being
applied: hardware and software that stimulate the human sense of touch and feel
through tactile vibrations or force feedback. These devices mainly consist of robotic
manipulators that push back against the user with forces that correspond to the
environment the virtual effector is in. Tactile feedback makes use of devices
that interact with the nerve endings in the skin to indicate heat, pressure, and
texture. These devices have typically been used to indicate whether or not the user is
in contact with a virtual object. Other tactile feedback devices have been used to
simulate the texture of a virtual object. Phantom and CyberGrasp are examples of
haptic devices.

DEPARTMENT OF CSE 1 TKREC



1.2 Phantom:

The Phantom is a small robot arm with three revolute joints, each connected to a
computer-controlled electric DC motor. The tip of the device is attached to a stylus
that is held by the user. By sending appropriate voltages to the motors, it is possible to
exert up to 1.5 pounds of force at the tip of the stylus, in any direction.

Fig:1.1.1 Phantom

1.3 Cyber Grasp:

The CyberGlove is a lightweight glove with flexible sensors that accurately
measure the position and movement of the fingers and wrist. The CyberGrasp, from
Immersion Corporation, is an exoskeleton device that fits over a 22-DOF CyberGlove,
providing force feedback. The CyberGrasp is used in conjunction with a position
tracker to measure the position and orientation of the forearm in three-dimensional
space.

Fig:1.1.2 Cyber Grasp


1.4 Haptic Rendering:

Haptic rendering is the process of applying forces to the user through a
force-feedback device. Using haptic rendering, we can enable a user to touch, feel
and manipulate virtual objects, and enhance the user’s experience in a virtual
environment. Haptic rendering is the process of displaying synthetically generated
2D/3D haptic stimuli to the user. The haptic interface acts as a two-port system
terminated on one side by the human operator and on the other side by the virtual
environment.

1.5 Contact Detection:

A fundamental problem in haptics is to detect contact between the virtual


objects and the haptic device (a Phantom, a glove, etc.). Once this contact is reliably
detected, a force corresponding to the interaction physics is generated and rendered
using the probe. This process usually runs in a tight servo loop within a haptic
rendering system. Another technique for contact detection is to generate the surface
contact point (SCP), which is the closest point on the surface to the actual tip of the
probe. The force generation can then happen as though the probe were physically at
this location rather than within the object. Existing methods in the literature generate
the SCP by using the notion of a god-object, which forces the SCP to lie on the
surface of the virtual object.
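The god-object idea above can be sketched in a few lines: the surface contact point (SCP) is the closest point on the object’s surface to the penetrating probe tip, and the rendered force is a spring pulling the probe back toward it. The sphere geometry, the stiffness value, and the function names below are illustrative assumptions, not part of any particular haptic API.

```python
import numpy as np

def scp_on_sphere(center, radius, probe_tip):
    """Surface contact point (god-object) for a sphere: the closest
    point on the sphere's surface to the penetrating probe tip."""
    d = probe_tip - center
    dist = np.linalg.norm(d)
    if dist >= radius or dist < 1e-9:   # outside the object, or degenerate
        return None
    return center + d * (radius / dist)  # project the tip onto the surface

def contact_force(scp, probe_tip, k=800.0):
    """Penalty (spring) force pushing the probe from its penetrated
    position back toward the surface contact point."""
    if scp is None:
        return np.zeros(3)
    return k * (scp - probe_tip)
```

In a real haptic rendering system this pair of calls would run inside the tight servo loop mentioned above, with the computed force sent to the device motors each cycle.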


2. APPLICATIONS OF HAPTIC TECHNOLOGY

Haptic technology finds a wide range of applications; some of them are
mentioned below:

1. Surgical simulation & Medical training.

2. Physical rehabilitation.

3. Training and education.

4. Museum display.

5. Painting, sculpting and CAD

6. Scientific Visualization.

7. Military application.

8. Entertainment.

The role of Haptic Technology in “Surgical Simulation and Medical


Training” is discussed in detail below.


3. SURGICAL AND MEDICAL TRAINING:

Haptics is usually classified as:

Human haptics: human touch perception and manipulation.

Machine haptics: concerned with robot arms and hands.

Computer haptics: concerned with computer-mediated touch interaction with virtual environments.

Fig:3.1.1 Human Haptic

A primary application area for haptics has been in surgical simulation and
medical training. Haptic rendering algorithms detect collisions between surgical
instruments and virtual organs and render organ-force responses to users through
haptic interface devices. For the purpose of haptic rendering, we’ve conceptually
divided minimally invasive surgical tools into two generic groups based on their
functions.

1. Long, thin, straight probes for palpating or puncturing the tissue and for injection
(puncture and injection needles and palpation probes)

2. Articulated tools for pulling, clamping, gripping, and cutting soft tissues (such as
biopsy and punch forceps, hook scissors, and grasping forceps). A 3D computer model


of an instrument from each group (a probe from the first group and a forceps from the
second) and their behavior in a virtual environment is shown.

During real-time simulations, the 3D surface models of the probe and forceps are used
to provide the user with realistic visual cues. For haptic rendering of tool–tissue
interactions, ray-based rendering is used, in which the probe and forceps are
modeled as connected line segments. Modeling haptic interactions between a probe
and objects using this line–object collision detection and response has several
advantages over existing point-based techniques, in which only the tip point of a
haptic device is considered for touch interactions.
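Ray-based rendering can be sketched by treating the tool shaft as a line segment and testing it against an organ primitive. A sphere stands in for the organ here purely as an illustrative assumption; real simulators test against full organ meshes.

```python
import numpy as np

def segment_sphere_contact(p0, p1, center, radius):
    """Collision test between a tool modeled as the segment p0-p1 and a
    spherical organ. Returns the closest point on the tool to the organ
    center and the penetration depth (0 when there is no contact)."""
    d = p1 - p0
    # Parameter of the closest point on the infinite line, clamped to the segment.
    t = np.clip(np.dot(center - p0, d) / np.dot(d, d), 0.0, 1.0)
    closest = p0 + t * d
    penetration = radius - np.linalg.norm(closest - center)
    return closest, max(penetration, 0.0)
```

Because the whole shaft is tested rather than only the tip, side collisions between the tool and the organ are detected, which is exactly the advantage over point-based rendering noted above.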

Fig:3.1.2 Articulated tools

Grouping of surgical instruments for simulating tool–tissue interactions.

Group A includes long, thin, straight probes.

Group B includes tools for pulling, clamping, and cutting soft tissue.

Users feel torques if a proper haptic device is used. For example, the user can
feel the coupling moments generated by the contact forces at the instrument tip and
forces at the trocar pivot point.


Users can detect side collisions between the simulated tool and 3D models of
organs.

Users can feel multiple layers of tissue if the ray representing the simulated
surgical probe is virtually extended to detect collisions with an organ’s internal layers.
This is especially useful because soft tissues are typically layered, each layer has
different material properties, and the forces/torques reflected to the user depend on
the laparoscopic tool’s orientation.

Users can touch and feel multiple objects simultaneously. Because


laparoscopic instruments are typically long slender structures and interact with
multiple objects (organs, blood vessels, surrounding tissue, and so on) during a MIS
(Minimally Invasive Surgery), ray-based rendering provides a more natural way than
a purely point-based rendering of tool-tissue interactions. To simulate haptic
interactions between surgical material held by a laparoscopic tool (for example, a
catheter, needle, or suture) and a deformable body (such as an organ or vessel), a
combination of point- and ray-based haptic rendering methods are used.

Fig:3.1.3 Laparoscopic

In the catheter insertion task shown above, the surgical tool is modeled using line
segments and the catheter using a set of points uniformly distributed along the
catheter’s center line and connected with springs and dampers. Using the point-based
haptic rendering method, the collisions between the flexible catheter and the inner
surface of a flexible vessel are detected to compute interaction forces.

The concept of distributed particles can be used in haptic rendering of organ–


organ interactions (whereas a single point is insufficient for simulating organ–organ
interactions, a group of points, distributed around the contact region, can be used)


and in other minimally invasive procedures, such as bronchoscopy and colonoscopy,
that involve inserting a flexible material into a tubular body.


4.DEFORMABLE OBJECTS

One of the most important components of computer based surgical simulation


and training systems is the development of realistic organ-force models. A good
organ-force model must reflect stable forces to a user, display smooth deformations,
handle various boundary conditions and constraints, and show physics-based realistic
behavior in real time. Although the computer graphics community has developed
sophisticated models for real-time simulation of deformable objects, integrating tissue
properties into these models has been difficult. Developing real-time and realistic
organ-force models is challenging because of viscoelasticity, anisotropy, nonlinearity,
rate, and time dependence in material properties of organs. In addition, soft organ
tissues are layered and nonhomogeneous.

Tool–tissue interactions generate dynamical effects and cause nonlinear


contact interactions of one organ with the others, which are quite difficult to simulate
in real time. Furthermore, simulating surgical operations such as cutting and
coagulation requires frequently updating the organ geometric database and can cause
force singularities in the physics-based model at the boundaries. There are currently
two main approaches for developing force-reflecting organ models:

1. Particle-based methods.

2. Finite-element methods (FEM).

In particle-based models, an organ’s nodes are connected to each other with


springs and dampers. Each node (or particle) is represented by its own position,
velocity, and acceleration and moves under the influence of forces applied by the
surgical instrument.
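A minimal sketch of the particle-based idea, assuming just two nodes joined by one spring-damper and explicit Euler integration; real organ models use thousands of nodes and more stable integrators, so this is only an illustration of the update rule.

```python
import numpy as np

def step(pos, vel, rest, k, c, mass, f_ext, dt):
    """Advance a two-node spring-damper by one explicit-Euler step.
    pos, vel: (2, 3) arrays of node positions and velocities.
    f_ext: (2, 3) external forces, e.g. from the surgical instrument."""
    d = pos[1] - pos[0]
    length = np.linalg.norm(d)
    n = d / length                              # unit vector from node 0 to node 1
    f = k * (length - rest) * n                 # spring pulls nodes together when stretched
    f += c * np.dot(vel[1] - vel[0], n) * n     # damper resists relative motion
    acc = np.stack([(f + f_ext[0]) / mass,      # equal and opposite internal forces
                    (-f + f_ext[1]) / mass])
    vel = vel + acc * dt
    pos = pos + vel * dt
    return pos, vel
```

Each node moves under the influence of the instrument force plus the internal spring-damper forces, exactly as the paragraph above describes.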

In finite-element modeling, the geometric model of an organ is divided into


surface or volumetric elements, properties of each element are formulated, and the
elements are assembled together to compute the deformation states of the organ for
the forces applied by the surgical instruments.


Capture, Storage, and Retrieval of Haptic Data:

The newest area in haptics is the search for optimal methods for the description,
storage, and retrieval of moving-sensor data of the type generated by haptic devices.
These techniques capture the hand or finger movements of an expert performing a
skilled task and “play them back,” so that a novice can retrace the expert’s path
with realistic touch sensation. The INSITE system is capable of providing
instantaneous comparison of two users with respect to duration, speed, acceleration,
and thumb and finger forces. Techniques for recording and playing back raw haptic
data have been developed for the PHANToM and CyberGrasp. Captured data include
movement in three dimensions, orientation, and force (contact between the probe and
objects in the virtual environment).
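Recording and playback of raw haptic data, as described above for the PHANToM and CyberGrasp, amounts to logging timestamped samples and later looking them up by time. The class below is a hypothetical sketch of that idea, not the interface of either product.

```python
class HapticRecorder:
    """Log timestamped device samples (position, orientation, force)
    so an expert's motion can later be replayed for a trainee."""

    def __init__(self):
        self.samples = []          # list of (t, position, orientation, force)

    def record(self, t, position, orientation, force):
        """Append one sample captured from the device at time t."""
        self.samples.append((t, position, orientation, force))

    def sample_at(self, t):
        """Zero-order hold: return the last sample recorded at or before t,
        which a playback loop would render to the trainee's device."""
        best = self.samples[0]
        for s in self.samples:
            if s[0] <= t:
                best = s
        return best
```

A playback loop would call `sample_at` at the servo rate and command the returned pose and force to the trainee's device, letting the novice retrace the expert's path.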

Haptic Data Compression:

Haptic data compression and evaluation of the perceptual impact of lossy


compression of haptic data are further examples of uncharted waters in haptics
research.

Data about the user's interaction with objects in the virtual environment must
be continually refreshed if they are manipulated or deformed by user input. If data are
too bulky relative to available bandwidth and computational resources, there will be
improper registration between what the user sees on screen and what he “feels.”
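One widely studied approach to lossy haptic compression is perceptual deadband coding: a sample is transmitted only when it differs from the last transmitted one by more than a just-noticeable fraction (a Weber-law threshold). The sketch below works on a scalar force signal; the 10% threshold and the function name are illustrative assumptions.

```python
def deadband_compress(samples, k=0.1):
    """Keep a sample only when it deviates from the last kept sample
    by more than fraction k of that sample's magnitude; intermediate
    samples are dropped as perceptually indistinguishable."""
    kept = [samples[0]]
    for s in samples[1:]:
        if abs(s - kept[-1]) > k * max(abs(kept[-1]), 1e-9):
            kept.append(s)
    return kept
```

Dropping sub-threshold samples reduces the bandwidth the haptic stream needs, which directly addresses the registration problem described above.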

4.1 What is ‘Haptics’?

Haptic technology refers to technology that interfaces the user with a virtual
environment via the sense of touch by applying forces, vibrations, and/or motions to
the user. This mechanical stimulation may be used to assist in the creation of virtual
objects (objects existing only in a computer simulation), for control of such virtual
objects, and to enhance the remote control of machines and devices (teleoperators).


This emerging technology promises to have wide-reaching applications, as it
already has in some fields. For example, haptic technology has made it possible to
investigate in detail how the human sense of touch works by allowing the creation of
carefully controlled haptic virtual objects. These objects are used to systematically
probe human haptic capabilities, which would otherwise be difficult to achieve. These
new research tools contribute to our understanding of how touch and its underlying
brain functions work. Although haptic devices are capable of measuring the bulk or
reactive forces applied by the user, they should not be confused with touch or
tactile sensors that measure the pressure or force exerted by the user on the interface.

The term haptic originated from the Greek word ἁπτικός (haptikos), meaning
pertaining to the sense of touch and comes from the Greek verb ἅπτεσθαι (haptesthai)
meaning to “contact” or “touch”.

4.1.1 History Of Haptics

In the early 20th century, psychophysicists introduced the word haptics to


label the subfield of their studies that addressed human touch-based perception and
manipulation. In the 1970s and 1980s, significant research efforts in a completely
different field, robotics, also began to focus on manipulation and perception by touch.
Initially concerned with building autonomous robots, researchers soon found that
building a dexterous robotic hand was much more complex and subtle than their
initial naive hopes had suggested.

In time these two communities, one that sought to understand the human hand
and one that aspired to create devices with dexterity inspired by human abilities,
found fertile mutual interest in topics such as sensory design and processing, grasp
control and manipulation, object representation and haptic information encoding, and
grammars for describing physical tasks.

In the early 1990s a new usage of the word haptics began to emerge. The
confluence of several emerging technologies made virtualized haptics, or computer
haptics possible. Much like computer graphics, computer haptics enables the display
of simulated objects to humans in an interactive manner. However, computer haptics
uses a display technology through which objects can be physically palpated.


WORKING OF HAPTIC SYSTEMS

4.2 Basic system configuration.

Basically, a haptic system consists of two parts, namely the human part and the
machine part. In the figure shown, the human part (left) senses and controls the
position of the hand, while the machine part (right) exerts forces on the hand to
simulate contact with a virtual object. Both systems are provided with the
necessary sensors, processors and actuators. In the case of the human system, nerve
receptors perform sensing, the brain performs processing and muscles perform
actuation of the motion performed by the hand, while in the case of the machine
system the same functions are performed by encoders, a computer and motors
respectively.
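The machine side of this loop can be sketched as a single servo cycle: the encoder supplies the hand position (sensing), the computer compares it with a virtual wall (processing), and the motor receives a force command (actuation). The wall position, stiffness value, and sign convention below are illustrative assumptions.

```python
def servo_step(encoder_position, wall_x=0.0, k=500.0):
    """One cycle of the machine part: read the encoder position, compute
    a penalty force against a virtual wall occupying x < wall_x, and
    return the force (newtons) to command to the motor."""
    penetration = wall_x - encoder_position    # > 0 when the hand is inside the wall
    return k * penetration if penetration > 0 else 0.0
```

Running this at roughly a kilohertz gives the user the impression of a stiff, immovable surface at `wall_x`.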

Haptic Information

Basically the haptic information provided by the system will be the


combination of (i) Tactile information and (ii) Kinesthetic information.

Tactile information refers to the information acquired by sensors that are
actually connected to the skin of the human body, with particular reference to the
spatial distribution of pressure, or more generally tractions, across the contact area.

For example when we handle flexible materials like fabric and paper, we sense
the pressure variation across the fingertip. This is actually a sort of tactile information.
Tactile sensing is also the basis of complex perceptual tasks like medical palpation,
where physicians locate hidden anatomical structures and evaluate tissue properties
using their hands.

Creation of Virtual environment (Virtual reality).

Virtual reality is the technology which allows a user to interact with a


computer-simulated environment, whether that environment is a simulation of the real
world or an imaginary world. Most current virtual reality environments are primarily
visual experiences, displayed either on a computer screen or through special or


stereoscopic displays, but some simulations include additional sensory information,


such as sound through speakers or headphones.

Some advanced haptic systems now include tactile information, generally
known as force feedback, in medical and gaming applications. Users can interact with
a virtual environment or a virtual artifact (VA) either through the use of standard
input devices such as a keyboard and mouse, or through multimodal devices such as a
wired glove, the Polhemus boom arm, and an omnidirectional treadmill. The simulated
environment can be similar to the real world, for example, simulations for pilot or
combat training, or it can differ significantly from reality, as in VR games. In practice,
it is currently very difficult to create a high-fidelity virtual reality experience, due
largely to technical limitations on processing power, image resolution and
communication bandwidth. However, those limitations are expected to eventually be
overcome as processor, imaging and data communication technologies become more
powerful and cost-effective over time.

Fig:4.2.1 Virtual Reality

Virtual Reality is often used to describe a wide variety of applications, commonly
associated with its immersive, highly visual, 3D environments. The development of
CAD software, graphics hardware acceleration, head-mounted displays, data
gloves and miniaturization have helped popularize the notion. Flight simulators,
for example, are designed just like the cockpit of an airplane or helicopter: the screen
in front of the pilot creates the virtual environment, and the trainers outside the
simulator command it to adopt different modes.


5. HAPTIC DEVICES

A haptic device is one that provides a physical interface between the user
and the virtual environment by means of a computer. This can be done through an
input/output device that senses the body’s movement, such as a joystick or data
glove. By using haptic devices, the user can not only feed information to the computer
but can also receive information from the computer in the form of a felt sensation on
some part of the body. This is referred to as a haptic interface.

Haptic devices can be broadly classified as follows:

5.1 Virtual reality/ Telerobotics based devices

5.1.1 Exoskeletons and Stationary devices

The term exoskeleton refers to the hard outer shell that exists on many
creatures. In a technical sense, the word refers to a system that covers the user or that
the user has to wear. Current haptic devices that are classified as exoskeletons are
large and immobile systems to which the user must attach him- or herself.

5.1.2 Gloves and wearable devices

These are smaller, exoskeleton-like devices that are often, but not
always, attached to a larger exoskeleton or other immobile device. Since the
goal of building a haptic system is to immerse the user in the virtual or
remote environment, it is important to provide as small a reminder of the user’s
actual environment as possible. The drawback of wearable systems is that, since the
weight and size of the devices are a concern, the systems have more limited sets
of capabilities.

5.1.3 Point sources and specific task devices

This is a class of devices that are very specialized for performing a particular
task. Designing a device to perform a single type of task restricts the application
of that device to a much smaller number of functions; however, it allows the designer
to focus the device on performing its task extremely well.


These task devices have two general forms, single point of interface devices
and specific task devices.

5.2 Feedback device

5.2.1 Force feedback devices

Force feedback input devices are usually, but not exclusively, connected to
computer systems and are designed to apply forces that simulate the sensation of
weight and resistance in order to provide information to the user. As such, the
feedback hardware represents a more sophisticated form of input/output device,
complementing others such as keyboards, mice or trackers. Input from the user is in
the form of the position of the hand or other body segment, whereas feedback from
the computer or other device is in the form of force or position. These devices
translate digital information into physical sensations.


6. INTERFACING DEVICES

6.1 PHANTOM

The Phantom is a haptic interfacing device developed by a company named SensAble
Technologies. It is primarily used for providing a 3D touch to virtual objects. It is
a very high-resolution, 6-DOF device in which the user holds the end of a motor-
controlled jointed arm. It provides a programmable sense of touch that allows the user
to feel the texture and shape of a virtual object with a very high degree of realism.
One of its key features is that it can model free-floating three-dimensional objects.

Fig:6.1.1 Contact Display Design

The figure above shows the contact display design of a Phantom device. When
the user puts a finger in the thimble connected to the metal arm of the Phantom
and moves the finger, he can really feel the shape and size of the virtual
three-dimensional object that has been programmed into the computer. The virtual
three-dimensional space in which the Phantom operates is called the haptic scene,
which is a collection of separate haptic objects with different behaviors and
properties. The DC motor assembly is mainly used for converting the movement of
the finger into a corresponding virtual movement.


6.2 Cyberglove

Fig:6.2.1 Cyberglove

The principle of a Cyberglove is simple. It consists of opposing the movement
of the hand in the same way that an object squeezed between the fingers resists the
movement of the latter. The glove must therefore be capable, in the absence of a real
object, of recreating the forces applied by the object on the human hand with (1) the
same intensity and (2) the same direction. These two conditions can be simplified by
requiring the glove to apply a torque equal to that at the interphalangeal joint.

The chosen solution uses a mechanical structure with three passive joints
which, with the interphalangeal joint, make up a flat four-bar closed-link mechanism.
This solution uses cables placed in the interior of the four-bar mechanism, following
a trajectory identical to that of the extensor tendons which, by nature, oppose the
movement of the flexor tendons in order to harmonize the movement of the fingers.
Among the advantages of this structure one can cite:

1. It allows 4 DOF for each finger.

2. It adapts to different sizes of fingers.

3. It is located on the back of the hand.

4. It measures finger angular flexion (the joint-angle measurements are independent
and can have good resolution, given the long paths traveled by the cables when the
fingers close).

HAPTIC RENDERING


6.3 Principle of haptic interface

Fig:6.3.1 Haptic interface

As illustrated in the figure above, haptic interaction occurs at an interaction
tool of a haptic interface that mechanically couples two controlled dynamical systems:
the haptic interface with a computer, and the human user with a central nervous
system. The two systems are exactly symmetrical in structure and information: they
sense the environment, make decisions about control actions, and provide mechanical
energy to the interaction tool through motions.

6.4 System architecture for haptic rendering

Fig:6.4.1 System architecture


Haptic-rendering algorithms compute the correct interaction forces between the haptic interface representation inside the virtual environment and the virtual objects populating that environment. They also ensure that the haptic device correctly renders those forces on the human operator. Typical haptic-rendering algorithms comprise three main blocks, illustrated in the figure above.

Collision-detection algorithms detect collisions between objects and avatars in the virtual environment and yield information about where, when, and ideally to what extent collisions (penetrations, indentations, contact area, and so on) have occurred.

Force-response algorithms compute the interaction force between avatars and virtual objects when a collision is detected. This force approximates as closely as possible the contact forces that would normally arise during contact between real objects. Force-response algorithms typically operate on the avatars' positions, the positions of all objects in the virtual environment, and the collision state between avatars and virtual objects. Their return values are normally force and torque vectors applied at the device-body interface. Hardware limitations prevent haptic devices from applying the exact force computed by the force-response algorithms to the user.
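In its simplest form, the force response for contact with a flat surface can be sketched as a penalty method: a virtual spring pushes the avatar back along the surface normal in proportion to penetration depth. The sketch below is illustrative only; the stiffness value, the plane representation, and the function name are assumptions, not the specific method described above.

```python
import math

def contact_force(avatar_pos, surface_point, surface_normal, k=800.0):
    """Penalty-based force response against a flat surface.

    Penetration depth d is measured along the unit surface normal n;
    the returned force F = k * d * n pushes the avatar back out.
    k = 800 N/m is an assumed, illustrative stiffness.
    """
    nx, ny, nz = surface_normal
    norm = math.sqrt(nx * nx + ny * ny + nz * nz)
    nx, ny, nz = nx / norm, ny / norm, nz / norm
    # Positive depth means the avatar is below the surface.
    depth = ((surface_point[0] - avatar_pos[0]) * nx +
             (surface_point[1] - avatar_pos[1]) * ny +
             (surface_point[2] - avatar_pos[2]) * nz)
    if depth <= 0.0:
        return (0.0, 0.0, 0.0)     # no collision: zero force in free space
    return (k * depth * nx, k * depth * ny, k * depth * nz)
```

For instance, an avatar 1 cm below a horizontal floor whose normal points upward would receive an 8 N upward force.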

Control algorithms command the haptic device so as to minimize the error between the ideal and the applicable forces. The discrete-time nature of haptic-rendering algorithms often makes this difficult, as explained later in this section. The desired force and torque vectors computed by the force-response algorithms feed the control algorithms, whose return values are the actual force and torque vectors commanded to the haptic device.

A typical haptic loop consists of the following sequence of events:

1) Low-level control algorithms sample the position sensors at the haptic interface device joints.

2) These control algorithms combine the information collected from each sensor to obtain the position of the device-body interface in Cartesian space, that is, the avatar's position inside the virtual environment.

3) The collision-detection algorithm uses the position information to find collisions between objects and avatars and reports the resulting degree of penetration.

4) The force-response algorithm computes the interaction forces between avatars and virtual objects involved in a collision.

5) The force-response algorithm sends the interaction forces to the control algorithms, which apply them to the operator through the haptic device while maintaining stable overall behavior.

The simulation engine then uses the same interaction forces to compute their effect on objects in the virtual environment. Although there are no firm rules about how frequently these computations must repeat, a 1-kHz servo rate is common; it is a subjectively acceptable compromise that permits presentation of reasonably complex objects with reasonable stiffness.
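The five steps above can be condensed into a single servo iteration. In the sketch below every argument is a hypothetical placeholder callable; real systems implement each stage against a specific device API and virtual environment.

```python
def haptic_servo_step(read_joint_sensors, forward_kinematics,
                      detect_collision, force_response, clamp_force):
    """One iteration of the haptic loop described in the text.

    The five arguments are assumed stand-ins for device- and
    application-specific code, named after the five steps above.
    """
    joints = read_joint_sensors()              # 1) sample joint position sensors
    avatar = forward_kinematics(joints)        # 2) device-body interface in Cartesian space
    contact = detect_collision(avatar)         # 3) where/when/how deep the collision is
    ideal_force = force_response(avatar, contact)  # 4) ideal interaction force
    return clamp_force(ideal_force)            # 5) force the device can actually apply
```

A real implementation would invoke this function from a timed loop at the servo rate, commonly 1 kHz, i.e. once per millisecond.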


7. APPLICATIONS

The following are the major applications of haptic systems.

7.1 Graphical user interfaces.

Video game makers have been early adopters of passive haptics, which uses vibrating joysticks, controllers, and steering wheels to reinforce on-screen activity. Future video games, however, will let players feel and manipulate virtual solids, fluids, tools, and avatars. The Novint Falcon haptics controller is already making this promise a reality: this 3-D force-feedback controller lets you tell the difference between a pistol report and a shotgun blast, or feel the resistance of a longbow's string as you pull back an arrow.

Fig:7.1.1 Graphical user interface

Graphical user interfaces, like those that define Windows and Mac operating
environments, will also benefit greatly from haptic interactions. Imagine being able to
feel graphic buttons and receive force feedback as you depress a button. Some
touchscreen manufacturers are already experimenting with this technology. Nokia
phone designers have perfected a tactile touchscreen that makes on-screen buttons


behave as if they were real buttons. When a user presses a button, he or she feels it move in and out and hears an audible click.

Nokia engineers accomplished this by placing two small piezoelectric sensor pads under the screen and designing the screen so that it can move slightly when pressed. Movement and sound are synchronized perfectly to simulate real button manipulation.

7.2 Surgical Simulation and Medical Training.

Fig:7.2.1 Medical training

Various haptic interfaces for medical simulation may prove especially useful
for training of minimally invasive procedures (laparoscopy/interventional radiology)
and remote surgery using teleoperators. In the future, expert surgeons may work from
a central workstation, performing operations in various locations, with machine setup
and patient preparation performed by local nursing staff. Rather than traveling to an
operating room, the surgeon instead becomes a telepresence. A particular advantage
of this type of work is that the surgeon can perform many more operations of a similar
type, and with less fatigue. It is well documented that a surgeon who performs more procedures of a given kind has statistically better outcomes for their patients.
Haptic interfaces are also used in rehabilitation robotics.

7.3 Military Training in virtual environment.

From the earliest moments in the history of virtual reality (VR), the United
States military forces have been a driving factor in developing and applying new VR


technologies. Along with the entertainment industry, the military is responsible for the
most dramatic evolutionary leaps in the VR field.

Virtual environments work well in military applications. When well designed, they provide the user with an accurate simulation of real events in a safe, controlled environment. Specialized military training can be very expensive, particularly for vehicle pilots, and some training procedures carry an element of danger when real situations are used. While the initial development of VR gear and software is expensive, in the long run it is much more cost-effective than putting soldiers into real vehicles or physically simulated situations. VR technology also has other potential applications that can benefit military activities.

Fig:7.3.1 Military Training

Today, the military uses VR techniques not only for training and safety enhancement but also to analyze military maneuvers and battlefield positions. Of all the earliest VR technology applications, military vehicle simulations have probably been the most successful. Simulators use sophisticated computer models to replicate a vehicle's capabilities and limitations within a stationary, and safe, computer station.


Ground Vehicle Simulators - Although not as high profile as flight simulators, VR simulators for ground vehicles are an important part of the military's strategy. In fact, simulators are a key part of the Future Combat System (FCS), the foundation of the armed forces' future. The FCS consists of a networked battle command system and advanced vehicles and weapons platforms. Computer scientists designed FCS simulators to link together in a network, facilitating complex training missions involving multiple participants acting in various roles.

The FCS simulators include three computer monitors and a pair of joystick controllers attached to a console. The modules can simulate several different ground vehicles, including non-line-of-sight mortar vehicles, reconnaissance vehicles, and infantry carrier vehicles.

The Army uses several specific devices to train soldiers to drive specialized vehicles such as tanks or the heavily armored Stryker vehicle. Some of these look like long-lost twins of flight simulators. They not only accurately recreate the handling and feel of the vehicle they represent, but can also replicate just about any environment imaginable. Trainees can learn how the real vehicle handles in treacherous weather conditions or difficult terrain, and networked simulators allow users to participate in complex war games.


8. LIMITATIONS OF HAPTIC SYSTEMS

Limitations of haptic devices sometimes make it impossible to apply the exact force value computed by the force-rendering algorithms. The main issues that limit a haptic device's capability to render a desired force or, more often, a desired impedance are given below.

1) Haptic interfaces can exert forces of only limited magnitude, and not equally well in all directions. Rendering algorithms must therefore ensure that no output components saturate, as saturation would lead to erroneous or discontinuous application of forces to the user. In addition, haptic devices are not ideal force transducers.
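A common mitigation is to scale the entire force vector uniformly whenever its magnitude exceeds the device's maximum output; clipping each axis independently would change the force direction and produce exactly the kind of discontinuity described above. A minimal sketch, with an assumed 4 N device limit:

```python
import math

def clamp_force(f, f_max=4.0):
    """Limit a 3-D force vector to the device's maximum output.

    Scaling the whole vector preserves its direction, unlike clipping
    each axis independently. f_max = 4 N is an assumed device limit.
    """
    mag = math.sqrt(f[0] ** 2 + f[1] ** 2 + f[2] ** 2)
    if mag <= f_max:
        return tuple(f)                 # within limits: pass through
    s = f_max / mag                     # uniform scale factor
    return (f[0] * s, f[1] * s, f[2] * s)
```

For example, a requested force of (0, 0, 8) N would be commanded as (0, 0, 4) N, keeping its direction intact.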

2) An ideal haptic device would render zero impedance when simulating movement in free space, and any finite impedance when simulating contact with an object featuring such impedance characteristics. The friction, inertia, and backlash present in most haptic devices prevent them from meeting this ideal.

3) A third issue is that haptic-rendering algorithms operate in discrete time, whereas users operate in continuous time, as the figure below illustrates. While moving into and out of a virtual object, the sampled avatar position will always lag behind the avatar's actual continuous-time position. Thus, when pressing on a virtual object, a user needs to perform less work than in reality; when the user releases, the virtual object returns more work than its real-world counterpart would have. In other words, touching a virtual object extracts energy from it, and this extra energy can cause an unstable response from the haptic device.
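This energy leak can be demonstrated numerically. The sketch below (all constants are illustrative assumptions) presses a point into a virtual spring wall and releases it, holding the force constant between samples as a real servo loop does. An ideal continuous spring would do zero net work over the cycle, but the sampled wall does strictly positive work on the user, and it does more at slower servo rates.

```python
import math

def sampled_wall_energy(dt, k=1000.0, amp=0.01, period=1.0):
    """Net work (in joules) a sampled virtual spring wall does on the
    user over one press-and-release cycle.

    The penetration follows x(t) = amp * sin(pi * t / period); the wall
    force -k*x is held at its last sampled value (zero-order hold) over
    each interval of length dt. k, amp, and period are assumed values.
    """
    n = int(round(period / dt))
    energy, x_prev = 0.0, 0.0
    for i in range(1, n + 1):
        x = amp * math.sin(math.pi * i * dt / period)  # true penetration
        force = -k * x_prev              # force based on the LAGGING sample
        energy += force * (x - x_prev)   # work done on the user this step
        x_prev = x
    return energy
```

Running this at a 1 kHz rate (dt = 0.001) yields a small positive energy, while a 100 Hz rate (dt = 0.01) yields roughly ten times more, which is one reason fast servo rates help keep haptic devices stable.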

4) Finally, haptic device position sensors have finite resolution, so attempting to determine where and when contact occurs always entails a quantization error. Although users might not easily perceive this error, it can create stability problems.
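The effect can be sketched as follows; the 0.5 mm resolution and the stiffness are assumed, illustrative values. Because the sensor reports only discrete positions, the rendered contact force changes in jumps of k times the resolution (0.4 N here) rather than smoothly:

```python
def quantized_contact_force(x, k=800.0, resolution=0.0005):
    """Contact force computed from a finite-resolution position reading.

    x is the true 1-D position (negative = inside a wall at x = 0).
    The sensor reports the nearest multiple of `resolution` (assumed
    0.5 mm), so the rendered force can only change in steps of
    k * resolution -- a quantization error the user may feel and that
    can destabilize the control loop.
    """
    x_measured = round(x / resolution) * resolution  # quantized reading
    penetration = max(0.0, -x_measured)              # depth inside the wall
    return k * penetration
```

At a true penetration of 0.4 mm the ideal force would be 0.32 N, but the quantized reading snaps to 0.5 mm and the device renders 0.4 N instead.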

All of these issues, well known to practitioners in the field, can limit a haptic
application’s realism. The first two issues usually depend more on the device


mechanics; the latter two depend on the digital nature of VR applications.

FUTURE VISION

As haptics moves beyond the buzzes and thumps of today's video games, the technology will enable increasingly believable and complex physical interaction with virtual or remote objects. Already, haptically enabled commercial products let designers sculpt digital clay figures to rapidly produce new product geometry, let museum-goers feel previously inaccessible artifacts, and let doctors train for simple procedures without endangering patients.

Past technological advances that permitted recording, encoding, storage, transmission, editing, and ultimately synthesis of images and sound profoundly affected society. A wide range of human activities, including communication, education, art, entertainment, commerce, and science, were forever changed when we learned to capture, manipulate, and create sensory stimuli nearly indistinguishable from reality. It's not unreasonable to expect that future advancements in haptics will have equally deep effects.

For the field to move beyond today’s state of the art, researchers must
surmount a number of commercial and technological barriers. Device and software
tool-oriented corporate efforts have provided the tools we need to step out of the
laboratory, yet we need new business models. For example, can we create haptic
content and authoring tools that will make the technology broadly attractive?

Can the interface devices be made practical and inexpensive enough to make
them widely accessible? Once we move beyond single-point force-only interactions
with rigid objects, we should explore several technical and scientific avenues.
Multipoint, multi-hand, and multi-person interaction scenarios all offer enticingly rich
interactivity. Adding sub-modality stimulation such as tactile (pressure distribution)
display and vibration could add subtle and important richness to the experience.
Modeling compliant objects, such as for surgical simulation and training, presents
many challenging problems to enable realistic deformations, arbitrary collisions, and


topological changes caused by cutting and joining actions. Improved accuracy and richness in object modeling and haptic rendering are also needed.

Development of multimodal workstations that provide haptic, visual, and auditory engagement will offer opportunities for more integrated interactions. We are only beginning to understand the psychophysical and cognitive details needed to enable successful multimodal interactions. For example, how do we encode and render an object so that there is seamless consistency and congruence across sensory modalities; that is, does it look like it feels? Are the object's density, compliance, motion, and appearance familiar and unconsciously consistent with context? Are sensory events predictable enough that we consider objects to be persistent, and can we make correct inferences about their properties? We hope that answers to these questions will emerge in the near future.


9.CONCLUSION

We conclude that haptic technology provides a degree of interaction that cannot be achieved by BMI or virtual reality alone. Whatever technology we employ, touch access remains important, and haptic technology has transformed how it can be delivered. We are confident that this technology will help make the future world a more tangible one.


10. REFERENCES

1. Purdue Online Writing Lab. http://owl.english.purdue.edu/owl/resource/560/01

2. Elgan, M. (2009). Haptics: the Feel-Good Technology of the Year. CIO. http://www.cio.com/article/498068/Haptics_the_Feel_Good_Technology_of_the_Year?page=1&taxonomyId=3234

3. Israr, A., & Poupyrev, I. (2010, April). Surround Haptics: Immersive Tactile Experiences. Disney Research. http://www.disneyresearch.com/research/projects/hci_surround_haptics_drp.htm