


FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition

Tinglin Duan (University of Toronto, Toronto, Ontario)
Parinya Punpongsanon, Daisuke Iwai, Kosuke Sato (Osaka University, Toyonaka, Osaka)

Figure 1: We propose a system that allows users to employ a drone for environment exploration tasks. (a) The control room, where the user issues commands to the drone. (b) The drone flies to and captures an image of the target object. (c) The user uses a virtual hand to explore the remote environment captured by the drone's front camera; at the same time, the image captured by the drone is used to simulate tactile feedback through an ultrasound array.
ABSTRACT
This paper presents a Head-Mounted Display (HMD) integrated system that uses a drone and a virtual hand to help users explore a remote environment. The system allows users to control the drone with hand gestures and to identify Objects of Interest (OOI) through tactile feedback. It uses a Convolutional Neural Network to classify objects in the drone-captured images and provides a virtual hand to realize interaction with the object. Accordingly, tactile feedback is provided to the user's hand to enhance virtual hand body ownership. The system aims to help users assess spaces and objects regardless of body limitations, which could not only benefit elderly or handicapped people but also contribute to environment measurement and daily life.

CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Interaction devices.

KEYWORDS
Virtual hand illusion, body ownership, haptic perception, human-drone interaction

ACM Reference Format:
Tinglin Duan, Parinya Punpongsanon, Daisuke Iwai, and Kosuke Sato. 2018. FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition. In SIGGRAPH Asia 2018 Technical Briefs (SA '18 Technical Briefs), December 4–7, 2018, Tokyo, Japan. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3283254.3283258

1 INTRODUCTION
The range in which humans can feel and examine the world is sometimes limited by body size and movement capabilities, especially for aging or disabled groups. When an individual wants to examine the surrounding environment, the most intuitive way is to walk around, using the eyes to view and the hands to feel the objects in the environment. However, there are countless scenarios where users either cannot conveniently move or the environment itself is not suitable or reachable for users to explore.

Virtual Reality (VR) has brought about virtual body extension applications for reaching and exploring the virtual world. By fitting the user's actual hand to an existing hand model in virtual reality, users can explore the environment by virtually extending the length of the arm [Ueda et al. 2017; Yamashita et al. 2018]. However, this virtual body extension technique is still limited in terms of exploration range and body ownership. Thus, it is only applicable to exploring the surrounding environment, with users treating the virtual hand as an auxiliary device. Users cannot take advantage of the technique when they want to reach a more distant location or obtain more texture details from the object they interact with.

To overcome this limitation, in this paper we propose a Head-Mounted Display (HMD) integrated system that allows users to reach an unreachable world without physically moving. A drone and a virtual hand act as extra eyes and an extra hand for the user: the drone sees the object, and the virtual hand helps the user examine and feel the target object. The design mainly aims at making the drone control as intuitive as possible while improving the body ownership of the virtual hand. The system consists of three parts. First, the user uses hand gestures to control the drone and capture images of the target object (Figure 1(a)); in parallel, the drone location is tracked and synchronized into the HMD (Figure 1(b)). Second, the captured images are sent back to the system and classified with a trained neural network. Third, the user virtually manipulates the target object with the virtual hand while perceiving the corresponding haptic feedback (Figure 1(c)).

To summarize, our paper makes the following contributions:

• A system that introduces drone control and tactile feedback to the virtual hand, allowing users to manipulate objects and perceive both visual and haptic feedback.
• A user study on the drone control as well as the virtual hand body ownership.

Figure 2: Proposed concept: the system could erase the distance limit between visitors and museums.

2 RELATED WORKS

2.1 Drone for Environment Monitoring
Intelligent and interactive robots can play a great role in environment monitoring and measurement. Teymourzadeh et al. equipped a spider robot with multiple sensors [Teymourzadeh et al. 2013]. The legged robot proved highly efficient in path planning and exploration tasks. However, the exploration range and accuracy of ground robots remain limited because their sensors are mounted at a low viewpoint. Compared with ground robots, drones' viewpoints are less restricted, as they can fly over obstacles and provide real-time data from high above. Drones are also capable of environment monitoring, providing relatively high flexibility and adaptivity [Manfreda et al. 2018]. Delmerico et al. proposed a drone that helps ground robots with real-time path planning and localization [Delmerico et al. 2017; Käslin et al. 2016]. Karjalainen et al. showed that participants preferred a drone companion in a home environment [Karjalainen et al. 2017]; during the experiment, participants treated drones as life assistants that could help them accomplish certain tasks. In addition, Cauchard et al. suggested that hand gesture commands are intuitive when interacting with drones [Cauchard et al. 2015].

2.2 Virtual Hand and Body Ownership
Multiple attempts have been made to help users reach the unreachable world with the help of a virtual hand. Ueda et al. proposed an extended hand interaction that enables users to manipulate home appliances and carry out human-human communication with a projected virtual hand [Ueda et al. 2017]. The model allows users to use a touch panel to control the virtual hand movement while preserving some hand gestures on the virtual model. Instead of using projection-based visualization, Feuchtner and Müller combine the real-world environment with the virtual one [Feuchtner and Müller 2017]. Their design records the locations of real-world objects in the virtual world while fitting the user's hand with a virtual hand model. Users manipulate the virtual hand in the virtual environment (VE), and the results are in turn propagated back to the real world so that the physical objects move accordingly. However, such systems are still restrained by space, and their body ownership is yet limited. This extension span only suits the manipulation of nearby objects, as extending the virtual hand by more than 150cm may break the sense of body ownership [Kilteni et al. 2012]. Our proposed system introduces a drone and tactile feedback to the extendable virtual hand design, which not only enlarges the application range but also preserves perceived body ownership. Drone exploration makes it possible for users to interact with objects more than 150cm away, while the haptic feedback intensifies the users' feeling that the hand extension is actually part of their body.

3 SYSTEM DESIGN

3.1 Assumption
We assume a remote task scenario in which the target object is difficult for the user to reach physically, such as inside a cave or above a shelf, and therefore requires a remote interaction technique, such as a drone, to explore the scene.

3.2 System Architecture
3.2.1 Hardware Setup. We separate the implemented system into two interaction spaces: the experiment room and the control room. The experiment room consists of a tracking system (OptiTrack Flex13 cameras) and a drone (Tello drone by Ryze Tech Inc.). To make the drone trackable, it is fitted with retro-reflective markers. We implement our virtual hand in Unity3D and feed the camera image in as the background, which the user observes through an HMD.

Figure 3: System implementation. (a) A tracking setup with a drone, and (b) a Tello drone fitted with retro-reflective markers for tracking capability.

Figure 4: The diagram indicates the separate controllers for the drone and the virtual hand.

We assume that the target objects used in this study are a sphere and a cube. The sphere has a radius of 1.86cm and the cube has an edge of 3cm, giving both target objects similar volumes (26.95cm³ and 27cm³, respectively). Figure 3 shows the implementation of the experiment room.

The control room consists of a force feedback device (Ultrahaptics Array Development Kit) with an embedded hand tracking system (Leap Motion) and an HMD (Oculus Rift DK2). All equipment is connected to a PC (Intel Core i7, 64GB RAM, Nvidia GeForce GTX970 with 4GB VRAM) and communicates through Python scripts. Users view all objects through the drone's front camera in the HMD and use their real hand to control the drone. The hand gestures are tracked with the Leap Motion and mapped directly to the drone movement. At the same time, tactile feedback is generated by the Ultrahaptics array according to the virtual hand location and the target object. The control room issues commands to the drone (in the experiment room) through a wireless network, and the drone transmits its location information back via the same network.
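As an illustration of this command link (not part of the published implementation; the gesture-to-command mapping below is an assumption), the Tello accepts plain-text SDK commands over UDP, which a Python control script could drive as follows:

```python
import socket

# The Tello listens for plain-text SDK commands on UDP 192.168.10.1:8889
# when acting as its own Wi-Fi access point.
TELLO_ADDR = ("192.168.10.1", 8889)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("", 9000))   # local port on which the drone's replies arrive
sock.settimeout(3.0)

def send(cmd: str) -> str:
    """Send one SDK command and return the drone's 'ok'/'error' reply."""
    sock.sendto(cmd.encode("ascii"), TELLO_ADDR)
    try:
        reply, _ = sock.recvfrom(1024)
        return reply.decode("ascii", errors="replace")
    except socket.timeout:
        return "timeout"

send("command")   # enter SDK mode
send("takeoff")
send("left 20")   # e.g. issued when the hand is held left of the Leap Motion
send("land")
```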
3.2.2 Controller. In order to make the best use of the human body and treat the drone as an extension of the user's hand, we use the Leap Motion to issue control commands to the drone. The movement controls are set by the relative position of the user's hand and the Leap Motion; for instance, when the user places their hand to the left of the Leap Motion, the system sends a command to the drone to move left. Since the drone moves more slowly than the human gesture it follows, we implement a transfer function f that defines the relative threshold between the command input x and the physical movement f(x) of the drone as follows:

    f(x) = 0                          if x ∈ [0, α]
    f(x) = ((x − α) / (1 − α))^β      if x ∈ [α, 1]

where α controls the beginning of the drone motion and β controls the slope of the function.
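Read literally, the transfer function is a dead zone followed by a power-law ramp. A minimal Python sketch (the actual α and β values are not reported in the paper, so the defaults here are placeholders):

```python
def drone_motion(x: float, alpha: float = 0.2, beta: float = 1.5) -> float:
    """Map a normalized command input x in [0, 1] to drone movement f(x).

    Inputs inside the dead zone [0, alpha] produce no motion; beyond it,
    motion ramps from 0 to 1 with a slope shaped by beta.
    """
    if not 0.0 <= x <= 1.0:
        raise ValueError("command input must be normalized to [0, 1]")
    if x <= alpha:
        return 0.0
    return ((x - alpha) / (1.0 - alpha)) ** beta
```

The dead zone keeps small, unintentional hand drift from moving the drone, while β > 1 makes the response gentle near the threshold and steeper toward full deflection.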
To separate the control of the drone from that of the virtual hand, we implement an angle threshold that determines whether the drone or the virtual hand is moved. Figure 4 shows the diagram of the separate controllers for the drone and the virtual hand. When the user reaches their hand within 45 degrees, the system sends the command to the drone; when the user reaches the hand beyond 45 degrees, the system sends the command to the virtual hand. With this interaction metaphor, the user perceives force feedback from the haptic feedback device located at the front of the hand tracker.
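A sketch of this 45-degree routing rule, assuming the angle is measured between the tracker's forward axis and the direction from the tracker to the hand (the paper does not specify the reference axis):

```python
import math

ANGLE_THRESHOLD_DEG = 45.0

def route_command(hand_pos, origin=(0.0, 0.0, 0.0)):
    """Return which target a tracked hand pose should drive.

    hand_pos/origin are (x, y, z) in the tracker's frame; the angle is
    taken between the forward (z) axis and the origin-to-hand vector.
    """
    dx, dy, dz = (h - o for h, o in zip(hand_pos, origin))
    angle = math.degrees(math.atan2(math.hypot(dx, dy), dz))
    return "drone" if angle <= ANGLE_THRESHOLD_DEG else "virtual_hand"
```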
3.2.3 Estimation of Remote Environments. Since the camera only feeds the captured images back to the system, it is not possible to estimate the distance between the target object and the drone from the images alone. To solve this problem, we estimate the distance from the drone's position to the target object using our external tracking system. Thus, we can accurately calculate the drone position and the range of the virtual hand. As shown in Figure 1(c), we realize the extension of the hand by adopting a relative linear model for the hand model. Taking the actual hand position a for each frame, the virtual hand position v, and the gain g, we update the virtual hand position as follows:

    v_t = v_{t−1} + g · (a_t − a_{t−1})

where t and t − 1 denote the current and previous frames, respectively. The gain value can be tuned to meet different needs.
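Per frame, the virtual hand therefore integrates the real hand's displacement scaled by the gain; a minimal sketch (the vector representation and the gain value are assumptions):

```python
import numpy as np

def update_virtual_hand(v_prev: np.ndarray, a_prev: np.ndarray,
                        a_curr: np.ndarray, gain: float = 3.0) -> np.ndarray:
    """v_t = v_{t-1} + g * (a_t - a_{t-1}): the virtual hand accumulates
    the real hand's frame-to-frame displacement, amplified by the gain g."""
    return v_prev + gain * (a_curr - a_prev)
```

With g = 1 the virtual hand mirrors the real one; larger gains extend reach at the cost of positioning precision.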
3.2.4 Image Classification for Haptic Mapping. Once the drone finds the target object, it takes an image and transmits it back to the PC through the network. We then apply color filtering to remove background noise and feed the image sequence to a Convolutional Neural Network (CNN) to classify the target object. We use a CNN with a 5-layer architecture, each layer consisting of a 2D convolution layer and a 2D max-pooling layer. We sampled a total of 12400 images for the training set and 2000 images for the validation set, and found an accuracy of 98.64% and 96.5% on the training and validation sets, respectively.
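The paper does not publish the network definition, but a 5-block stack of 2D convolution and max-pooling layers of the kind described might look as follows in Keras (the input size, filter widths, and the two-class sphere/cube output are assumptions):

```python
from tensorflow.keras import layers, models

def build_classifier(input_shape=(128, 128, 3), num_classes=2):
    """Five Conv2D + MaxPooling2D blocks followed by a small dense head."""
    model = models.Sequential()
    model.add(layers.Input(shape=input_shape))
    for filters in (16, 32, 64, 64, 128):
        model.add(layers.Conv2D(filters, 3, padding="same", activation="relu"))
        model.add(layers.MaxPooling2D(2))
    model.add(layers.Flatten())
    model.add(layers.Dense(64, activation="relu"))
    model.add(layers.Dense(num_classes, activation="softmax"))
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model
```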
4 EVALUATION

4.1 Procedure
We conducted a user experiment to evaluate the effectiveness of the proposed method in eliciting body ownership and the usefulness of the proposed drone control method. The evaluation was separated into two parts. First, the participants were asked to control the drone with hand gestures to move it forward and backward, observing the drone movement via the HMD. Second, the participants were asked to use the virtual hand to touch the target object.

We conducted a qualitative evaluation with 11 participants (10 males, 1 female, 20 to 29 years old). Only one participant had experience with a force feedback device, and two participants had experience with an HMD. All participants had normal or corrected-to-normal vision.

Figure 5: Boxplots of user feedback for each question on a five-point Likert scale.

4.2 Questionnaire
Once participants finished the experiment, they were asked to fill in a questionnaire giving feedback on their experience with the proposed system. The questionnaire was designed as follows:

Q1: I felt my hand movement was causing the movement of the drone.
Q2: I felt my hand gesture could make the drone move constantly.
Q3: I felt my hand movement was causing the movement of the virtual hand.
Q4: I felt the virtual hand was part of my body.

The participants rated each statement from 1 (strongly disagree) to 5 (strongly agree).
4.3 Result
Figure 5 shows the subjective scores obtained for each question. We confirm that our subjective questionnaire has good reliability using Cronbach's alpha test (α = 0.702). Overall, the best-rated questions were Q1 "I felt my hand movement was causing the movement of the drone" (avg. 4.0), Q3 "I felt my hand movement was causing the movement of the virtual hand" (avg. 4.36), and Q4 "I felt the virtual hand was part of my body" (avg. 4.09).
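For reference, Cronbach's alpha for k questionnaire items is α = k/(k−1) · (1 − Σσᵢ²/σ_total²); a small sketch of the computation on a participants-by-items response matrix (the raw responses are not published, so the input here is illustrative):

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Cronbach's alpha for a (participants x items) matrix of Likert scores."""
    k = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance per question
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of sum scores
    return k / (k - 1) * (1 - item_variances.sum() / total_variance)
```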
4.4 Discussion
The experiment results show that the proposed method makes it easy to control the drone movement (Q1). However, direction control is realized by exerting forces on the drone: for example, when a user issues the command "backward" to a drone flying "forward", the flight mechanism needs to slow down before it can move in the other direction. Users therefore perceive a latency when manipulating the drone, which is reflected in the lower score for Q2.
Regarding the improvement of body ownership with the proposed method, we found that the tactile feedback greatly enhances the experience when users interact with the target object, which is reflected in the higher score for Q3. We thus conclude that the body ownership of the virtual hand improved with the introduction of tactile feedback. However, given the limited degrees of freedom of the device's tactile representation, we noticed that the user might lose body ownership once the position of the tactile feedback differs from that of the actual hand.
5 APPLICATIONS
From the experimental results, we confirm that the proposed system improves the experience of body ownership with a virtual hand. We envision an application of this system as a virtual museum experience, as shown in Figure 2. This application could erase the distance limit between visitors and museums. Visitors could remotely rent a museum drone that has the classification ability to identify different paintings and sculptures. They could use their hands to steer the drone around the museum and stop at the sculpture of their interest. Then, the visitors could manipulate and examine the details of the artworks with their virtual hands while receiving tactile feedback based on those details.
hancement when users interact with the target object, which reflects Y. Ueda, Y. Asai, R. Enomoto, K. Wang, D. Iwai, and K. Sato. 2017. Body Cyberization
to the higher score in Q3. The body ownership of the virtual hand by Spatial Augmented Reality for Reaching Unreachable World. In 8th Augmented
Human International Conference (AH ’17). ACM, Article 19, 9 pages. https://doi.
could be thus concluded to have improved with the introduction of org/10.1145/3041164.3041188
tactile feedback. However, with the limited degree of freedom of the S. Yamashita, R. Ishida, A. Takahashi, H. Wu, H. Mitake, and S. Hasegawa. 2018. Gum-
tactile representation of the device, it is noticed that the user might gum Shooting: Inducing a Sense of Arm Elongation via Forearm Skin-stretch and
the Change in the Center of Gravity. In ACM SIGGRAPH 2018 Emerging Technologies
lose the body ownership once the distance between the actual hand (SIGGRAPH ’18). ACM, Article 8, 2 pages. https://doi.org/10.1145/3214907.3214909
and tactile feedback is different.
