FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition
Figure 1: We propose a system that allows users to employ a drone for an environment exploration task. (a) The control room, where the user issues commands to the drone. (b) The drone flies and captures an image of the target object. (c) Users use a virtual hand to explore the remote environment captured by the drone's front camera; at the same time, the image captured by the drone is used to simulate tactile feedback through an ultrasound array.
ABSTRACT
This paper presents a Head-Mounted Display (HMD) integrated system that uses a drone and a virtual hand to help users explore a remote environment. The system allows users to use hand gestures to control the drone and to identify Objects of Interest (OOI) through tactile feedback. The system uses a Convolutional Neural Network to perform object classification on the drone-captured image and provides a virtual hand to realize interaction with the object. Accordingly, tactile feedback is also provided to the users' hands to enhance virtual hand body ownership. The system aims to help users assess spaces and objects regardless of body limitations, which could not only benefit elderly or handicapped people but also make potential contributions to environment measurement and daily life.

CCS CONCEPTS
• Human-centered computing → Mixed / augmented reality; Interaction devices;

KEYWORDS
Virtual hand illusion, body ownership, haptic perception, human-drone interaction

ACM Reference Format:
Tinglin Duan, Parinya Punpongsanon, Daisuke Iwai, and Kosuke Sato. 2018. FlyingHand: Extending the Range of Haptic Feedback on Virtual Hand using Drone-based Object Recognition. In SIGGRAPH Asia 2018 Technical Briefs (SA '18 Technical Briefs), December 4–7, 2018, Tokyo, Japan. ACM, New York, NY, USA, 4 pages. https://doi.org/10.1145/3283254.3283258
1 INTRODUCTION
The range in which humans can feel and examine the world is sometimes limited by body size and movement capabilities, especially for aging or disabled groups. When an individual wants to examine the surrounding environment, the most intuitive way is to walk around, using the eyes to view and the hands to feel the objects in the environment. However, there are countless scenarios where users either cannot move conveniently or the environment itself is not suitable or reachable for users to explore.

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. Copyrights for components of this work owned by others than the author(s) must be honored. Abstracting with credit is permitted. To copy otherwise, or republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee. Request permissions from [email protected].
SA '18 Technical Briefs, December 4–7, 2018, Tokyo, Japan
© 2018 Copyright held by the owner/author(s). Publication rights licensed to ACM.
ACM ISBN 978-1-4503-6062-3/18/12…$15.00
https://doi.org/10.1145/3283254.3283258
Virtual Reality (VR) has brought about virtual body extension applications for reaching and exploring the virtual world. By fitting a user's actual hand to an existing hand model in virtual reality, users can explore the environment by virtually extending the length of the arm [Ueda et al. 2017; Yamashita et al. 2018]. However, this virtual body extension technique is still limited in terms of exploration range and body ownership. Thus, it is only applicable to exploring the surrounding environment, with users treating the virtual hand as an auxiliary device. Users cannot take advantage of the technique when they want to reach a more distant location or obtain more texture details from the interaction object.

To overcome this limitation, in this paper we propose a Head-Mounted Display (HMD) integrated system that allows users to reach an unreachable world without physically moving. A drone and a virtual hand act as extra eyes and an extra hand for the users: the drone sees the object, and the virtual hand helps the users examine and feel the target objects. The design mainly aims at making the drone control as intuitive as possible while improving the body ownership of the virtual hand. The system consists of three separate parts. First, the user uses hand gestures to control the drone and capture images of the target object (Figure 1(a)); in parallel, the drone location is tracked and synchronized into the HMD (Figure 1(b)). Second, the captured images are sent back to the system, which performs classification with a trained Neural Network. Third, the user virtually manipulates the target object with the virtual hand while perceiving the corresponding haptic feedback (Figure 1(c)).

To summarize, our paper makes the following contributions:
• A system that introduces drone control and tactile feedback to the virtual hand, allowing users to manipulate objects and perceive both visual and haptic feedback.
• A user study on the drone control as well as the virtual hand body ownership.

2.2 Virtual Hand and Body Ownership
Multiple attempts have been made to help users reach the unreachable world with the help of a virtual hand. Ueda et al. proposed an extended hand interaction that enables users to manipulate home appliances and engage in human-human communication with a projected virtual hand [Ueda et al. 2017]. The model allows users to use a touch panel to control the virtual hand's movement while preserving some hand gestures on the virtual model. Instead of using projection-based visualization, Feuchtner and Müller combine the real-world environment with the virtual one [Feuchtner and Müller 2017]. Their design records real-world object locations in the virtual world while fitting a user's hand with a virtual hand model. Users manipulate the virtual hand in the virtual environments (VEs), and the results are in turn updated back to the real world to make the physical objects move accordingly. However, these systems are still restrained by space, and their body ownership is limited. This extension span only suits nearby object manipulation, and extending the virtual hand beyond 150 cm may break the sense of body ownership [Kilteni et al. 2012]. Our proposed system introduces a drone and tactile feedback to the extendable virtual hand design, which not only improves the application range but also preserves perceived body ownership. Drone exploration makes it possible for users to interact with objects more than 150 cm away, while the haptic feedback intensifies the users' feeling that the hand extension is actually a part of their body.

3 SYSTEM DESIGN
3.1 Assumption
We assume a remote task scenario in which the target object is difficult for the user to reach physically, such as inside a cave or on top of a shelf. It therefore requires a remote interaction technique, such as a drone, to explore the scene.
Figure 3: System implementation. (a) A tracking setup with a drone, and (b) a Tello drone assembled with retro-reflective markers for tracking capability.

Figure 4: The diagram indicates the separate controllers for the drone and the virtual hand.
5 APPLICATIONS
From the experimental results, we confirm that the proposed system improves the experience of body ownership with the virtual hand. We envision an application of this system as a virtual museum experience, as shown in Figure 2. This application could erase the distance limit between visitors and museums. Visitors could remotely rent a museum drone that has the classification ability to identify different paintings and sculptures. Users could use their hands to control the drone to fly around the museum and stop at the sculpture of their interest. Then, the visitors could manipulate and examine the details of the artworks with their virtual hands, while receiving tactile feedback based on the artworks' details.

Figure 5: Boxplots of user feedback for each question on a five-point Likert scale.
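One way the museum scenario's "tactile feedback based on the artworks' details" could work is to scale the ultrasound stimulus by a per-artwork surface property. The sketch below is purely illustrative and assumes a hypothetical roughness table; the paper does not specify how object details are mapped to feedback.

```python
# Illustrative sketch (not from the paper): derive an ultrasound intensity in
# [0, 1] from an assumed per-artwork surface-roughness value.

ROUGHNESS = {"marble_sculpture": 0.9, "oil_painting": 0.3}  # assumed values

def museum_feedback(artwork: str, roughness: dict) -> float:
    """Return a tactile intensity: a small baseline plus a roughness term."""
    r = roughness.get(artwork, 0.0)   # unknown artworks get baseline only
    return min(1.0, 0.2 + 0.8 * r)    # clamp so intensity never exceeds 1.0

if __name__ == "__main__":
    print(museum_feedback("marble_sculpture", ROUGHNESS))
```

A rough sculpture surface thus yields a stronger stimulus than a flat painting, giving visitors a coarse material cue even without physical contact.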