GraphVR: A Virtual Reality Tool for the Exploration of Graphs with HTC Vive System
Abstract—This work examines the use of virtual reality to visualize and interact with three-dimensional graphs. We developed the design to be as natural and intuitive as possible, through analysis, study, and use of several layout algorithms, which allow the nodes of a graph to be positioned to reduce entropy levels. We chose to use this schematic visualization of graphs because it can describe the graphical data synthetically and the links between the data while presenting the data in a visual format. The application was developed entirely using Unreal Engine version 4, and the visualization was performed using the head-mounted display HTC Vive.

Keywords-Virtual Reality; Graph Exploration; Human-Computer Interaction;

I. INTRODUCTION

Recent interest in social networks, software architectures, planning, and scheduling has led to the application of graph visualization and exploration to assist analysts with relevant visual cues to understand the intrinsic structure of data. The amount of such structured data is constantly growing, and graph visualizations can help their comprehension by showing hidden structures and other interesting topological features in graphical format. Numerous effective approaches are used to visualize a graph in two-dimensional (2D) space. Graph drawing uses an algorithm to address the problem of constructing an automatic visualization of graphs, the goal being to derive an aesthetically pleasing picture that follows the layout convention of a given application domain. However, humans have an inherent ability to understand representations of objects in three-dimensional (3D) space rather than 2D space [1].

3D graph visualization is a relatively new field. Having one extra dimension enables the visualization of complex systems in which navigation techniques, graph structures, and interface solutions play major roles. In particular, with real-time 3D exploration and interaction, users can navigate a graph and observe it from different points of view, move one or more nodes, or group unimportant nodes into clusters to reduce information overload. In this way, because the user can navigate more effectively, 3D graph visualization is intuitively understandable and provides further information about a graph's hierarchical structure. Whether 3D space can be beneficial for information visualization tasks in general is still an open question [2], [3], [4], but 3D graph visualizations in particular offer great benefits [5], [6].

Today, the representation of 3D space is possible using virtual reality (VR). Although this technology was first developed in 1970, it has only recently become widely available through low-cost devices such as Oculus Rift [7] and HTC Vive [8]. In a virtual environment, the user's location is the focal point of the scene, and there is freedom in the user's viewing direction because the entire sphere of directionality around that point is available. Visibility of the scene from the perspective of the user's location is vital. However, in most VR applications, user interaction is based on an input device such as a keyboard, mouse, or joystick. Interaction in 3D space within virtual environments presents various challenges. Because users of VR devices lose sight of their hands, traditional mouse and keyboard interaction is limited, and so actions take place in the virtual environment without using a toolbar or a menu.

Although several works have explored stereoscopic graph visualization [5], [6], [9], very few empirical studies have investigated the applicability of VR in the field of graph visualization and interaction [10]. One important aspect of graph interaction is that it enables users to make changes to the graph drawing based on users' input. This is particularly useful when the graph is very large, because, on the one hand, it reduces the difficulty and the time it takes the user to understand the information represented by the graph, while on the other hand, it increases the amount of information that can be interpreted and understood by the user. In 2D space interaction, completing a generic action requires at least two steps: click a specific button located outside the graph region, and then move the mouse cursor back inside the graph region to execute the desired action. This two-step paradigm is disadvantageous because the user is forced to move in and out of the graph region repeatedly, thus limiting interactivity within the graph.

In this paper, we present a virtual reality system to visualize and interact with three-dimensional graphs. In particular, we developed an application that uses the Unreal Engine and the head-mounted display HTC Vive that enables the user
the base class for an object that can be positioned inside the scene. The base scene of the system is composed of a Plane Actor and a Volume Cube; the cube contains the objects that simulate the virtual room. Lighting inside the scene is performed through the actors called BPSkySphere and SkyLight, which adjust the general properties of the surrounding environment. Inside the room are two of the most important actors: (i) PlayerStart, which indicates the position in the scene where the Character will be created; and (ii) PrincipalBP, a Blueprints script associated with a C++ class called Principal, which represents the core of the software system. Although Unreal Engine 4 allows one single class per level to spawn actors, this task is performed by the Principal class, which is an actor without a mesh or other graphical components and creates all the objects relative to the programming and visualization section.

A graph is built by parsing a Graph Exchange XML Format (GEXF) file, which contains details about the visual components (nodes and edges) and their spatial information. GEXF keeps the graph compatible with visualizations produced by Gephi. Unreal Engine 4 can use several plugins to provide different types of features; the GEXF file parsing is performed using a plugin called XmlParser. This plugin is instanced through the Principal class and can be used to check the validity of the GEXF file. After reading the file, the system iterates over the list of nodes and edges contained in it, subsequently creating the Model that holds the visual and spatial components through two classes called ModelNode and ModelEdge. Each element of the model is then associated with the corresponding actor to visualize it inside the scene.
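To make this parsing step concrete, the following sketch reads a GEXF file with Unreal Engine 4's XmlParser module (FXmlFile). The structure names FGraphModel, FModelNode, and FModelEdge, and the function LoadGexf, are illustrative placeholders and not the paper's actual implementation:

// Minimal sketch of GEXF parsing with UE4's XmlParser module (FXmlFile).
// FGraphModel, FModelNode, and FModelEdge are illustrative placeholders.
#include "CoreMinimal.h"
#include "XmlFile.h"

struct FModelNode { FString Id; FString Label; };
struct FModelEdge { FString SourceId; FString TargetId; };
struct FGraphModel { TArray<FModelNode> Nodes; TArray<FModelEdge> Edges; };

bool LoadGexf(const FString& Path, FGraphModel& OutModel)
{
    FXmlFile File(Path);                        // parses and validates the GEXF file
    if (!File.IsValid()) { return false; }

    const FXmlNode* Graph = File.GetRootNode()->FindChildNode(TEXT("graph"));
    if (!Graph) { return false; }

    // Iterate the node list and create one model element per <node>.
    if (const FXmlNode* Nodes = Graph->FindChildNode(TEXT("nodes")))
    {
        for (const FXmlNode* N : Nodes->GetChildrenNodes())
        {
            OutModel.Nodes.Add({ N->GetAttribute(TEXT("id")),
                                 N->GetAttribute(TEXT("label")) });
        }
    }

    // Iterate the edge list and create one model element per <edge>.
    if (const FXmlNode* Edges = Graph->FindChildNode(TEXT("edges")))
    {
        for (const FXmlNode* E : Edges->GetChildrenNodes())
        {
            OutModel.Edges.Add({ E->GetAttribute(TEXT("source")),
                                 E->GetAttribute(TEXT("target")) });
        }
    }
    return true;
}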
Actors called Node and Edge (C++ classes) are associated with the classes NodeBP and EdgeBP, respectively. The structure of the NodeBP script is a tree of components whose root, called Root Component, is attached to a Static Mesh consisting of a spherical mesh, a customizable material, and a component (Floating Pawn Movement) able to interpret the movements applied to the actor and transmit them to its own mesh. A Billboard, implemented using Blueprints with a Text component, is also linked to the Static Mesh; it keeps the node identity continually visible to the user by rotating the text component based on the user's viewing angle. Inside the scene, nodes are not affected by gravity, and collisions are disabled to allow the Character to cross them without being blocked. EdgeBP is an actor composed using a native Unreal Engine 4 plugin; it is structured with a Static Mesh applied to the Spline component, and it exposes two properties that allow it to connect to two actors (nodes).
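A possible C++ counterpart of such a component tree is sketched below. The class name ANodeSketch, the choice of APawn as parent class, and the exact component setup are assumptions made for illustration; the paper builds NodeBP as a Blueprints script on top of its own Node class:

// NodeSketch.h -- illustrative component tree for a graph node: a spherical
// Static Mesh as root, a text label acting as a billboard, and a Floating
// Pawn Movement component. All names here are placeholders.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Pawn.h"
#include "Components/StaticMeshComponent.h"
#include "Components/TextRenderComponent.h"
#include "GameFramework/FloatingPawnMovement.h"
#include "NodeSketch.generated.h"

UCLASS()
class ANodeSketch : public APawn
{
    GENERATED_BODY()

public:
    ANodeSketch()
    {
        Mesh = CreateDefaultSubobject<UStaticMeshComponent>(TEXT("Mesh"));
        RootComponent = Mesh;

        // Nodes ignore gravity and collisions, so the Character can pass through them.
        Mesh->SetEnableGravity(false);
        Mesh->SetCollisionEnabled(ECollisionEnabled::NoCollision);

        // Text label kept facing the user (rotated toward the camera at run time).
        Label = CreateDefaultSubobject<UTextRenderComponent>(TEXT("Label"));
        Label->SetupAttachment(Mesh);

        // Interprets movements applied to the actor and transmits them to the mesh.
        Movement = CreateDefaultSubobject<UFloatingPawnMovement>(TEXT("Movement"));
    }

    UPROPERTY(VisibleAnywhere) UStaticMeshComponent*  Mesh;
    UPROPERTY(VisibleAnywhere) UTextRenderComponent*  Label;
    UPROPERTY(VisibleAnywhere) UFloatingPawnMovement* Movement;
};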
Finally, two other important classes make up our software system: the Avatar class and the Layout class. The first is composed of the Character, a special actor able to directly receive the user's input and encode it to interact with the virtual environment. The Layout class is responsible for computing the graph node positions using the layout algorithm selected by the user at runtime.

IV. INTERACTIVE DESIGN

The GraphVR tool we designed is built from scratch using Unreal Engine version 4 and the C++ language. To visualize the graph in 3D space we used a common graph layout algorithm based on a force-directed model [19]. The purpose of this algorithm is to position the nodes of the graph in 3D space so that all the edges are of more or less equal length and there are as few crossing edges as possible. During the rendering of the 3D graph, the tool allows a user to freely explore and interact with the 3D visualization. In particular, the graph is visualized in real time, and the user can navigate around the graph using a free-fly 3D camera. The user can move inside the graph and also pass through the nodes. The interaction method for graph exploration in 3D space is based on computer-game interaction, where control is mostly reduced to operating standard input devices such as the mouse and keyboard. In our case, we use the HTC Vive Controllers to interact with the 3D graph and to move the camera. The rationale is that, over the course of the development of the genre, this interaction method has been refined and is now fairly standardized. In addition, in the design phase we identified as requirements fast interaction, hand-eye coordination, and reaction speed, which are central to action games [20]. This approach enables a 3D human-computer interaction in which the user's tasks are performed directly in the 3D spatial context [21]. For instance, moving, resizing, and grouping nodes are performed directly in the 3D environment, bypassing the classic graphical control elements through which interactions between humans and machines usually occur.
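As an illustration of the layout computation described above, the following fragment sketches a single relaxation step of a generic 3D force-directed scheme: all node pairs repel each other and connected nodes attract, so edge lengths tend to even out. This is a simplified Fruchterman-Reingold-style update, not necessarily the exact formulation used by the tool or by [19]:

// One relaxation step of a generic 3D force-directed layout: every pair of
// nodes repels, every edge attracts, and each node is displaced along the
// accumulated force (capped by StepSize). Simplified illustration only.
#include "CoreMinimal.h"

void ForceDirectedStep(TArray<FVector>& Positions,
                       const TArray<TPair<int32, int32>>& Edges,
                       float IdealLength, float StepSize)
{
    TArray<FVector> Forces;
    Forces.Init(FVector::ZeroVector, Positions.Num());

    // Repulsive force between every pair of nodes.
    for (int32 i = 0; i < Positions.Num(); ++i)
    {
        for (int32 j = i + 1; j < Positions.Num(); ++j)
        {
            const FVector Delta = Positions[i] - Positions[j];
            const float Dist = FMath::Max(Delta.Size(), 0.01f);
            const FVector Repulse = (Delta / Dist) * (IdealLength * IdealLength / Dist);
            Forces[i] += Repulse;
            Forces[j] -= Repulse;
        }
    }

    // Attractive force along every edge, pulling its endpoints together.
    for (const TPair<int32, int32>& E : Edges)
    {
        const FVector Delta = Positions[E.Key] - Positions[E.Value];
        const float Dist = FMath::Max(Delta.Size(), 0.01f);
        const FVector Attract = (Delta / Dist) * (Dist * Dist / IdealLength);
        Forces[E.Key]   -= Attract;
        Forces[E.Value] += Attract;
    }

    // Move each node; the cap keeps the layout stable between frames.
    for (int32 i = 0; i < Positions.Num(); ++i)
    {
        Positions[i] += Forces[i].GetClampedToMaxSize(StepSize);
    }
}

Iterating this step until the displacements become small yields edges of roughly equal length, as required above.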
V. GRAPHVR

We designed GraphVR with the aim of offering users a more natural and intuitive level of interaction. Interaction with the components in a scene is made possible through the HTC Vive Controllers. To query the scene, we use a technique called raycasting [22]. One ray starts from each controller along the direction of its forward vector to a previously set end location. The ray is constantly shown through a Cable Component. A viewfinder is used to target and interact with the objects in the scene. Moving a controller in the scene also results in the motion of the ray attached to it. The scene is constantly queried, and when the ray hits an Actor, the system checks whether it is an element with which it is possible to interact and eventually activates further features.
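A minimal sketch of this per-frame query using Unreal Engine 4's standard line trace API is shown below; the function and variable names are illustrative, and the check on the hit Actor is only hinted at:

// Illustrative scene query: cast a ray from a motion controller along its
// forward vector and check whether the hit Actor can be interacted with.
#include "CoreMinimal.h"
#include "MotionControllerComponent.h"
#include "Engine/World.h"

void QueryControllerRay(UWorld* World,
                        UMotionControllerComponent* Controller,
                        float RayLength)
{
    const FVector Start = Controller->GetComponentLocation();
    const FVector End   = Start + Controller->GetForwardVector() * RayLength;

    FHitResult Hit;
    FCollisionQueryParams Params(FName(TEXT("GraphVRRay")), /*bTraceComplex=*/false);

    if (World->LineTraceSingleByChannel(Hit, Start, End, ECC_Visibility, Params))
    {
        if (AActor* HitActor = Hit.GetActor())
        {
            // If HitActor is a graph node, further features can be activated,
            // e.g. showing its details when the trigger is pressed.
        }
    }
}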
When a controller's grip button is pressed, the Root Component of each node is attached to one Actor. When the controller is moved, the distance between its initial and final position is computed as a vector just before the rendering of
Figure 1. 1. Menu button (shoulder), 2. Up, down, right and left movement (left trackpad), 3. Unused (system button), 4. Details of the node (trigger), 5. Move or scale the graph (grip button), 6. Rotate the graph (right thumbstick).
that allows the avatar to ignore gravity and simulate flight. This allows the user to examine the graph from above or another preferred point of view. When gravity is disabled, the user can move around in the scene simply by looking in the desired direction and by pressing the directional buttons.
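A minimal sketch of how such a gravity-free flight mode can be toggled on the avatar's Character in Unreal Engine 4 is given below; the paper does not show its actual implementation, so this is only one possible realization under that assumption:

// Illustrative toggle between walking and flying for the avatar's
// CharacterMovementComponent: in flying mode gravity is ignored and the
// user can inspect the graph from any point of view.
#include "GameFramework/Character.h"
#include "GameFramework/CharacterMovementComponent.h"

void ToggleFlight(ACharacter* Avatar)
{
    UCharacterMovementComponent* Move = Avatar->GetCharacterMovement();

    if (Move->MovementMode == MOVE_Flying)
    {
        Move->SetMovementMode(MOVE_Walking);      // gravity applies again
    }
    else
    {
        Move->SetMovementMode(MOVE_Flying);       // gravity is ignored
        Move->BrakingDecelerationFlying = 2048.f; // stop when input is released
    }
}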
We implemented two types of layouts, which can be selected by the user. The first is called Base Layout [24] (Figure 4) and is similar to Gephi's 2D layout. The second layout is called Force-Directed [19] (Figure 5) and is implemented in 3D.

Figure 4. Base layout.
VI. CONCLUSION AND FUTURE WORKS

GraphVR supports graphs of small to medium dimensions only, typically with a maximum of around 300 nodes. This limit is due to the management of the number of Actors, one for each node. It is also necessary to consider all the actors that represent the arcs, lights, avatar, billboards, and the rest of the scene. Because the computing complexity is very high, it was necessary to find a compromise between rendering quality and system fluidity by reducing the level of detail [25] of the actors. The idea for the future is to continue to develop this application by using different types of layout algorithms. More importantly, we are going to include real empirical studies to investigate how this proposed work contributes to the research and development communities that use VR. In particular, we are planning an extensive and representative experimental study involving a large sample of people with more diversified technological skills as well as with no knowledge of the whole setting. The aim is to study how non-gamer users, as well as users who are not familiar with the considered domain, would react to the GraphVR tool.
REFERENCES

[1] U. Erra, G. Scanniello, and N. Capece, “Visualizing the Evolution of Software Systems Using the Forest Metaphor,” in 2012 16th International Conference on Information Visualisation, July 2012, pp. 87–92.
[2] R. Brath, “3D InfoVis is here to stay: Deal with it,” in 2014 IEEE VIS International Workshop on 3DVis (3DVis), Nov. 2014, pp. 25–31.
[9] B. Alper, T. Hollerer, J. Kuchera-Morin, and A. Forbes, “Stereoscopic Highlighting: 2D Graph Visualization on Stereo Displays,” IEEE Transactions on Visualization and Computer Graphics, vol. 17, no. 12, pp. 2325–2333, Dec. 2011.

[10] O. H. Kwon, C. Muelder, K. Lee, and K. L. Ma, “A Study of Layout, Rendering, and Interaction Methods for Immersive Graph Visualization,” IEEE Transactions on Visualization and Computer Graphics, vol. 22, no. 7, pp. 1802–1815, July 2016.

[11] M. Bastian, S. Heymann, and M. Jacomy, “Gephi: An Open Source Software for Exploring and Manipulating Networks,” 2009. [Online]. Available: http://www.aaai.org/ocs/index.php/ICWSM/09/paper/view/154

[12] P. Shannon, A. Markiel, O. Ozier, N. S. Baliga, J. T. Wang, D. Ramage, N. Amin, B. Schwikowski, and T. Ideker, “Cytoscape: a software environment for integrated models of biomolecular interaction networks,” Genome Research, vol. 13, no. 11, pp. 2498–2504, 2003.

[23] G. Caggianese, L. Gallo, and P. Neroni, Design and Preliminary Evaluation of Free-Hand Travel Techniques for Wearable Immersive Virtual Reality Systems with Egocentric Sensing. Cham: Springer International Publishing, 2015, pp. 399–408.

[24] G. D. Battista, P. Eades, R. Tamassia, and I. G. Tollis, “Algorithms for drawing graphs: an annotated bibliography,” Computational Geometry, vol. 4, no. 5, pp. 235–282, 1994.

[25] D. P. Luebke, “A developer’s survey of polygonal simplification algorithms,” IEEE Comput. Graph. Appl., vol. 21, no. 3, pp. 24–35, May 2001. [Online]. Available: http://dx.doi.org/10.1109/38.920624