A FRAMEWORK FOR PROCESS DATA COLLECTION, ANALYSIS, AND
VISUALIZATION IN CONSTRUCTION PROJECTS
by
REZA AKHAVIAN
B.S. University of Tehran, 2010
Summer Term
2012
© 2012 Reza Akhavian
ABSTRACT
Automated data collection, simulation, and visualization can substantially enhance process management. In particular, managing processes that are dynamic in nature can significantly benefit from such techniques. Construction projects are good examples of such processes, where providing decision-makers with updated information about the status of project entities and assisting site personnel in making critical decisions under uncertainty are constant needs. To this end, the current practice of using
historical data or expert judgments as static inputs to create empirical formulations, bar
chart schedules, and simulation networks to study project activities, resource operations,
and the environment under which a project is taking place does not seem to offer reliable
results.
This research presents a data-driven modeling framework capable of collecting and analyzing real time field data from construction equipment. In the developed data collection scheme, a stream of real time sensor data is collected, classified, and analyzed to detect the durations of ongoing activities, and is used to update the contents of a discrete event simulation (DES) model representing the real engineering process. The generated data-driven simulation model is an effective tool for projecting future progress based on existing performance. Ultimately, the developed framework can be used by project decision-makers for short-term project planning and control, since the resulting simulation and visualization are continuously updated using the latest data obtained from the field.
ACKNOWLEDGMENTS
The present document is my final Master's Thesis. This Thesis describes my graduate research work.
I would like to thank many people in my life who have contributed to my academic
journey. Above all, my sincere appreciation is due to my advisor Dr. Amir H. Behzadan
for his continuous support, endless patience and unstinting encouragement during my
Master’s studies and in the course of this research. He is truly a dedicated teacher and
incredible advisor. Thank you for believing in me, giving me the opportunity to succeed, and helping me grow as a researcher.
My special thanks to my other committee members, Dr. Oloufa and Dr. Tatari who
offered guidance and support. I am always indebted to Dr. Oloufa for his motivational
lessons about construction engineering and management during the very first semesters of my graduate studies.
There are some other people to whom I am deeply grateful for their presence in my
life; my lovely parents Hossein Akhavian and Maryam Batebi who were both teachers
and I have been and will always be their student. Their love and patience have always been a source of support for me, particularly during the days I have been far away from them.
I would also like to express my appreciation for Negin Alimohammadi, my love, wonderful friend, and great source of emotional support and motivation during my Master's studies.
I am also thankful to all my friends, especially Saeed Hadian and Sina Zel Taat for their
help and friendship and all the nice experiences we had during these years.
TABLE OF CONTENTS
LIST OF FIGURES ............................................................................................................. x
CHAPTER 4: VISUALIZATION WITH OPENSCENEGRAPH (OSG)........................ 25
6.2 Comprehensive Example: Data-Driven Simulation ........................................... 69
REFERENCES ................................................................................................................. 98
LIST OF FIGURES
Figure 3.2: Mandel et al. Developed DDDAS for real time modeling of Wildfire .......... 21
Figure 4.2. Hierarchical Scene Graph and Relationships between Different Nodes ........ 29
Figure 5.2: Datagram Structure of the PNI TCM Prime Orientation Tracker .................. 37
Figure 5.8: Activity Durations Based on the Variation of Equipment Body Orientation
with Respect to Time (RB = Raise Bucket, LT = Load Truck, LB = Lower Bucket, RTB
Figure 5.9: Single VI Containing Data Acquisition and Data Analysis Functions .......... 50
Figure 6.2: A Prime 3D Orientation Tracker Mounted on a Model Excavator with
Figure 6.4: Front Panel of Data Collection VI for a Single Loader .................................. 62
Figure 6.5: Real Time Display of Loader's Boom Movements and Corresponding 3D
Figure 6.6: Orientation Trackers Mounted on a Loader's Boom and a Truck’s Bed ........ 65
Figure 6.7: Front Panel of Data Collection VI for a Double Object Experiment ............. 66
Figure 6.8: Partial View of the Extensive Block Diagram Developed in this Research .. 67
Figure 6.9: Real Time Display of Loader's Boom and Truck’s Bed Movements and
Figure 6.14: Developed VI for Data Collection and Analysis for Rock Hauling Example
........................................................................................................................................... 74
Durations ........................................................................................................................... 75
Figure 6.16: STROBOSCOPE Simulation Output File Based on the Updated Durations 76
Figure A. 3: CreateModel() Function Flowchart......................................................... 88
Figure B. 1: A Customized VI - The Upper Window is the Front Panel and the Bottom
Figure B. 2: VISA Open Opens the Specified Port by the VISA Resource Name ............. 93
Figure B. 3: VISA Write Writes Data to the Specified Port by the VISA Resource Name 94
Figure B. 4: VISA Read Reads Data from the Specified Port by the VISA Resource Name
........................................................................................................................................... 94
Figure B. 5: VISA Close Closes the Specified Port by the VISA Resource Name ............ 94
Figure B. 6: A Series of VISA Functions and Their Connections as Used in this Research
........................................................................................................................................... 94
Figure B. 7: Requested Data Classified from Cluster of Real Time Orientation Data ..... 95
Figure B. 8: Unbundled By Name Function that Returns Cluster Elements Whose Names
Figure B. 9: Greater? Function Returns True If x Is Greater than Y - This Function Was
Figure B. 10: Tick Count Function That Returns the Value of a Timer – This Function
Figure B. 11: Build Array Function to Store Activity Durations in a Numerical Array... 96
Figure B. 12: Statistics Tool Returns the Specified Statistical Characteristics of Input
Arrays ................................................................................................................................ 97
LIST OF TABLES
jobsites .............................................................................................................................. 12
Table 6.2: Comparison between Estimated Durations and Actual Durations Based on
CHAPTER 1: INTRODUCTION
The efficiency of various construction tasks, including the planning and control of ongoing operations, can be significantly improved if field data is collected in real time, analyzed, and effectively integrated into the decision-making process. This real time field data stream can be used as a reliable source to modify project plans,
validate and improve existing control metrics, and update the underlying parameters of
computer models (e.g. simulation and visualization) describing the interactions between
different project resources, all in an effort to assist project personnel in predicting the future performance of the project.
Resource planning and control at the operations level are critical components of every construction project. A comprehensive operations level plan can help project decision-makers and site personnel
foresee potential problems such as spatial conflicts and resource underutilization even
before the actual operation takes place. This will also help save effort that would have
otherwise been put into reworks, resolving conflicts, and performing change orders, which will ultimately translate into significant savings in project time and cost. For example,
Cox et al. [2] suggested that rework is typically responsible for 6-12% of the overall construction cost, and an industry task force (DART) reported that annually, more than $60 billion was spent on change
orders in the United States [3]. Also, according to the Federal Facilities Council (FFC), in
10-30% of all construction projects serious disputes are estimated to arise with a total
cost of resolution between $4-12 billion each year [4]. One of the major impediments in this area is the static nature of project plans, specifications, and work schedules. This will become even more complicated when the dynamics of the construction project create several layers of uncertainty that can range
from internal factors (e.g. project time and cost variations, equipment breakdowns,
contractor claims) to external events (e.g. weather conditions, financial market stability).
Computer applications have thus evolved during the past several years to facilitate the
process of project planning by providing a convenient and reliable means for modeling,
simulating, and visualizing project activities [5, 6, 7, 8, 9, 10, 11]. In order to create
reliable computer models of a future construction project during the planning stage, one
needs to carefully examine every detail of the operations within that project, and identify
major events and processes that will potentially impact the outcome of each operation.
Once such events and processes are identified, attributes such as resource consumption
levels and activity durations should be determined. For a small operation, this can be
done in a relatively short period of time using existing numerical tools and statistical data
from past projects. However, as the size of the operation increases and the number of interacting resources and activities grows, creating a model that realistically represents the actual operation becomes a tedious if not an impossible task
[12]. This is mainly due to the fact that collecting accurate and reliable field data from
ongoing activities and resource operations, and integrating the collected data into the
planning process turns into a challenging task. In addition, the uncertainties caused by
unforeseen site conditions, equipment breakdowns, work delays, and the evolving nature
of a construction project may slow down or interrupt the progress of data collection. Even
if all such data is collected, handling a large volume of information in a single platform
can prove to be time and labor intensive. As a result, it is very likely that the modeler
uses strict rules, simplifying assumptions, and rigid design parameters inside the model to
streamline the modeling process. These may seriously impact the accuracy of the model
in representing the dynamics of the project, which will ultimately be detrimental to the quality of the decisions made based on its output.
Traditional simulation paradigms employ static data and information available from
similar projects and operate under a given set of system design parameters (e.g. activity durations). In the absence of a methodology that facilitates real time field data collection, most project decision-makers rely on readily available historical data and expert judgment. In recent years, advancements in automation and information technology have resulted in new approaches for collecting and managing construction work data. In particular, automated tracking systems have evolved to collect necessary information about the position of construction resources for
different purposes [15, 16, 17]. Timely use of field data to determine the location and
status of resources (e.g. construction equipment and personnel) helps in describing the
context surrounding the operations and therefore is valuable for monitoring the workflow
of activities during these operations. Also, field data supports operational decisions and
helps predict the performance of a construction system based on the latest project status.
Another valuable implication of field data acquisition is the application of the collected data in creating 3D visualizations of various operations on a construction site. Visualizing field data has been demonstrated to
have many applications such as maintenance crew training [18], safety management [19],
and damage prevention [20]. But from the point of view of planning, monitoring, and
control, 3D visualization not only offers a convenient tool for decision-makers to get a real insight into what is exactly happening in a jobsite (particularly for operations that are hard to quantify or represent in a parametric model), but also is of substantial value for verifying and validating the underlying simulation models. This is important because decision-makers often do not have the time and knowledge to confirm
the accuracy and validity of simulation models and thus do not usually rely on the results
obtained from such models [10]. In addition, visualization assists in investigating events
that are hard to be quantified in a definitive manner, but yet can affect the final outcome.
Examples of such events include work zone overcrowding due to simultaneous execution
of different trades in building construction, safety problems, and potential for physical
collisions.
The benefits of construction field data collection, simulation, and visualization have been
investigated in isolated cases in the past. However, the potential of these three promising
techniques when integrated in a single framework that facilitates the process of short-
term planning and control of construction projects at the operations level has not yet been
explored [21]. Hence, the presented research is mainly motivated by this need and is
aimed to fill this gap by investigating the requirements and applicability of an integrated
framework that uses the paradigm of dynamic data-driven simulation to address the
problem of short-term operational level planning and control. The underlying concepts of this approach are discussed in Chapter 3.
The overall objective of this study is to design a framework for integrating field data collection, simulation, and visualization for short-term planning and control of construction projects. In order to achieve this objective, the following research tasks were defined:
Investigate the requirements and design a functional system to collect real time field data from construction equipment.
Build data classification and analysis methods to provide orderly data and link the results to the simulation and visualization components of the framework.
The following Chapters of this Thesis are shaped around the concepts, details, and
implementation of the research tasks listed above. This Thesis is divided into seven
Chapters. In particular:
Chapter 1: Introduction – This Chapter describes the gaps that motivated this research, the novel approach that this study adopts to address the identified gaps, and the overall objective and tasks defined and pursued accordingly.
Chapter 3: Dynamic data-driven application system (DDDAS) – This Chapter describes the concept of dynamic data-driven simulation and outlines its applications in various fields of science and engineering.
Chapter 4: Visualization with OpenSceneGraph (OSG) – The technical aspects of the visualization toolkit that has been used in this research are presented in this Chapter, and it is shown how the proposed methodology benefits from 3D visualization of ongoing operations.
Chapter 7: Conclusions and Future Work – A discussion about the identified gaps in knowledge and the developed research methodology for addressing these gaps is presented in this Chapter, and future research directions for further development of the framework are outlined.
CHAPTER 2: LITERATURE REVIEW
In Chapter 1, a general introduction to the research was presented and the motivation,
potential contributions, research objective, and project tasks were described in detail. The presented research aims to address the gaps identified in the current body of knowledge considering current demands in the areas of automated data collection, simulation, and visualization. In this Chapter, a review of the relevant literature is conducted, in an effort to put the presented work into context and demonstrate its potential contributions to the research community.
Collecting accurate and reliable data is one of the most critical components of every
decision support system. Data captured manually using traditional onsite data collection
techniques can be outdated, inaccurate, or missing certain pieces [22, 23, 24, 25, 26].
It has also been reported that site personnel spend a considerable portion of their time on recording and analyzing field generated data [23]. Saidi et al. [27] stated that
having accurate and updated information about the status of construction operations
remains an issue in the construction industry. As a result, automated data collection and
resource location tracking techniques have received credibility over the past several
years, as they facilitate processes including but not limited to resource management,
productivity analysis, quality control, and monitoring workflow processes. To this end,
work still needs to be done in order to take advantage of such technologies when
planning activities at early stages of a project, where the scope of the work and the available resources are still subject to change.
Automated resource (personnel, equipment, materials) tracking has been the subject of
many studies in construction and facility management [15, 28, 29, 30, 31, 32]. Resource
location tracking applications use different techniques for indoor and outdoor
environments. A variety of outdoor and indoor location tracking technologies exist, each with its own advantages and limitations. Radio Frequency Identification (RFID), for example, has been increasingly used for tracking
purposes in construction jobsites. RFID systems use tags and a reader which sends radio
frequency signals to read data from the tags. One of the early attempts in using RFID in the construction industry was made by Jaselskis et al. [33]. They proposed RFID for
tracking high-valued materials on construction jobsites. Song et al. [28] used RFID to
automate the task of tracking the delivery and receipt of fabricated pipe spools in lay
down yards and under shipping portals. Since RFID readers and tags do not require line-
of-sight, the readers can detect several tags at a time, and the tags can function properly
in harsh conditions. However, the short reading range which is mostly a function of the
communication frequency can be an obstacle for the use of RFID systems in large
construction sites [34]. Researchers have also used the Global Positioning System (GPS)
for its capability in tracking construction labor and equipment in outdoor environments
and construction sites [16, 35, 36]. GPS is an outdoor satellite-based worldwide radio-navigation system that provides positioning information to end users [37]. To address the challenge faced by equipment operators who have limited
field view and depth perception when they control equipment remotely with video
cameras, Oloufa et al. [35] developed a system for collision detection and vehicle
tracking by using differential GPS, wireless, and web-based technologies. The most
important impediment in using GPS is that its functionality is, for the most part, limited
to outdoor environments since a clear line-of-sight between the satellites and the GPS
receiver is always needed. More recently, there have also been some attempts in
combining RFID with GPS technology [31, 38]. Jang et al. [15] introduced an automated tracking approach that combines the two in order to overcome the drawbacks of GPS and RFID systems in terms of accuracy and cost.
Another technology that has been studied for automated tracking is Ultra Wide Band
(UWB). Teizer et al. [39] developed a UWB data collection tool for work zone safety. In indoor environments where Global Navigation Satellite System (GNSS) data is not available, indoor positioning technologies are used.
RFID and UWB can be used in both indoor and outdoor environments. GPS, as stated
before, has generally been developed for outdoor environments only. However, another
technology called indoor GPS has recently emerged which is not satellite-based [32].
Wireless Local Area Network (WLAN) is another technology used for indoor tracking
and localization [32]. Inertial Navigation Systems (INS), and other systems such as Bluetooth, Infrared, and Ultrasonic, are other examples of indoor tracking technologies. In general, motion-based devices sense motion and its attributes such as velocity, acceleration, and heading direction. For position sensing, inertial navigation systems (INS) or inertial measurement units (IMU), which typically consist of gyroscopes, accelerometers, and magnetometers [41], are used. Using an IMU, the current state of the target in terms of location, speed, and heading direction can be determined by using state estimation techniques.
Behzadan et al. [42] developed an augmented reality (AR) hardware framework in which
they used orientation trackers capable of measuring compass heading using magnetic
field sensors.
Vision-based tracking has lately started to gain credibility among researchers. In a recent study,
Brilakis et al. [43] presented an automated framework for vision based tracking using two
cameras. Although this method seems to overcome the disadvantages of existing sensor-
based techniques such as limited coverage area and dependence on preinstalled tags on
the objects, it is still more costly and requires a more involved maintenance and calibration process. As Table 2.1 suggests, and to the author's best knowledge, the application of real time data collection for the purpose of planning and monitoring of equipment motions has not yet
been investigated.
Table 2.1: Examples of automated location tracking technologies and their applications in construction jobsites
Khoury and Kamat (2009) – Tracking Mobile Users – UWB / Indoor GPS / WLAN
2.2 Simulation in Construction
Simulation is a valuable tool for effective construction planning and management. However, most existing approaches to construction simulation have one thing in common: almost all of them assume that when the simulation model is created, sufficient data with an adequate level of detail is readily available, mainly in the form of historical records from similar projects or expert thoughts and judgments (which may prove to be subjective). It is clear that with such input,
there is almost no guarantee that the generated output reliably reflects the expected
performance of project entities, since the bulk of the data do not particularly belong to
that project.
Discrete event simulation (DES) has gained a lot of interest among researchers since almost every construction operation can be broken down into a set of discrete activities and events that must be completed [44]. DES models provide an effective means to establish logical relationships
between activities within a project which compete over and make use of available and
often scarce resources. The introduction of CYCLONE [45] marked the beginning of a
new era in modern construction simulation research. CYCLONE aimed to simplify the
modeling of processes that are cyclic in nature. Subsequently, many attempts were made to develop new construction simulation tools. Examples include INSIGHT [6], which enabled videotaping of field operations, and extracting and analyzing videotaped data to obtain estimated values for the productivity of the system and its resources, and MODSIM [46], capable of translating a simulation code to the C language for compilation and execution. Later, an activity-based construction modeling and simulation method called ABC [9] was developed.
A DES system called COOPS was introduced by Liu and Ioannou [7] which used object-
oriented design for simulation. Martinez and Ioannou [44] examined DES systems based on their underlying modeling paradigms and developed a construction planning tool to track the performance of individual resources and handle complex resource interactions.
Real time simulation has been explored by researchers in several engineering and
scientific fields. For example, Hunter et al. [49] developed a simulation model based on
inflow data aggregated over a short time interval to create an accurate estimate of the system's near-term behavior. Another study suggested a generic simulation platform for real time DES modeling in healthcare and
manufacturing applications. Also, a yard crane dispatching algorithm based on real time
data driven simulation was proposed by Guo et al. [51] to solve the problem of yard crane
job sequencing by minimizing the average vehicle waiting time. In the construction
domain, however, despite previous work in real time data collection and processing, very
limited amount of research has been done in effectively incorporating field data into an
existing simulation model for short-term planning and control of the same operations.
Chung et al. [52] suggested using Bayesian techniques to update the distributions of input
parameters for tunnel simulation by “manually” collecting project data from a tunneling
project on a bi-weekly basis and using the collected data to improve simulation input
models. Also, Song et al. [12] described a framework of real time simulation for short-
term scheduling of heavy construction operations and developed a prototype system for
To date, only a limited number of previous projects investigated the planning and
control of engineering systems through real time simulation using the latest changes in
activity patterns and interactions. When a simulation system does not use accurate input data, the resulting output should be evaluated with prudence. Abourizk
et al. [53] discussed that random input tends to propagate to the output of the simulation
model. They warned against improper modeling of input data by demonstrating the sensitivity of the output parameters, as well as resource utilization, to the input model used.
2.4 Visualization in Construction
The role of visualization in construction engineering and management has been generally recognized; however, visualization of the construction processes (i.e. the interactions between equipment, personnel, and materials) that result in a constructed facility has received very little attention [54,
55]. Almost all of the efforts in this area concentrated on visualization of “simulated”
construction operations. Schematic modeling such as DISCO, iconic animation [56], and
2D system visualizations such as PROOF [57] are some of the first generation systems
intended for visualizing simulated construction operations. More recently, Kamat and Martinez developed a general purpose 3D visualization system driven by an animation description language.
Confirming the veracity and validity of the simulated construction operation is a major challenge. Verification and validation of the simulation model can be conveniently performed if a similar, yet pre-processed animation representing the actual ongoing activities exists. Having both pre- and post-processed animations in a similar visualization environment side by side facilitates the comparison between the real world system and the model.
CHAPTER 3: DYNAMIC DATA-DRIVEN APPLICATION SYSTEM
(DDDAS)
3.1 Overview
A system capable of offering real time analysis of concurrent construction operations must be able to project future performance by using the incoming data streams to simulate the actual operations. To achieve this, the concept of a relatively new simulation paradigm often referred to as dynamic data-driven application system (DDDAS), and its potential in the realm of construction, is explored in this Chapter. The idea is to capture field data at the information layer, integrate the collected data with the corresponding simulation model to constantly adapt the model to the dynamics of the construction system, and constantly update it based on the latest collected operational data [58]. Although the dynamic nature of construction operations requires continuous incorporation of the collected data into the simulation model in response to the evolving conditions, many computational models used to date only allow fixed data inputs when the simulation is launched [14].
Initially, DDDAS was conceived by the National Science Foundation (NSF) in 2000
following two catastrophic events. The first was the missed prediction of the track and
magnitude of a storm that blanketed a number of cities from South Carolina to New
England in January 2000, and the other was the failure of a simulation model to predict
the propagation and behaviors of a fire near Los Alamos National Laboratory in May
2000 mainly due to the changing nature of fire and consequently, the inability of
emergency response agencies to take appropriate actions to limit its propagation [59].
Scientists believed that such miscalculations were due to computer simulation models
that were unable to incorporate real time changing conditions on the ground [59].
Advancements in sensing, data acquisition, and simulation modeling provided the necessary tools for accurate measurement and injection of
necessary data into corresponding simulation models and enabled the development of the
DDDAS. Figure 3.1 is a schematic diagram showing the basic components of a DDDAS
(as introduced by the NSF) consisting of the following modules: data acquisition tools,
simulation model, dynamic data control and acquisition, and visualization and human
interface.
Data acquisition tools refer to field equipment used for remote data collection such as
wireless sensors and instruments. Simulation model represents those models that need to
be updated based on the stream of the incoming data. Dynamic data control and
acquisition includes algorithms for data analysis used to prepare data for representation
and input modeling. Finally, visualization and human interface refers to the human expert
interaction to steer the model (if needed) and determine answers to critical decision-
making problems based on the simulation results. These components and their interactions form the basis of the DDDAS concept, and most DDDAS platforms, including what was developed in this research, are built upon them. However, many engineering fields are yet to benefit from the opportunity offered through employing this concept.
In a research aimed at forecasting the wildfire behavior, Mandel et al. [60] proposed a
DDDAS that included coupled weather and fire numerical models, an automated data
acquisition and control module, and a visualization and user interface module. The data acquisition and control module directs data to the numerical models where multiple simulations are running.
Synchronously, the simulation inputs are adjusted based on the actual measurements of
the field. Also, simulation results are presented through the visualization and user interface module. In this example, data collection was performed using wireless network sensors and cameras
mounted on airplanes. Personal digital assistant (PDA) devices were also used as
convenient visualization and user interface tools while numerical model ran on a remote
supercomputer. Figure 3.2 shows how this particular application has been built upon the general DDDAS concept.
Figure 3.2: Mandel et al. Developed DDDAS for real time modeling of Wildfire
In another study, the DDDAS concept was applied to an environmental engineering setup. The researchers considered the case of a contaminant spill occurring near a clear water aquifer. Sensors were used to measure where the contaminant was, how and in what direction it was moving, and to monitor the affected area. Interpolation techniques were used in order to map sensor data and to continuously update the simulation model. The study demonstrated that frequent updating of the sensor data in the simulation model improves the reliability of its predictions.
Gaylor et al. [62] indicated that in case of a crisis, management should make decisions in
order to react to dynamic uncertain conditions. In this regard, having access to real time
data in a format that can be readily understood and acted upon is critical. Therefore,
they applied the concept of DDDAS to support emergency medical treatment decisions in
crisis conditions. Their complex dynamic environment fed and responded to a stream of real time data, including positional data coming from GPS trackers mounted on emergency response resources.
The NSF has also proposed some applications in workshops held for introducing
DDDAS. An interesting DDDAS application is traffic light control, since there are
always two significant variants: whether the plan is to minimize or to maximize the
number of red lights encountered. As stated by NSF 2000 [59], the ultimate goal should
be to continuously optimize the timing of the traffic lights. Using DDDAS and based on a
sophisticated model, data generated by sensors embedded under streets and also other
factors such as weather conditions can assist in predicting and optimizing the flow of traffic.
Unlike several other scientific fields, the idea of DDDAS has been given very little
attention in engineering simulation in general, and has not been widely applied to
construction research in particular. DDDAS enables a more accurate prediction of how a
dynamic construction system will behave in the future based on the current status of its
constituents (i.e. resources). Therefore, construction projects can benefit from this novel
Traditionally, there has been a major disconnect between DES modeling (which is mainly
conducted at the planning stage) and the actual site dynamics (during the construction
phase). Incorporating the concept of DDDAS into the modeling process can help
significantly improve conventional DES modeling. For example, more realistic activity durations can be derived by measuring data collected from different pieces of equipment involved in that activity. In
short, DDDAS facilitates the process of tailoring an existing DES model to better meet
the evolving conditions of the real system using the latest data as input to the
The DDDAS technique designed and implemented in this research captures sensor-based
real time data from resources on a jobsite, classifies and analyzes the collected data into a meaningful format for the following modules, incorporates the analyzed data to update
the corresponding DES model, and creates an exact dynamic 3D visualization of the
ongoing operations using the collected data, all in an effort to assist project decision-
makers in short-term operations planning and control [63]. Figure 3.3 illustrates a schematic view of the developed framework.
Figure 3.3: Developed DDDAS in the Presented Research
As shown in this Figure, the framework is built upon the general concept of DDDAS. Real time data collected from ongoing construction operations moves through a data analysis module to provide the required information for updating the data-driven simulation model, while a 3D visualization of the operations serves as the human interface module. A detailed description and the system architecture of the developed framework are presented in Chapter 5.
CHAPTER 4: VISUALIZATION WITH OPENSCENEGRAPH (OSG)
4.1 Overview
In this research, OpenSceneGraph (OSG), which is built upon the industry standard OpenGL graphics library, is used inside the .NET environment to create pre-processed animations of ongoing equipment activities and to link each and every object motion inside the animation to the collected field data that represent the actual motion of that object. This Chapter provides technical details about the algorithms developed to create such animations. OSG provides essential means for creating a contextual animation. To facilitate the creation of complex scenes, the concept of scene graphs was implemented in this research. Generally, a scene graph is a hierarchical data structure consisting of a set of nodes that together construct a scene [64]. Computer graphics implementations built upon the concept of scene graphs release the end user from implementing and optimizing low level routines for describing and rendering the scene [41]. The scene graph application programming interface (API) provides a means for creating and manipulating scene components at a higher level of abstraction.
OSG is a collection of open-source libraries that provide scene management and graphics
rendering optimization functionality to applications. It has been written in ANSI C++ and
uses the industry standard OpenGL low-level graphics API [65]. Although there are a few
other scene graph-based libraries such as Performer, Open Inventor, and Java3D, this
research used OSG due to the fact that it is capable of reading various image and 3D model file formats. At the same time, OSG provides the functionalities required to describe a complex scene
using an object-oriented representation which releases the user from implementing and
optimizing low level graphical programming and facilitates rapid development of graphic
applications.
In OSG terminology, a node is an object that can be part of or entirely comprise a scene graph. Each node, as a collection of one or more values and methods, encapsulates what is required to be drawn. The root node is the highest level node to which all the elements of a scene graph (directly or indirectly) are connected [66]. Each scene graph is comprised of nodes in a graph structure that are connected together via individual child-parent relationships. The edges that connect the nodes describe a meaningful relationship that exists between them. The root node is usually connected to intermediate grouping nodes called internal or group nodes. These nodes are commonly responsible for 3D transformations such as translation, rotation, and resizing (scaling). Leaf nodes are the lowest level nodes that contain the geometrical description of the components and are located at the terminus of a branch [67]. Figure 4.1 shows a sample scene graph in which Jobsite is the root node. Scene sub-graphs are created and
attached to the root node to complete the scene structure by encapsulating the entire
jobsite. In Figure 4.1, sub-graphs Truck, Excavator and Terrain are all child nodes of
Jobsite. Also, nodes Excavator and Truck have their own child nodes at the lowest level
of the hierarchy.
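To make the child-parent structure concrete, the following minimal C++ sketch assembles a scene graph like the one in Figure 4.1 using standard OSG classes (osg::Group, osg::MatrixTransform, osgDB::readNodeFile). The model file names are hypothetical placeholders, and the sketch is illustrative rather than the exact code developed in this research.

```cpp
// Illustrative sketch: assembling the Figure 4.1 hierarchy with OpenSceneGraph.
// File names are placeholders; error handling is omitted for brevity.
#include <osg/Group>
#include <osg/MatrixTransform>
#include <osgDB/ReadFile>
#include <osgViewer/Viewer>

int main()
{
    // Root node of the scene graph ("Jobsite" in Figure 4.1)
    osg::ref_ptr<osg::Group> jobsite = new osg::Group;

    // Internal (group/transform) nodes for the articulated equipment
    osg::ref_ptr<osg::MatrixTransform> truck = new osg::MatrixTransform;
    osg::ref_ptr<osg::MatrixTransform> excavator = new osg::MatrixTransform;

    // Leaf nodes hold the geometry loaded from CAD/VRML files
    truck->addChild(osgDB::readNodeFile("truck_body.wrl"));
    excavator->addChild(osgDB::readNodeFile("excavator_body.wrl"));

    // Terrain is a static leaf attached directly to the root
    jobsite->addChild(osgDB::readNodeFile("terrain.wrl"));
    jobsite->addChild(truck.get());
    jobsite->addChild(excavator.get());

    // Display the complete scene graph
    osgViewer::Viewer viewer;
    viewer.setSceneData(jobsite.get());
    return viewer.run();
}
```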
Using transformation nodes, each geometrical model is created in its own local
coordinate frame, stored as a leaf node in the scene graph, and appropriate placement of
the model in terms of position and orientation will be made inside the coordinate frame of
its parent node. Scene graph developers can manipulate the translation, rotation, and scale of each node in the scene.

The overall transformation of a child object relative to its parent node is obtained by multiplying the individual transformation matrices:

\[ {}^{Parent}T_{Child} = T \, R_x \, R_y \, R_z \, S \]

where the first matrix represents the translation of the child node with respect to its parent node, the second, third, and fourth matrices are the rotations about the local X, Y, and Z axes, respectively, and the fifth matrix is the scale matrix. Considering a scene
consisting of a loader and a truck, Figure 4.2 shows the hierarchical scene graph and the relationships between different nodes.
Figure 4.2. Hierarchical Scene Graph and Relationships between Different Nodes
Using the concept of scene graphs, if the angle of rotation of a child node about the X
axis of its parent node is γ, the default value for this angular motion can be set to zero to
represent the initial rotation matrix of the child node relative to the parent node, as
follows,
\[
R_x(\gamma)\Big|_{\gamma=0} =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos\gamma & -\sin\gamma & 0 \\
0 & \sin\gamma & \cos\gamma & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}_{\gamma=0}
=
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & 1 & 0 & 0 \\
0 & 0 & 1 & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\]
If a motion sensor capable of detecting angular motions is connected to the real object
being represented by the child node in the scene graph, as soon as the rotation angle
about the local X axis (also called the pitch angle) changes due to a change in the real
object’s orientation, the sensor determines the change, the collected value is used to
update the value of γ, and consequently the above rotation matrix is updated. For
example, the truck bed shown in Figure 4.2 is rotated upward by 45˚ from its initial
orientation. When this change is detected, the new pitch angle is used to update the rotation matrix:

\[
R_x(\gamma)\Big|_{\gamma=45^\circ} =
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos\gamma & -\sin\gamma & 0 \\
0 & \sin\gamma & \cos\gamma & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}_{\gamma=45^\circ}
=
\begin{bmatrix}
1 & 0 & 0 & 0 \\
0 & \cos 45^\circ & -\sin 45^\circ & 0 \\
0 & \sin 45^\circ & \cos 45^\circ & 0 \\
0 & 0 & 0 & 1
\end{bmatrix}
\]
This new rotation matrix will then be used to update the overall transformation matrix of the child (bed) node relative to its parent (truck) node, \( {}^{Parent}T_{Child} \). The animation is updated in each frame according to the overall transformation matrices of all the objects that exist in the scene.
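As a minimal illustration of this update step, the sketch below applies a newly measured pitch angle to the transform node of the truck bed using OSG's matrix utilities. The function name and the way the angle reaches the application are assumptions; the actual framework obtains the angle from the orientation tracker data stream.

```cpp
// Illustrative sketch: updating a child node's rotation about its local X axis
// (pitch) from a sensor reading. Names are hypothetical; only the OSG calls
// (osg::Matrix::rotate, setMatrix) reflect the real toolkit API.
#include <osg/MatrixTransform>
#include <osg/Matrix>
#include <osg/Math>

void applyMeasuredPitch(osg::MatrixTransform* bedTransform, double pitchDegrees)
{
    // Build R_x for the measured pitch angle (e.g. 45 degrees for the raised bed)
    osg::Matrix rotation = osg::Matrix::rotate(
        osg::DegreesToRadians(pitchDegrees), osg::Vec3d(1.0, 0.0, 0.0));

    // Overwrite the child (bed) transform; the overall ParentT_Child matrix is
    // re-evaluated by OSG when the next frame is rendered
    bedTransform->setMatrix(rotation);
}
```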
A viewpoint defines the view of the described scene. OSG provides several utilities to arrange desired viewpoints
from which the viewer can watch the scene. The position and orientation of a viewpoint
can be manipulated while the scene is being displayed to achieve the desired view of a
scene graph or different views of the same scene graph. Moreover, it is possible to set
several viewpoints with different positions and angles to have various views of the same
scene graph as if depicting a single scene with different cameras installed in distinct
places [68]. In OSG, viewpoint definition is independent from the actual scene graph structure.
Creating an animation of the scene objects requires that a dynamic relationship between
scene graph components is first established. Such relationship can be obtained through
dynamically manipulating the values of the transformation matrices in the scene graph.
Since field data is collected at discrete points over time, a realistic animation showing exact movements of real world objects needs a
smooth transition between discrete points with the passage of animation time. OSG
provides complex mechanisms to achieve this goal through constant monitoring and
updating of all moving objects using frame updating algorithms [66]. Technical details of
these algorithms are beyond the scope of this study but can be found in [64, 67].
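The sketch below illustrates the general idea with a simple update callback that linearly interpolates the pitch angle between two discrete sensor samples on every frame; it is a simplified stand-in for OSG's frame updating mechanisms, and the sample structure and class name are assumptions.

```cpp
// Illustrative sketch: an OSG update callback that smoothly interpolates a node's
// pitch between two time-stamped sensor samples. Simplified for clarity.
#include <osg/MatrixTransform>
#include <osg/NodeCallback>
#include <osg/NodeVisitor>
#include <osg/FrameStamp>
#include <osg/Math>

struct PitchSample { double time; double pitchDeg; };

class PitchUpdateCallback : public osg::NodeCallback
{
public:
    PitchUpdateCallback(PitchSample a, PitchSample b) : a_(a), b_(b) {}

    virtual void operator()(osg::Node* node, osg::NodeVisitor* nv)
    {
        double t = nv->getFrameStamp()->getSimulationTime();
        // Interpolation factor clamped to [0, 1] between the two samples
        double u = osg::clampBetween((t - a_.time) / (b_.time - a_.time), 0.0, 1.0);
        double pitch = a_.pitchDeg + u * (b_.pitchDeg - a_.pitchDeg);

        osg::MatrixTransform* xform = static_cast<osg::MatrixTransform*>(node);
        xform->setMatrix(osg::Matrix::rotate(
            osg::DegreesToRadians(pitch), osg::Vec3d(1.0, 0.0, 0.0)));

        traverse(node, nv);   // continue traversing child nodes
    }

private:
    PitchSample a_, b_;
};

// Hypothetical usage: bedTransform->setUpdateCallback(new PitchUpdateCallback(s0, s1));
```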
Since OSG is a free open source toolkit, it allows access to code internals, thus providing
the opportunity to manipulate and modify the original content of the code to supplement
the rest of the framework developed in the .NET environment for the specific purposes of
this research.
Each of the articulated components depicted in the scene consists of separate parts whose 3D models are created in formats such as AutoCAD™ (.dxf), MicroStation™ (.dwg), and VRML (.wrl) and are stored on the user's computer. By defining the position of each part relative to the main origin of the scene, and also creating a meaningful child-parent relationship
between separate parts, each single component of the scene is created as a standalone
model that can be moved inside the animation as necessary. Also, the animation speed can be controlled by the user.
Using the real time positional and orientation data from the sensors mounted on the target
objects (in real world), the developed algorithm stores the data in the form of vectors as the
animation path for each solid, yet articulated entity in the scene. Thus, the animated scene
is capable of showing the actual movements of every real object using real time data
representing the translation and/or orientation of that object’s articulated parts, or the
object as a whole.
CHAPTER 5: DEVELOPED FRAMEWORK
In Chapters 3 and 4, the technologies, concepts, and tools necessary to address the
challenges described in Chapter 1 were discussed in detail. The need for the presented research was established through the literature review in Chapter 2. This Chapter outlines a detailed account of the individual components of the developed methodology.
Figure 5.1 depicts the higher level system architecture of the developed framework in
which the relationship between major building elements, as well as an overall view of
how raw operational data flows through the system and is eventually transformed into meaningful information, is illustrated. As described in Chapter 3, the framework is built around the concept of dynamic data-driven application system (DDDAS) and thus contains the major components (modules) that were previously illustrated in Figure 3.3. The following Subsections provide more details about each of these components.
Figure 5.1: System Architecture of the Developed DDDAS Framework
Banks [14] summed up the simulation environment from a data collection point of view
by indicating that data are rarely readily available and that data collection is one of the most challenging tasks in a simulation study. In a dynamic data-driven modeling system, the problem could get even more complicated since the system
requires real time field data collection and integration. As a result, data acquisition is one
of the most challenging and computing intensive parts of a DDDAS given that it is
almost impossible to manually collect real time data in large projects. Depending on the type and number of data collection devices, the ability to acquire, communicate, and synchronize data from multiple sources may itself be a major
challenge. Real time data is used not only for updating and fine-tuning the model with the
latest changes occurring in the real system, but it also serves as the basis for model verification and validation. An uninterrupted flow of input data is needed to reflect the latest changes in the status of activities and resources. Therefore, developing and implementing a robust and automated data collection scheme is necessary.
In many construction projects, resources are in constant motion. Examples include dump
trucks transferring soil from a cut area to a fill location, crews laying reinforced concrete
rebars on a floor slab, and a tower crane lifting steel sections from a flatbed truck. As a
result, from a modeling perspective, capturing these changes in resource (e.g. equipment,
personnel, and material) positions is necessary. In addition to the positional data, most
construction equipment (e.g. cranes, excavators, shovels, loaders) have hinged moving
parts and thus, collecting the angles of orientation for these parts is also essential in order
to describe their motions. Such data can be acquired using orientation sensors that capture
three angles of rotation (i.e. yaw, pitch, and roll). In the presented research, orientation
data are captured and transmitted to the system in order to simulate and animate the body motions of equipment moving parts.
In the course of this study, the data collection procedure was developed in two different
environments. Since both the manufacturer sample algorithms for the data collection
device and the open source visualization toolkit, OpenSceneGraph (OSG), were written in C++, the data collection procedure was first implemented by creating an object-oriented platform in the .NET environment. Later, due to the flexibility of
LabVIEW graphical programming for more sophisticated data analysis and processing
required in this research, a more efficient data collection procedure was developed in
LabVIEW using almost the same principles and algorithms originally created in .NET
environment. In this Section, first the overall functions and classes of the data collection algorithm developed in the .NET environment are described, and then the LabVIEW implementation will be presented.
Serial is a standard device communication protocol used for transferring data to or from peripheral devices via computer serial ports [69]. In order to communicate with the data
collection devices used in this research, a serial port communication algorithm was
developed. A major factor in designing this algorithm was generalizability which in the
context of this research, is defined as the ability of the framework to communicate with a
variety of data collection devices without the need to significantly modify the underlying code. Although the developed algorithm has been tested with the PNI TCM Prime 3D orientation tracker, it benefits
from a generic structure that can be easily used to communicate with other data collection
devices that transmit data using the RS-232 protocol. RS-232 is a specification for serial
communication and is one of the most popular for sensor connections [69]. Since the
collected tracker data is in a binary format, the developed serial port communication
algorithm contains methods to decode the transmitted data and convert them to a numerical format.
The developed algorithm uses the advantages of object oriented programming in the Microsoft Visual C++ .NET environment. Using serial communication libraries, the initial
communication with the port is established, the port is opened, data (i.e. three orientation
angles) is received through the port, and the port is closed at the end of the experiment.
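A minimal sketch of these steps, assuming the Win32 serial API and the 38,400 baud setting of the PNI TCM module, is shown below; it is not the developed PRIME class itself, and the port name and buffer size are placeholders.

```cpp
// Illustrative sketch of the open-configure-read-close sequence over RS-232.
// Port name, buffer size, and error handling are simplified assumptions.
#include <windows.h>
#include <cstdio>

int main()
{
    // Open the serial port the orientation tracker is connected to
    HANDLE port = CreateFileA("\\\\.\\COM1", GENERIC_READ | GENERIC_WRITE,
                              0, NULL, OPEN_EXISTING, 0, NULL);
    if (port == INVALID_HANDLE_VALUE) { std::printf("Cannot open port\n"); return 1; }

    // Configure port properties (baud rate, data bits, parity, stop bits)
    DCB dcb = {0};
    dcb.DCBlength = sizeof(dcb);
    GetCommState(port, &dcb);
    dcb.BaudRate = CBR_38400;          // 38,400 baud used by the PNI TCM module
    dcb.ByteSize = 8;
    dcb.Parity   = NOPARITY;
    dcb.StopBits = ONESTOPBIT;
    SetCommState(port, &dcb);

    // Receive one raw binary packet (decoding and CRC checking happen elsewhere)
    unsigned char buffer[64];
    DWORD bytesRead = 0;
    ReadFile(port, buffer, sizeof(buffer), &bytesRead, NULL);
    std::printf("Received %lu bytes\n", static_cast<unsigned long>(bytesRead));

    // Close the port at the end of the experiment
    CloseHandle(port);
    return 0;
}
```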
The orientation data coming through different brands of orientation trackers follow
different data transmission standards. The orientation tracker sensor, PNI TCM Prime
module, utilizes a binary data transmission protocol to obtain and extract the tracker data
that is transmitted over an RS-232 interface. Each data packet contains a component
called Frame Type ID that describes the content of the packet. Based on this ID, the
packet may contain each of the 3D rotational angles as well as the current temperature
(ranging from -40 ºC to +85 ºC). These values are stored in the packet Payload [70]. The
datagram structure of the PNI TCM Prime module is shown in Figure 5.2.
Figure 5.2: Datagram Structure of the PNI TCM Prime Orientation Tracker
Using the binary data provides the system with the advantage of fast data transmission.
However, this will in turn make the communication very sensitive to data corruption. As a remedy, a cyclic redundancy check (CRC) is used to separate useful and corrupted binary data packets. CRC is applied to a
series of bytes and produces an integer result that can be used for error detection. After
data is received from the orientation tracker, the tracking application computes the CRC
value using the existing contents of the data packet and compares this value to the one
originally calculated when the packet was being constructed prior to transmission. If the
two values are not identical, the packet is considered as corrupted and will be disregarded
and the application waits for the next data packet. If the two values are equal, the data is
safe to be used and extracted into its components. Using a set of binary data manipulation
statements provided in the application programming interface (API) of the tracker device,
the numerical values for each of the orientation angles are obtained. The main
functionalities of the managed C++ class developed for acquiring orientation tracker data are shown in Figure 5.3.
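The CRC check described above can be sketched as follows; the polynomial (CRC-16-CCITT/XModem) and the assumption that the checksum occupies the last two bytes of the packet are illustrative choices, not a statement of the PNI TCM datagram specification.

```cpp
// Illustrative sketch of the CRC-based error detection step. The polynomial and
// the trailing two-byte checksum layout are assumptions for demonstration only.
#include <cstdint>
#include <cstddef>

// Compute a CRC-16 over a byte sequence (assumed polynomial 0x1021, initial value 0)
uint16_t crc16(const uint8_t* data, std::size_t length)
{
    uint16_t crc = 0x0000;
    for (std::size_t i = 0; i < length; ++i) {
        crc ^= static_cast<uint16_t>(data[i]) << 8;
        for (int bit = 0; bit < 8; ++bit)
            crc = (crc & 0x8000) ? static_cast<uint16_t>((crc << 1) ^ 0x1021)
                                 : static_cast<uint16_t>(crc << 1);
    }
    return crc;
}

// Accept the packet only if the recomputed CRC matches the transmitted checksum;
// otherwise the packet is discarded and the next one is awaited.
bool packetIsValid(const uint8_t* packet, std::size_t length)
{
    if (length < 2) return false;
    uint16_t transmitted = static_cast<uint16_t>((packet[length - 2] << 8) |
                                                 packet[length - 1]);
    return crc16(packet, length - 2) == transmitted;
}
```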
Figure 5.3: C++ Orientation Tracker Serial Communication Class
The PRIME::Initialize() function is called first to open the serial port and set up the port
properties (e.g. baud rate, data bits). Then, PRIME::Control() extracts the Payload piece
by piece. The number of requested angles (up to three) should be defined in this function.
Based on the number of requested data pieces, this function will be called consecutive
times and each time sends a control command to the tracker. In response, the tracker
sends a single packet containing binary values of the requested angles. For example, if all
three orientation angles (yaw, pitch, and roll) are required, the function will be called
three times and in return, the tracker sends binary values of the three angles. Next, a call to PRIME::CRC() checks whether the received data is error-free. If the data is not error-free, the packet is discarded; otherwise, the decoding function stores the numerical values of the required angles in numerical variables
which will later be used to construct and display the real time animation of moving parts.
Finally, the PRIME::Shutdown() function will close the port. The flowchart in Figure 5.4
shows major steps in acquiring orientation data using the PRIME class introduced in
Figure 5.3. More information about all other functions and classes developed using Microsoft Visual C++ .NET is provided in Appendix A.
The presented methodology for data collection also takes advantage of LabVIEW graphical programming. LabVIEW, in essence, is a system design platform that enables automating data collection and analysis without requiring the user to deal with low level programming details such as memory allocation and language syntax [71]. It was used in this research in order
to create a single platform that provides more control and flexibility as far as data
collection and analysis, and displaying the results in a highly interactive (i.e. visual)
environment are concerned. Figure 5.5 shows a sample snapshot of a LabVIEW virtual instrument (VI), which consists of a graphical user interface (i.e. Front Panel) and a graphical code
(i.e. Block Diagram). Each node in a Block Diagram performs a specific task and is
connected to other nodes via wires. More information about LabVIEW graphical programming is provided in Appendix B.
Figure 5.5: A VI Consists of a Front Panel and a Block Diagram
In this research, a real time data acquisition VI was designed and implemented to
customize and append an instrument driver for the PNI TCM orientation tracker. To
create an interface between the instrument driver and the data collection device, the NI
Virtual Instrumentation Software Architecture (VISA) API was used for serial
communication. VISA basically provides users with the ability to open, configure (i.e.
setting baud rate, flow control, parity), write to and read from, and close any type of
interfaces such as GPIB, TCP/IP, Ethernet/LAN, IEEE 1394, USB, and serial, and handle
errors in a faster and easier way in comparison with the same functions developed in a text-based programming environment (e.g. C++). Figure 5.6 shows a rudimentary structure of the VISA functions used for serial communication in this research.
As shown in this Figure, VISA Resource Name passes session information between
the instrument driver and its SubVIs, and is a unique identifier reference to the data collection
device (e.g. COM1, COM2). VISA Open essentially opens a session to communicate with
the device specified by the VISA Resource Name and returns a session identifier that can be used by subsequent functions; its role is similar to that of PRIME::Initialize() in Figure 5.3. Serial Configuration, as stated before, sets the port
configuration parameters specific to the device, such as the baud rate, which is a measure of communication speed, equal to 38,400 baud for the PNI TCM module. VISA Write has a role similar to that of PRIME::Control() in Figure 5.3. It extracts the Payload and
requests needed angle measurements. Subsequently, VISA Read reads the requested data
based on what was defined in VISA Write. The CRC will be performed to detect
corrupted data packets by calculating the CRC-16 of the output string from the tracker
and comparing it to the checksum at the end of the output string; in essence, this step is equivalent to PRIME::CRC(). Finally, similar to the function PRIME::Shutdown(), VISA Close shuts down the port and terminates the communication session with the device.
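For readers more familiar with text-based programming, the same Open–Configure–Write–Read–Close sequence can be expressed with the NI-VISA C API, as in the minimal sketch below; the resource name and the request bytes are placeholders and do not represent the PNI TCM command set.

```cpp
// Illustrative sketch mirroring the VISA Open / Serial Configuration / VISA Write /
// VISA Read / VISA Close steps using the NI-VISA C API. Resource name and command
// contents are placeholders.
#include <visa.h>
#include <cstdio>

int main()
{
    ViSession rm = VI_NULL, instr = VI_NULL;
    viOpenDefaultRM(&rm);                                         // resource manager
    viOpen(rm, (ViRsrc)"ASRL1::INSTR", VI_NULL, VI_NULL, &instr); // VISA Open

    viSetAttribute(instr, VI_ATTR_ASRL_BAUD, 38400);              // Serial Configuration
    viSetAttribute(instr, VI_ATTR_ASRL_DATA_BITS, 8);

    ViUInt32 count = 0;
    ViByte request[] = { 0x00 };                                  // placeholder command
    viWrite(instr, request, sizeof(request), &count);             // VISA Write

    ViByte response[64];
    viRead(instr, response, sizeof(response), &count);            // VISA Read
    std::printf("Read %u bytes\n", static_cast<unsigned>(count));

    viClose(instr);                                               // VISA Close
    viClose(rm);
    return 0;
}
```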
The advantage of real time automated data collection is that it enables the simulation
model to update itself in response to changes in the project environment. This can be
achieved by continually collecting time-stamped data. However, before the raw data
stream enters the simulation model or is used as input for visualization, it should be
classified, analyzed, and converted to a format that defines the state and the context of the
entity for which the data is collected. As such, and as shown in Figure 5.1, the raw data collected using either the .NET or LabVIEW operational environments is passed onto the data classification and analysis module of the developed DDDAS framework. The details of this module are described in the following Section.
5.3 Automated Data Classification and Analysis
One inherent characteristic of automated data collection is that unnecessary data may also be inevitably collected. For example, in order to track the motions of a loader's boom, the mounted orientation tracker reports three angular values, namely yaw, pitch, and roll. However, given that the boom must be
raised or lowered to load or unload a truck, the main piece of information needed to
determine the start and end times of load or unload activities is the pitch angle. As such,
potential trembles resulting in small changes in the roll angle and also possible motions
such as sidewise movements and maneuvering of the loader leading to a change in the
yaw value are, for the most part, redundant as far as detecting the beginning and end of load and unload activities for the simulation model and having a smooth animation for
visualization are concerned. Therefore, collected data must be carefully classified so that
only relevant and useful information is passed onto the next steps.
Classified data also needs to be transformed into a proper format interpretable by the simulation model, in which activity durations are typically described using probabilistic distributions. Since discrete events mark the beginning and end of each activity, the duration of an activity instance can be calculated by detecting time-stamped events corresponding to the beginning and end of that activity.
Therefore, activity durations can be derived from the pool of classified collected raw data
and suitable probability distributions will then be fit to the calculated duration values. In
the earthmoving example described above, the angle of the boom and the truck bed
relative to the horizontal line can be used to identify the start and end points of load and
unload activities and determine activity durations. For example, in the operation depicted
in Figure 5.7, activity durations can be calculated by comparing the time stamps
corresponding to when each event (i.e. raise boom, load truck, lower boom, haul, raise bed to dump, lower bed, return) occurs, based on the orientation data (i.e. angles) received from the mounted sensors.
An example of how a series of time-stamped data can be used to extract certain activity durations is illustrated in Figure 5.8.
Figure 5.8: Activity Durations Based on the Variation of Equipment Body Orientation
with Respect to Time
(RB = Raise Bucket, LT = Load Truck, LB = Lower Bucket, RTB = Raise Truck Bed, P = Put, LTB =
Lower Truck Bed)
In this Figure, the first diagram shows changes of angle α (loader boom angle relative to
the horizontal line) and the second diagram shows angle β (truck bed angle relative to the
horizontal line) over time. Considering angular variation histograms displayed in these
two diagrams, a timeline representing the duration of each activity can be generated. For
example, an increasing angle α and a constant angle β (close to zero) indicate that the
loader is raising its boom while the truck is waiting to be loaded (RB in Figure 5.8). A
near constant angle α (close to its peak value) and a constant angle β (close to zero)
indicate that the loader is putting soil into the truck (LT in Figure 5.8). A decreasing
angle α and a constant angle β (close to zero) indicate that the loader is lowering its boom
while the truck is preparing to move (LB in Figure 5.8). An instance of “Load” activity is
completed when all three (RB, LT, and LB) processes are completed.
A similar analysis can be done to isolate instances of “Haul”, “Dump”, and “Return”
activities. For instance, given that angle α is constant (at a value close to zero), if angle β
is increasing from zero, the truck bed is being raised (RTB in Figure 5.8), if angle β is
almost constant (close to its peak value), soil is being dumped (P in Figure 5.8), and if
angle β is decreasing, the truck bed is being lowered (LTB in Figure 5.8). These three
processes, put together, will constitute an instance of “Dump” activity. Since histogram
data is time-stamped, duration values can be easily determined for all such instances.
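A minimal sketch of this segmentation logic is given below: it scans a time-stamped stream of boom pitch angles and records one duration per raise–load–lower cycle. The threshold values and data layout are assumptions chosen for illustration, not the thresholds used in the experiments.

```cpp
// Illustrative sketch: deriving activity durations from a time-stamped stream of
// boom pitch angles, in the spirit of Figure 5.8. Thresholds are assumptions.
#include <vector>
#include <cstdio>

struct AngleSample { double timeSec; double boomDeg; };

// A load cycle is assumed to start when the boom rises above 'startDeg' and to end
// when it drops back below 'endDeg'; each pair of time stamps yields one duration.
std::vector<double> extractLoadDurations(const std::vector<AngleSample>& samples,
                                         double startDeg = 10.0, double endDeg = 5.0)
{
    std::vector<double> durations;
    bool inCycle = false;
    double startTime = 0.0;

    for (const AngleSample& s : samples) {
        if (!inCycle && s.boomDeg > startDeg) {        // boom starts rising (RB)
            inCycle = true;
            startTime = s.timeSec;
        } else if (inCycle && s.boomDeg < endDeg) {    // boom lowered again (end of LB)
            durations.push_back(s.timeSec - startTime);
            inCycle = false;
        }
    }
    return durations;
}

int main()
{
    std::vector<AngleSample> samples = { {0.0, 0.5}, {1.0, 15.0}, {2.5, 40.0},
                                         {4.0, 12.0}, {5.0, 2.0}, {6.0, 0.8} };
    for (double d : extractLoadDurations(samples))
        std::printf("Observed load duration: %.1f s\n", d);
    return 0;
}
```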
Statistical analysis is then performed on the pool of calculated durations to determine a distribution function that best represents the duration of that activity. This distribution function is then used to describe the duration of that activity in the simulation model. Due to the difficulty of collecting reliable positional data in an indoor environment (e.g. the laboratory setting where the components of this framework were tested), a number of simplifying assumptions had to be made when
developing the methodology for extracting the duration of activities. For example, it was
assumed that the haul activity would not start until the loader lowers its boom and would
not finish until the truck raises its bed. Likewise, return activity starts when truck’s β
angle reaches zero and finishes at the beginning of the load activity, when the loader
starts raising its boom. It is clear that incorporating positional data into the proposed
algorithms for calculating activity durations not only eliminates the need to consider these and similar simplifying assumptions, but also enhances the accuracy of the algorithm. As such, future work in this research will include activities specifically aimed at incorporating positional data into the developed data collection and analysis scheme.
The classification and analysis module accepts input from data collection devices, outputs the classified data for pre-processed animation, and also analyzes data to be passed onto the simulation model. This guarantees that only relevant data is used and that the simulation model is not only the receiving end of the process but can also steer the data collection effort. Algorithms were designed to categorize the activities based on the trend of the collected data, to remove outliers, and to eliminate non-relevant data. The next Subsection describes the developed data analysis procedure in more detail.
In order to create a standalone platform consisting of both data collection and data analysis modules, all mathematical and logical functions for data classification, extraction of activity durations, and statistical analysis were appended to the same VI. Figure 5.9 illustrates this integrated VI.
Figure 5.9: Single VI Containing Data Acquisition and Data Analysis Functions
A cluster of real time orientation data enters the VISA interface and undergoes steps
depicted both in Figure 5.6 and by a dashed outline in Figure 5.9. This is followed by raw
data being classified and time-stamped. The time-stamped data will then be used to
calculate activity durations using several mathematical and logical commands built into
the VI. Statistical analysis is also performed on the well-populated pool of duration values to calculate the mean, standard deviation, and other parameters required to describe activities in the simulation model.
As far as the system architecture illustrated in Figure 5.1 is concerned, once the data is available after the classification and analysis step, input parameters for the simulation model are determined. Construction operations can be broken down into and modeled as a system of discrete activities, which makes DES a viable method for simulating such operations. One of the most commonly used DES systems is STROBOSCOPE [47].
STROBOSCOPE is a simulation programming language that enables users to make complex dynamic decisions and thus control the simulation at run-time. The advantage of STROBOSCOPE over many other existing DES modeling platforms is that it considers the diversity of resources and their characterizations. In addition, it has been built upon the concept of the traditional activity cycle diagram (ACD), which makes it suitable for modeling a large group of construction operations that are cyclic in nature. STROBOSCOPE models are based on a graphical network of queues, activities, and connecting links. In this network, SoilInPlace, LoadersWait, TrucksWait, and MovedSoil are queues where resources wait before being drawn into activities (if needed). Also, Load is called a combi
activity since it immediately follows a queue, and Haul, Unload, and Return are normal
activities. In order for a STROBOSCOPE model to describe a real system, attributes such
as activity durations, number of entities, and resource capacities must be known. In the
absence of collected field data, assumptions and personal judgment are normally used to quantify such parameters. As previously stated, one of the main motivations behind this research was to investigate whether further improvements can be made to the existing approach by replacing such static estimates with parameters derived from real time field data. A STROBOSCOPE model is defined through a script input file in which all parameters pertinent to the characteristics of each model element are specified. Once appropriate simulation parameters are determined from the collected data, the simulation input file is updated and the model is run.
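To make the ACD terminology concrete, the sketch below models queues, combi activities, and normal activities as simple C++ structures wired into the earthmoving cycle described above. It is only an illustrative data structure, not STROBOSCOPE's input format or internal design; the initial queue contents are placeholder values.

```cpp
#include <cstdio>
#include <string>
#include <vector>

// Simplified activity cycle diagram (ACD) elements; element names follow the
// earthmoving network described above, but the numbers are placeholders.
struct Queue    { std::string name; int contents; };
struct Activity { std::string name; bool combi; std::vector<std::string> draws; };

int main() {
    std::vector<Queue> queues = {
        {"SoilInPlace", 100}, {"LoadersWait", 1}, {"TrucksWait", 2}, {"MovedSoil", 0}};

    // Load is a combi activity (it directly follows queues); the rest are normal.
    std::vector<Activity> activities = {
        {"Load",   true,  {"SoilInPlace", "LoadersWait", "TrucksWait"}},
        {"Haul",   false, {"Load"}},
        {"Unload", false, {"Haul"}},
        {"Return", false, {"Unload"}}};

    for (const Queue& q : queues)
        std::printf("%-12s starts with %d unit(s)\n", q.name.c_str(), q.contents);
    for (const Activity& a : activities)
        std::printf("%-7s is a %s activity fed by %zu link(s)\n",
                    a.name.c_str(), a.combi ? "combi" : "normal", a.draws.size());

    // A full DES engine would now advance simulated time, start the combi
    // activity whenever its predecessor queues hold enough resources, and
    // route resources back along the links when activities finish.
    return 0;
}
```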
Chapter 4 provided a detailed description of OSG, the visualization toolkit used in this study. Following the data flow illustrated in Figure 5.1, as soon as appropriate field data is collected and classified, a real time animation of the ongoing operation is created. This pre-processed (i.e. generated before data is fed into the simulation model) animation can assist in detecting potential conflicts and enhancing safety and monitoring of the project. The other benefit of this animation is that, unlike many existing site monitoring systems which mainly rely on video streaming, finding the best spots to install cameras such that every action can be monitored with a free line-of-sight is no longer an issue. This is due to the fact that once the animation is rendered on the screen, the user has complete control over the viewpoints and can change their locations and directions of look, if necessary. For example, the user can zoom in and out or navigate around the animated scene to gain a better visual perspective of certain parts of the operation since, as stated in Chapter 4, OSG provides the opportunity to change the viewpoint at run-time.
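As a concrete illustration of this viewpoint flexibility, the short C++ sketch below loads a scene with OSG and attaches an interactive camera manipulator so the user can orbit, pan, and zoom freely. The model file name is a placeholder, and the snippet is a generic OSG usage example rather than the exact visualization code developed in this research.

```cpp
#include <osgDB/ReadFile>
#include <osgGA/TrackballManipulator>
#include <osgViewer/Viewer>

int main() {
    // Load a 3D model of the jobsite scene (file name is a placeholder).
    osg::ref_ptr<osg::Node> scene = osgDB::readNodeFile("jobsite_scene.osg");
    if (!scene) return 1;

    osgViewer::Viewer viewer;
    viewer.setSceneData(scene.get());

    // The trackball manipulator lets the user rotate, pan, and zoom the camera
    // with the mouse, so no fixed camera placement is required.
    viewer.setCameraManipulator(new osgGA::TrackballManipulator);

    return viewer.run();   // render until the window is closed
}
```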
In addition to the pre-processed animation, the results of the DES model can be used to generate a post-processed (simulation-based) animation. This is important since, very often, making decisions solely based on the textual output of conventional simulation systems is time consuming and prone to unwanted biases and mistakes [54, 72]. Beyond the general benefits of visualizing simulation models, providing decision-makers with two juxtaposed animations, one based on the exact real movements occurring on the jobsite and the other based on the output of the simulation model updated with the latest data obtained from the field, offers an extremely convenient way to evaluate and compare different scenarios against the concurrent field configuration and make more realistic decisions. For example, since each construction
project is unique in terms of requirements and usage of its working space, having real time data from the project and evaluating different scenarios based on the transformation, requirements, and limitations of the working space (e.g. maneuverability issues for loading and dumping activities in earthmoving operations, visibility problems for the crane operator in steel girder erection, and safety problems such as detecting potential collisions) is far more reliable than relying on assumptions based on historical data or an expert eye on the work. Hence, displaying the pre- and post-processed animations side by side enables decision-makers to see first-hand how current trends on the jobsite (reflected in the pre-processed animation) and the expected performance of resources (as displayed in the post-processed animation) are related, and effectively serves this purpose.
Finally, another major advantage of having pre- and post-processed animations is that
comparing the two animations greatly facilitates the validation and verification of the
simulation model. In this case, the modeler can intuitively determine whether the model contains any modeling flaws and whether it performs as intended (i.e. verification). Also, it can be visually determined by people who are not construction experts whether the model faithfully represents the real operation (i.e. validation). Verification and validation of simulation models is a complicated task that has been previously studied by a number of researchers. A recent example is VITASCOPE, an animation system that relies on data logged during the simulation run. Based on the logged simulation model runtime data, VITASCOPE generates smooth 3D animations by processing animation commands stored in an ASCII text file [10]. While VITASCOPE is a great tool for creating post-processed (simulation-based) animations of construction activities, the visualization capabilities of the framework developed in this research enable the creation of both pre- and post-processed animations. These two animation streams, when simultaneously displayed, can facilitate the process of validation and verification. In order to perform this task in an automated manner, and since the OSG visualization toolkit is written and extended using the C++ programming environment, a middleware for linking LabVIEW (i.e. containing data collection and analysis functionalities) to Microsoft Visual C++ (i.e. containing OSG visualization functionalities) was needed. To this end, ActiveX technology was used in this research. ActiveX provides an interface that allows individual programs to be accessed and controlled by other applications.
Another building block of the framework, as shown in Figure 5.1, is the "What-If Analysis" module. In order to make informed decisions regarding complex processes, different scenarios need to be assessed and the cost and time associated with each scenario must be determined. For example, a decision-maker may be interested in identifying the resource configuration associated with the minimum expected cost [47]. Considering all possible configurations in terms of crew sizes, number of equipment and their arrangements, operations logic, and construction methods, a decision-maker may end up having to choose from several candidate alternatives. Using simple mathematical comparisons or more complex optimization models, the engineer can then determine the best configuration that satisfies the predefined criteria (e.g. an objective function and its constraints).
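As a simple illustration of such a comparison, the C++ sketch below ranks a handful of candidate scenarios by a weighted combination of expected time and cost. The scenario values and the weighting scheme are invented for illustration; in practice these figures would come from the data-driven simulation runs.

```cpp
#include <algorithm>
#include <cstdio>
#include <vector>

// Expected outcome of one candidate resource configuration.
struct Scenario {
    const char* name;
    double expectedTime;   // e.g. project duration in hours
    double expectedCost;   // e.g. total cost in dollars
};

int main() {
    // Placeholder outcomes that would normally come from simulation runs.
    std::vector<Scenario> scenarios = {
        {"2 trucks, 1 loader", 120.0, 45000.0},
        {"3 trucks, 1 loader",  95.0, 52000.0},
        {"3 trucks, 2 loaders", 80.0, 61000.0},
    };

    // Weighted objective: smaller is better (weights are assumptions).
    const double wTime = 300.0, wCost = 1.0;
    auto score = [&](const Scenario& s) {
        return wTime * s.expectedTime + wCost * s.expectedCost;
    };

    const Scenario& best = *std::min_element(
        scenarios.begin(), scenarios.end(),
        [&](const Scenario& a, const Scenario& b) { return score(a) < score(b); });

    std::printf("Preferred configuration: %s\n", best.name);
    return 0;
}
```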
The last component of the system architecture presented in Figure 5.1 is Decision-
Making and Dynamic Feedback. The developed algorithms for data collection,
classification and analysis, simulation and visualization will be best used in the presence of a human decision-maker who is responsible for making the required modifications to the target construction process. As stated earlier, the data presented to the user contains two juxtaposed animations: one identical to the actual process taking place on the jobsite (pre-processed), and the other resulting from simulating alternative scenarios (post-processed). Also, the user is provided with the results of the simulation model and the output of the what-if analysis in order to decide which alternative solution is the most appropriate. Therefore, by observing the two animations as well as by examining performance attributes (e.g. productivity rates) obtained from the simulation output under various alternative scenarios, the decision-maker(s) will have the ability to further adjust future processes. Ultimately, and due to the dynamic nature of construction projects, the cycle presented in Figure 5.1 will repeat to reflect any further changes occurring in the process. In other words, the next phase of data collection starts after expert modifications are applied to the construction resources and activities, and a new set of data will be classified, analyzed, simulated, and visualized. This guarantees that through continuous data collection from the equipment involved in a construction process, the simulation model reflects the latest status of the real operation at any given time.
CHAPTER 6: LABORATORY SCALE EXPERIMENTS AND
RESULTS
Chapter 5 outlined the individual components of the developed framework and their
relationships in the context of the overall system. In this Chapter, results of preliminary experiments conducted in a laboratory environment at the University of Central Florida are provided to demonstrate the validity and applicability of the developed methodology and algorithms for data collection, data analysis, visualization, and data-driven simulation. First, a set of laboratory-scale equipment operations scenarios designed and implemented to test certain aspects of the framework is described. This is followed by a comprehensive example through which the robustness, applicability, and overall functionality of the framework in terms of data-driven simulation and decision-making were evaluated. The following Subsections provide more insight about the details and outcomes of each of the validation experiments.
A NetCam XL IP-addressable camera and a Dell Precision T1500 desktop system were also deployed. The camera was used to demonstrate the correctness and precision of the pre-processed visualization, and the desktop system was the main computing platform. Figure 6.1 illustrates the overall arrangement of the tools and peripheral devices, namely the CEAP, model construction equipment, IP-addressable camera, and the computer system.
In order to collect equipment motion data, several PNI TCM 3D orientation trackers were used. These modules were mounted on model construction equipment to capture and transmit three angular values, namely yaw (heading), pitch (tilt), and roll. Figure 6.2 shows a PNI TCM 3D orientation tracker mounted on a model excavator with definitions of yaw, pitch, and roll angles. The manufacturer's specifications of this orientation tracker are also provided.
6.1.2 Single Object Data Collection and Visualization
Initially, a series of small-scale validation tests were conducted using data collected from only one piece of model construction equipment. Later, the data collection, data analysis, and visualization algorithms were modified to enable data capturing and processing from multiple objects.
The first in this series of experiments was conducted using an orientation tracker mounted on a model loader [74]. Figure 6.3 shows the loader on the CEAP and the mounted orientation tracker.
The first step in conducting each experiment was to collect and classify equipment motion data to provide necessary input for visualization and data-driven simulation. As stated in Chapter 5, data collection and analysis is performed using LabVIEW. Figure 6.4 shows the Front Panel (i.e. user interface) of the data collection system for the validation experiment using a single model loader. As shown in this Figure, the interface of the developed VI enables a user to specify a communication port to receive data from the orientation tracker, start and stop the data collection task, and view the numerical values of the incoming angular data.
Once the VI is launched, the data collection task begins. This is followed by a continuous stream of real time classified angular data displayed in the three indicators designed to show yaw, pitch, and roll values. These indicators are marked as YawLoader, PitchLoader, and RollLoader in Figure 6.4, respectively. As soon as the user switches off the data collection using the "Stop Data Collection" button, the data stream stops. By clicking on "Start Animation", an animation of the exact same movements of the loader's boom appears on the screen. Figure 6.5 shows snapshots from this test. In this Figure, several frames of the live video stream of the real system captured using the IP-addressable camera are displayed next to the corresponding 3D animation frames generated in real time.
Figure 6.5: Real Time Display of Loader's Boom Movements and Corresponding 3D
Animation Generated in Real Time
6.1.3 Double Object Data Collection and Visualization
In order to validate the generalizability of the overall framework and to demonstrate that the developed methods will properly function in situations where operational data from more than one piece of equipment has to be collected, the data collection and analysis algorithms were slightly modified. In doing so, the major issue that was successfully addressed was to update the processes inside the data analysis module to be able to identify individual activities from a large pool of raw motion data collected from several pieces of equipment using multiple data collection devices (i.e. 3D orientation trackers), determine the events marking the beginning and end of each activity, and consequently extract activity durations [63]. To validate the newly developed methods, a laboratory-scale experiment was set up where operational data was collected from two equipment models, and the collected data was processed to generate a live 3D animation as well as to calculate the main input parameters needed by the data-driven simulation module. In this experiment, two orientation trackers were mounted on a model loader and a model truck. Figure 6.6 shows these equipment placed on the CEAP with the orientation sensors mounted on the loader's boom and the truck's bed.
Figure 6.6: Orientation Trackers Mounted on a Loader's Boom and a Truck’s Bed
Similar to the validation experiment using only one object, a VI was created and implemented for data collection. However, this time not only are the three angular values shown, but also two diagrams containing a series of time-stamped data used to extract activity durations are illustrated in the Front Panel. Each of these histograms shows how the incoming data collected from the orientation tracker changes over time. For example, the trend of data corresponding to the loader's boom indicates that the boom is first lowered from its initial state, raised and kept in a steady state for some time, lowered again and kept in a steady state for a while, and finally raised. By observing this data trend, one can conclude that the loader was involved in a cycle of digging soil (boom down position) followed by loading a truck (boom up position). Based on the collected data and using the developed algorithms for detecting individual activities from a series of angular data, the mean and standard deviation of activity durations were calculated and displayed.
Figure 6.7: Front Panel of Data Collection VI for a Double Object Experiment
Similar to the scenario in which only one object was used, a real time stream of motion data is captured and displayed in the designated indicators on the Front Panel. Simultaneously, each activity is detected by the VI based on the existing data trends, and its duration is calculated by the algorithms implemented in the Block Diagram. Figure 6.8 illustrates a portion of the extensive Block Diagram developed in this research.
Figure 6.8: Partial View of the Extensive Block Diagram Developed in this Research
Activity durations are calculated on the fly while data is streaming. The calculated values are used to populate numerical arrays. The content of each numerical array corresponding to a certain activity (e.g. load, dump) is evaluated in real time using statistical methods to determine the mean and standard deviation of the duration of that activity.
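One common way to maintain such running statistics as new duration values stream in is Welford's online algorithm, sketched below in C++. This is a generic illustration of the real time mean and standard deviation computation, not a transcription of the LabVIEW Block Diagram; the sample values are placeholders.

```cpp
#include <cmath>
#include <cstdio>

// Welford's online algorithm: updates mean and variance incrementally as each
// new activity duration arrives, without storing the whole history.
struct RunningStats {
    long   n = 0;
    double mean = 0.0;
    double m2 = 0.0;   // sum of squared deviations from the current mean

    void add(double x) {
        ++n;
        double delta = x - mean;
        mean += delta / n;
        m2 += delta * (x - mean);
    }
    double stddev() const { return n > 1 ? std::sqrt(m2 / (n - 1)) : 0.0; }
};

int main() {
    // Placeholder "Load" durations (seconds) as they might stream in.
    const double samples[] = {18.2, 21.5, 19.7, 22.1, 20.4};
    RunningStats load;
    for (double d : samples) load.add(d);
    std::printf("Load: mean = %.2f s, std dev = %.2f s\n", load.mean, load.stddev());
    return 0;
}
```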
Once the data collection is stopped by the user and the start animation command is triggered, a 3D animation showing the exact same equipment movements appears on the screen. Figure 6.9 shows snapshots from the live video stream of the real system as well as the corresponding 3D animation frames.
Figure 6.9: Real Time Display of Loader's Boom and Truck’s Bed Movements and
Corresponding Animations
6.2 Comprehensive Example: Data-Driven Simulation
In order to demonstrate the ability of the developed framework to support the prospect of integrating real time field data with simulation modeling, a simplified yet comprehensive operational scenario was designed and carried out. In this experiment, the goal was to move 200 pieces of model rock from a loading area (i.e. Area #1) to a dumping site (i.e. Area #2) for a dam construction project. A model loader was used to load a model truck. The truck would haul the rocks from the loading area to the dumping site. It was assumed that the pieces of rock are so big and heavy that the truck can carry only one rock in each hauling cycle. In order to collect field data, two orientation trackers were mounted on the model equipment; one on the loader's boom, and the other on the truck's bed. Figure 6.10 shows this arrangement.
Figure 6.11 shows the DES network of this operation. In this Figure, RocksToMove,
LoadersWait, TrucksWait, and MovedRocks are queues, Load is a combi activity (i.e. it
immediately follows a queue), and Haul, Dump, and Return are normal activities. Also,
all network elements (i.e. activities and queues) are connected by links. Each link has a
specific name and can carry a certain type of resource (i.e. Rock, Loader, Truck) from
one element to the other. For example, RK2 is defined as a link connecting Load and Haul that carries the Rock resource.
As stated before, during the planning stages of a project, simulation modelers generally
rely on expert judgments or field reports from similar past projects to determine model
parameters such as activity durations. Following the same logic and as shown in Figure
6.12, a DES script was initially created in STROBOSCOPE for the dam construction scenario, where activity durations were approximated based on the overall arrangement of the model equipment and the modeler's judgment.
In this Figure, statements used to describe activity durations inside the simulation script
are highlighted. In addition, necessary statements were added to assess and report the
total completion time of the project. The output of this simulation model is shown in
Figure 6.13. In this Figure, the average waiting time of resources inside their corresponding queues is highlighted. Since the simulation parameters (i.e. activity durations) were
approximated in the first place, the resulting waiting times may or may not represent the
actual idle time of resources during the course of the real world project.
Hence, it was decided to incorporate real time operational data collected from the model equipment into the DES model to create a more accurate and realistic output that better serves the decision-making process. To do so, data was collected from several complete operational cycles including Load, Haul, Dump, and Return activities. The collected data was further processed to determine and display the statistical mean and standard deviation of each activity duration, using the developed algorithms for relating equipment motions to the beginning and end events of individual activities. Figure 6.14 shows the VI and the results obtained for activity durations.
These statistical parameters were used to replace the approximate duration values by assigning more realistic Normal distributions to individual activity durations and updating the DES model. The revised STROBOSCOPE simulation script is shown in Figure 6.15, where the newly calculated activity durations are highlighted. The updated simulation model was then run and the results were collected as illustrated in Figure 6.16.
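To illustrate how such Normal duration assignments drive a run, the C++ sketch below simulates a fixed number of Load-Haul-Dump-Return cycles with durations drawn from Normal distributions. The means and standard deviations used here are placeholders rather than the values extracted in the experiment, and the loop is a toy single-truck model, not the STROBOSCOPE engine.

```cpp
#include <algorithm>
#include <cstdio>
#include <random>

int main() {
    std::mt19937 rng(42);
    // Placeholder Normal(mean, std dev) duration models in seconds.
    std::normal_distribution<double> load(20.0, 3.0);
    std::normal_distribution<double> haul(35.0, 5.0);
    std::normal_distribution<double> dump(12.0, 2.0);
    std::normal_distribution<double> ret(30.0, 4.0);

    double clock = 0.0;          // simulated time
    const int rocksToMove = 200; // as in the laboratory scenario: one rock per cycle

    for (int rock = 0; rock < rocksToMove; ++rock) {
        clock += std::max(0.0, load(rng));  // Load (combi activity)
        clock += std::max(0.0, haul(rng));  // Haul
        clock += std::max(0.0, dump(rng));  // Dump
        clock += std::max(0.0, ret(rng));   // Return
    }
    std::printf("Estimated completion time: %.1f simulated seconds\n", clock);
    return 0;
}
```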
Figure 6.14: Developed VI for Data Collection and Analysis for Rock Hauling Example
Figure 6.15: STROBOSCOPE Simulation Input File Containing Updated Activity
Durations
Figure 6.16: STROBOSCOPE Simulation Output File Based on the Updated Durations
Comparing the output of the revised simulation model (Figure 6.16) with that of the original model (Figure 6.13), it is clearly seen that incorporating field data into the simulation model caused the average waiting time of the loader to decrease significantly, from 72.21 seconds to 44.77 seconds. Also, the overall project completion time changed once the model was driven by the field-based activity durations. These improvements can potentially affect the planning of project tasks scheduled for the immediate future as far as resource arrangements and combinations are concerned. Table 6.2 summarizes the results of this comparative validation example.
Table 6.2: Comparison between Estimated Durations and Actual Durations Based on Real Time Data
CHAPTER 7: CONCLUSIONS AND FUTURE WORK
7.1 Conclusions
Operations level planning and control is one of the most critical components of managing ongoing activities on a construction site. Proper resource planning and control can guarantee that the best possible arrangement of resources is deployed, which will, in turn, result in substantial savings in project completion time and cost. To this end, simulation modeling has gained significant credibility during the past several years as a powerful tool for analyzing complex construction operations. Commonly, many simulation models rely on static inputs such as historical data and expert judgment to represent real engineering systems. The suitability of this approach for modeling construction operations, however, has always been a challenge since most construction projects are unique in nature, and every project is different in design, specifications, methods, and standards. Therefore, there is a significant need for a methodology that not only enables the modeling of the main entities and logical relationships in a real system, but also allows real time changes to be incorporated into the simulation model.
The major requirement of a modeling platform capable of precisely representing a real world construction system is a data collection scheme capable of providing the simulation model with the latest information about the status of underlying processes and project entities. Given the dynamic nature and complexity of many construction processes, manually collecting such data and feeding it into the simulation model is a tedious if not impossible task; thus, it is necessary to employ an automated system for collecting the required data and converting it to a format meaningful to the simulation model.
This Thesis document reported on a study conducted to investigate the requirements and prospects of such a system. In the developed framework, a real time data-driven paradigm was integrated with traditional discrete event simulation (DES) modeling to create a single decision-making framework for short-term scheduling and system control. The framework is capable of automatically collecting real time operational data from construction equipment and subsequently sorting, analyzing, and using them to create real time 3D animations of the concurrent construction processes, and also updating the simulation model describing the real operations based on the latest trends in the data. The framework was implemented using the OSG visualization toolkit along with a graphical programming and data collection platform, LabVIEW. To validate the developed methods, 3D orientation trackers were used to collect motion data from moving parts of model construction equipment, and the collected data was analyzed and transformed into a format meaningful for the simulation and visualization modules.
The following summarizes the main milestones of this research that have been
successfully achieved:
A data collection platform was developed in LabVIEW for collecting angular data
from 3D orientation trackers that transmitted data over an RS-232 serial port
interface.
Data classification and analysis algorithms were developed in LabVIEW for real time analysis of raw data and for converting them to a proper format for use as input by the simulation and visualization modules.
An ActiveX-based middleware was implemented to establish interoperability between the data collection and analysis module, the object-oriented programming platform used for visualization, and the discrete event simulation engine.
7.2 Future Work
The presented research is part of a much larger ongoing project which aims to facilitate
the integration of real time operational data into the construction decision-making
process. The next step in developing the current system will include communication methods to capture Real Time Kinematic (RTK) GPS data for location tracking of
construction equipment and also the deployment of more efficient orientation trackers
that can adequately handle specific conditions of the jobsite in terms of communication
range, accuracy, and ambient noise. In addition to spatio-temporal data (i.e. position,
orientation), payload information is another potential source of data that can be collected
and used to determine the state of equipment involved in operations such as earthmoving
or steel erection where material is transported from one location to another. There are
also other types of data that are not necessarily related to construction resources but can
potentially affect the progress of field activities. Examples include weather-related (e.g.
temperature, humidity) data and soil and topography data. To this end, future work in this research will include the design and implementation of robust algorithms to collect, fuse, and analyze such heterogeneous data streams.
Also, there is a need to examine the developed pre-processed visualization module in real jobsite conditions in order to further evaluate the advantages and identify potential shortcomings of the current framework. In addition, work needs to be done to improve the mathematical efficiency and statistical accuracy of the framework in order to more effectively handle, fuse, and process large volumes of raw incoming data, especially when multiple heterogeneous data collection devices are used.
Automating and optimizing equipment operations is another potential area for future work in this research. To achieve this, machine learning methods will be investigated to develop a self-learning system capable of observing activities that involve resource (i.e. equipment and crew) interactions, learning recurring trends and cyclic motions, and subsequently generating knowledge-based action plans to improve future operations.
APPENDIX A: C++ ALGORITHMS FLOWCHARTS
As stated in Chapter 5, the computing platform developed in this research takes advantage of several C++ functions for scene creation and animation. This Appendix presents flowcharts that describe how the different programming modules communicate and what type of data is transferred between these modules. The illustrated flowcharts are only intended to supplement the discussion of the topics introduced in previous Chapters and to help interested readers gain a better understanding of the data flow in the developed platform.
There are four major C++ functions used inside the .NET environment. These functions include:
CreateAnimationPath()
This function plays the most critical role since it facilitates communication between C++ and LabVIEW to capture and store angular data as vectors and to create and return an animation path (a minimal sketch of this idea appears after the function descriptions below).
CreateMovingModel()
This function imports 3D CAD files of articulated parts of model construction equipment and creates the scene graph nodes required for constructing each object in the scene. This function creates and returns an intermediate (i.e. group) node.
CreateModel()
This function defines the origin of the coordinate system used to create the visualization scene. It also attaches the group node model to the root node and returns the root node.
Main()
This function initializes a LabVIEW interface, tilts the scene to arrive at the desired viewpoint, sets the scene to render, and finally runs the animation.
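The following C++ fragment is a minimal, generic illustration of the animation path idea behind CreateAnimationPath(): time-stamped orientations are inserted into an osg::AnimationPath that later drives a transform node. The angle values and function bodies are placeholders and do not reproduce the actual LabVIEW-fed implementation.

```cpp
#include <osg/AnimationPath>
#include <osg/Math>
#include <osg/MatrixTransform>

// Build an animation path from a few time-stamped boom pitch angles (degrees).
// In the actual framework the angles are supplied by LabVIEW at run-time;
// the values below are placeholders.
osg::ref_ptr<osg::AnimationPath> createAnimationPath() {
    osg::ref_ptr<osg::AnimationPath> path = new osg::AnimationPath;
    path->setLoopMode(osg::AnimationPath::NO_LOOPING);

    const double times[]  = {0.0, 1.0, 2.0, 3.0};     // seconds
    const double angles[] = {0.0, 25.0, 45.0, 10.0};  // boom pitch

    for (int i = 0; i < 4; ++i) {
        osg::Quat rotation(osg::DegreesToRadians(angles[i]), osg::Vec3d(1.0, 0.0, 0.0));
        path->insert(times[i],
                     osg::AnimationPath::ControlPoint(osg::Vec3d(0.0, 0.0, 0.0), rotation));
    }
    return path;
}

// Attach the path to a transform node so OSG re-orients the boom every frame.
osg::ref_ptr<osg::MatrixTransform> createMovingModel(osg::Node* boomGeometry) {
    osg::ref_ptr<osg::MatrixTransform> xform = new osg::MatrixTransform;
    xform->addChild(boomGeometry);
    xform->setUpdateCallback(new osg::AnimationPathCallback(createAnimationPath().get()));
    return xform;
}
```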
Figures A.1 through A.4 illustrate detailed flowcharts of the above functions.
Figure A. 1: CreateAnimationPath() Function Flowchart
Figure A. 2: CreateMovingModel() Function Flowchart
Figure A. 3: CreateModel() Function Flowchart
Figure A. 4: Main() Function Flowchart
APPENDIX B: LabVIEW GRAPHICAL PROGRAMMING AND
ALGORITHMS
LabVIEW¹ (i.e. Laboratory Virtual Instrument Engineering Workbench) is a product of National Instruments (NI) and is a platform for designing engineering and scientific measurement and control systems. LabVIEW uses graphical programming (G), a dataflow language in which functional nodes are connected through "wires". This approach provides an efficient way of handling and processing data compared to text-based languages that execute code in a sequential, line-by-line manner. LabVIEW also offers built-in tools designed for data acquisition, analysis, and presentation.
LabVIEW programs are called virtual instruments (VIs). Each VI has two windows: the user interface, which is called the Front Panel, and the graphical code, called the Block Diagram. The Front Panel provides users with interactive controls such as buttons, gauges, graphs, and tables as well as tools to save data files or automatically generate reports.
The Block Diagram, on the other hand, consists of icons and nodes that are connected
together via wires. Figure B.1 shows a customized Front Panel and the corresponding
Block Diagram.
¹ LabVIEW is a registered trademark of National Instruments (NI).
Figure B. 1: A Customized VI - The Upper Window is the Front Panel and the Bottom
Window is the Block Diagram
Algorithms for the data collection and data analysis components of the presented framework were developed using LabVIEW. To this end, a Plug and Play (P&P) instrument driver initially developed by NI was modified, customized, and appended to meet the requirements of this research. P&P instrument drivers are open source, well-documented libraries and can be customized by the end user to perform specific tasks. In this research, an instrument driver was used for communication with the orientation trackers employed for data collection via the RS-232 protocol for serial communication.
The Virtual Instrument Software Architecture (VISA), a standard I/O language and application programming interface (API) for sensor programming, was used in this research. VISA basically facilitates port communication by providing needed operations such as opening, writing to, reading from, and closing a port. Figures B.2 through B.5 show special nodes in LabVIEW for each of the indicated tasks.
Figure B. 2: VISA Open Opens the Specified Port by the VISA Resource Name
Figure B. 3: VISA Write Writes Data to the Specified Port by the VISA Resource Name
Figure B. 4: VISA Read Reads Data from the Specified Port by the VISA Resource Name
Figure B. 5: VISA Close Closes the Specified Port by the VISA Resource Name
Also, Figure B.6 illustrates a simplified layout of how these VISA functions are
connected to each other via wires in the developed data collection system.
Figure B. 6: A Series of VISA Functions and Their Connections as Used in this Research
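For readers more familiar with text-based code, the same open/write/read/close sequence can be expressed through the NI-VISA C API, as in the hedged sketch below. The resource name (the COM port) and the command string sent to the tracker are placeholders; the actual implementation in this research uses the graphical VISA nodes shown above.

```cpp
// Minimal NI-VISA serial session sketch (requires an NI-VISA installation).
#include <cstdio>
#include <cstring>
#include "visa.h"

int main() {
    ViSession rm = VI_NULL, port = VI_NULL;
    ViUInt32 count = 0;
    char buffer[256] = {0};

    if (viOpenDefaultRM(&rm) < VI_SUCCESS) return 1;            // resource manager
    // "ASRL1::INSTR" is a placeholder VISA resource name for a serial (COM) port.
    if (viOpen(rm, (ViRsrc)"ASRL1::INSTR", VI_NULL, VI_NULL, &port) < VI_SUCCESS) {
        viClose(rm);
        return 1;
    }

    // Placeholder request string; a real tracker expects its own command set.
    const char* request = "GET_ORIENTATION\r\n";
    viWrite(port, (ViBuf)request, (ViUInt32)std::strlen(request), &count);
    viRead(port, (ViBuf)buffer, sizeof(buffer) - 1, &count);    // read the reply
    std::printf("Received %u bytes\n", static_cast<unsigned>(count));

    viClose(port);   // close the port, then the resource manager
    viClose(rm);
    return 0;
}
```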
Data classification and analysis algorithms were also designed in the same VI. Figures
B.7 through B.12 show built-in functions that were used to configure a relatively
sophisticated graphical code capable of real time extraction of activity durations from
angular raw data. Necessary information has been provided in each Figure caption.
Figure B. 7: Requested Data Classified from Cluster of Real Time Orientation Data
Figure B. 8: Unbundled By Name Function that Returns Cluster Elements Whose Names
Have Been Specified
Figure B. 9: Greater? Function Returns True If x Is Greater than y - This Function Was
Used to Detect Data Exceeding a Specified Threshold
Figure B. 10: Tick Count Function That Returns the Value of a Timer – This Function
Was Used to Measure the Duration of Each Activity
Figure B. 11: Build Array Function to Store Activity Durations in a Numerical Array
Figure B. 12: Statistics Tool Returns the Specified Statistical Characteristics of Input
Arrays
In addition to the VI elements described above, a number of other functions, tools, and controls were also developed and used for constructing the VI, for example Case Structures, While Loops, and Point-By-Point Analysis functions, each of which performs specific tasks under different conditions. The descriptions and technical details of these elements are, however, beyond the scope of this document. Interested readers are encouraged to contact the author or the DESIMAL research group at the University of Central Florida.
REFERENCES
[1] D.W. Halpin, L.S. Riggs, Planning and analysis of construction operations, Wiley,
New York, 1992.
[2] R.F. Cox, R.R.A. Issa, D. Aherns, Management's perception of key performance
indicators for construction, J. Constr. Eng. Manage. 129 (2) (2003) 142-151.
[3] W. Ibbs, Quantitative impacts of project change: size issues, J. Constr. Eng. Manage.
123 (3) (1997) 08-11.
[4] NRC, National Research Council (NRC), Federal Facilities Council Technical Report
No. 149 Reducing Construction Costs: Uses of Best Dispute Resolution Practices by
Project Owners, N.A. Press, Washington, DC, 2007.
[5] D.W. Halpin, CYCLONE - A Method for Modeling Job Site Processes, ASCE Journal of the Construction Division 103 (3) (1977) 489-499.
[6] B.C. Paulson, W.T. Chan, C.C. Koo, Construction operation simulation by
microcomputer, J. Constr. Eng. Manage. 113 (302) (1987) 302-314.
[8] J. Martinez, P.G. Ioannou, General purpose simulation with stroboscope, in:
Proceeding of the 1994 Winter Simulation Conference (WSC), Association for
Computing Machinery (ACM), Lake Buena Vista, FL, 1994, pp. 1159-1166.
[9] J. Shi, Activity-Based construction ABC modeling and simulation method, J. Constr.
Eng. Manage. 12 (354) (1999) 354-360.
[10] V.R. Kamat, J.C. Martinez, General-purpose 3D animation with VITASCOPE, in:
Proceeding of the 2004 Winter Simulation Conference, Washington, DC, 2004.
[11] A.H. Behzadan, V.R. Kamat, Automated generation of operations level construction
animations in outdoor augmented reality, J. Comput. Civil Eng. 23 (6) (2009) 405-417.
[12] L. Song, A framework for real-time simulation of heavy construction operations, in:
Proceeding of 2008 Winter Simulation Conference (WSC), Miami, FL, 2008, pp. 2387-
2395.
[13] G. Adkins, U.W. Pooch, Computer Simulation: a Tutorial, Computer 10 (4) (1977)
12-17.
[15] W.S. Jang, M.J. Skibniewski, Wireless sensor technologies for automated tracking
and monitoring of construction materials utilizing Zigbee networks, in: Proceeding of
ASCE Construction Research Congress, Grand Bahamas Island, 2007.
[16] A.H. Behzadan, Z. Aziz, C.J. Anumba, V.R. Kamat, Ubiquitous location tracking for
context-specific information delivery on construction sites, Autom. Constr. 17 (6) (2008)
737-748.
[20] A.H. Behzadan, V.R. Kamat, Interactive augmented reality visualization for
improved damage prevention and maintenance of underground infrastructure, in:
Proceeding of the 2009 Construction Research Congress, Seattle, WA, 2009, pp. 1214-
1222.
[24] G.S. Cheok, R.R. Lipman, C. Witzgall, J. Bernal, W.C. Stone, NIST Construction
Automation Program Rep. No: 4, Non-Intrusive Scanning Technology for Construction
Status Determination, Gaithersburg, Md, 2000.
[26] S. Taneja, A. Akcamete, B. Akinci, J.H. Garrett, E.W. East, L. Soibelman, Analysis
of Three Indoor Localization Technologies to Support Facility Management Field
Activities, in: Proceeding of International Conference on Computing in Civil and
Building Engineering, Nottingham, UK, 2010.
[27] K.S. Saidi, A.M. Lytle, W.C. Stone, Report of the NIST Workshop on Data
Exchange Standards at the Construction Job Site, in: Proceeding of 20th International
Symposium on Automation and Robotics in Construction (ISARC), Eindhoven, The
Netherlands, 2003, pp. 617-622.
[28] J. Song, C.T. Haas, C. Caldas, E. Ergen, B. Akinci, Automating the task of tracking
the delivery and receipt of fabricated pipe spools in industrial projects, Autom. Constr. 15
(2) (2006) 166-177.
[29] J. Song, C.T. Haas, C. Caldas, Tracking the location of materials on construction job
sites, J. Constr. Eng. Manage. 132 (9) (2006) 911-918.
[30] J. Teizer, M. Venugopal, A. Walia, Ultrawideband for Automated Real-Time Three-
Dimensional Location Sensing for Workforce, Equipment, and Material Positioning and
Tracking, Journal of Transportation Research Record 2081 (6) (2008) 56-64.
[31] D.T. Grau, C.H. Caldas, Methodology for automating the identification and
localization of construction components on industrial projects, J. Comput. Civil Eng. 23
(1) (2009) 3-13.
[32] H.M. Khoury, V.R. Kamat, Indoor User Localization for Context-Aware
Information Retrieval in Construction Projects, Autom. Constr. 18 (4) (2009) 444-457.
[33] E.J. Jaselskis, M.R. Anderson, C.T. Jahren, Y. Rodrigues, S. Njos, Radio Frequency
Identification Applications in Construction Industry, J. Constr. Eng. Manage. 121 (2)
(1995) 189-196.
[34] G.M. Jog, I.K. Brilakis, D.C. Angelides, Testing in harsh conditions: Tracking
resources on construction sites with machine vision, Autom. Constr. 20 (4) (2011) 328-
337.
[36] C.H. Caldas, D.T. Grau, C.T. Haas, Using global positioning system to improve
materials-locating processes on industrial projects, J. Constr. Eng. Manage. 132 (7)
(2006) 741-749.
[40] S. Razavi, O. Moselhi, Indoor Construction Location Sensing using Low Cost
Passive RFID Tags, in: Proceeding of 3rd International and 9th Construction Specialty
Conference, Ottawa, Canada, 2011, pp. CN1-10.
[43] I. Brilakis, M.W. Park, G. Jog, Automated vision tracking of project related entities,
Adv Eng Inform 25 (4) (2011) 713-724.
[44] J.C. Martinez, P.G. Ioannou, General-purpose systems for effective construction
simulation, J. Constr. Eng. Manage. 125 (4) (1999) 265-276.
[45] D.W. Halpin, An Investigation of the Use of Simulation Networks for Modeling
Construction Operations, Ph.D Dissertation, University of Illinois at Urbana-Champaign,
Illinois, 1973.
[47] J.C. Martinez, Stroboscope: State and Resource Based Simulation of Construction Processes, Ph.D. Dissertation, University of Michigan, Ann Arbor, MI, 1996.
[49] M.P. Hunter, R.M. Fujimoto, W. Suh, An investigation of real-time dynamic data
driven transportation simulation, in: Proceeding of the 2006 Winter Simulation
Conference, Monterey, CA, 2006, pp. 1414-1421.
[50] S. Tavakoli, A. Mousavi, A. Komashie, A generic framework for real-time discrete
event simulation (DES) modelling, in: Proceeding of 2008 Winter Simulation Conference
(WSC), Miami, FL, 2008, pp. 1931-1938.
[51] X. Guo, S.Y. Huang, W.J. Hsu, M.Y.H. Low, Yard crane dispatching based on real
time data driven Simulation for container Terminals, in: Proceeding of the 2008 Winter
Simulation Conference (WSC), Miami, FL, 2008, pp. 2648-2655.
[52] T.H. Chung, Y. Mohamed, S.M. AbouRizk, Bayesian updating application into
simulation in the North Edmonton Sanitary Trunk tunnel project, J. Constr. Eng. Manage.
132 (8) (2006) 882-894.
[53] S. AbouRizk, A. Hizaji, D.W. Halpin, Effect of input modeling on simulation output,
in: Proceeding of Computing in Civil Engineering (New York), 1989.
[54] V.R. Kamat, Visualizing simulated construction operations in 3D, J. Comput. Civil Eng. 15 (4) (2001) 329-337.
[55] Y. Kui, The ASDMCon project: the challenge of detecting defects on construction
sites, in: Proceeding of the Third International Symposium on 3D Data Processing,
Visualization, and Transmission (3DPVT'06), 2006.
[56] J.J. Shi, Iconic animation of construction simulation, in: Proceeding of the 1999 Winter Simulation Conference, 1999, pp. 992-997.
[58] F. Darema, Dynamic data driven applications systems: new capabilities for
application simulations and measurements, in: Proceeding of Computational Science --
ICCS 2005, Atlanta, USA, 2005, pp. 610-615.
[59] National Science Foundation (NSF), Dynamic Data Driven Application Systems
(DDDAS), 2000 NSF Sponsored Workshop Report,
http://www.nsf.gov/cise/cns/dddas/dd_das_work_shop_rprt.pdf, (accessed Sep. 29,
2011).
[60] J. Mandel, Towards a dynamic data driven application system for wildfire
simulation, in: Proceeding of Computational Science - ICCS 2005. 5th International
Conference. , 2005.
[62] M. Gaynor, A dynamic, data-driven, decision support system for emergency medical
services, in: Proceeding of Computational Science - ICCS 2005. 5th International
Conference., 2005.
[64] D.R. Nadeau, Visualizing stars and emission nebulas, Computer graphics forum 20
(1) (2001) 27-33.
[66] V.R. Kamat, J.C. Martinez, Scene graph and frame update algorithms for smooth
and scalable 3D visualization of simulated construction operations, Journal of Computer-
aided civil and infrastructure engineering 17 (4) (2002) 228-245.
[69] J.G. Proakis, M. Salehi, Digital communications, 5th ed., McGraw-Hill, Boston,
2008.
[70] PNI, User Manual, CompassPoint Prime, 3-Axis Electronic Compass Module, in:
PNI (Ed.), PNI Sensor Corporation, Santa Barbara, CA, 2011.
[71] National Instruments (NI) Website, http://www.ni.com/, (accessed Mar. 13, 2012).
[75] A.R. Pradhan, B. Akinci, An Approach for Fusing Data from Multiple Sources to Support Construction Productivity Monitoring, in: Proceeding of International Conference on Computing in Civil and Building Engineering, Nottingham, UK, 2010.