
A Formalism for Utilization of Autonomous Vision-

Based Systems and Integrated Project Models for


Construction Progress Monitoring

Kevin K. Han
Department of Civil and Environmental Engineering
University of Illinois at Urbana-Champaign
205 N. Mathews Ave
Urbana, IL 61801
[email protected]

Jacob J. Lin
Department of Civil and Environmental Engineering
University of Illinois at Urbana-Champaign
205 N. Mathews Ave
Urbana, IL 61801
[email protected]

Mani Golparvar-Fard
Department of Civil and Environmental Engineering & Computer Science
University of Illinois at Urbana-Champaign
205 N. Mathews Ave
Urbana, IL 61801
[email protected]

ABSTRACT
Actual and potential progress deviations during construction are costly and preventable.
However, today’s monitoring programs cannot easily and quickly detect and manage
performance deviations. This is because (1) the current methods are based on manual
observations made at specific locations and times; and (2) the captured progress information is
not integrated with “as-planned” 4D Building Information Models (BIM). To facilitate this process,
construction companies have focused on collecting as-built visual data through hand-held
cameras and video recorders. They have also assigned field engineers to filter, annotate,
organize, and present the collected data in comparison to 4D BIM. However, the cost and
complexity associated with collecting, analyzing, and reporting operations still result in sparse
and infrequent monitoring, and therefore a portion of the gains in efficiency is consumed by
monitoring costs. To address current limitations, this paper outlines a formal process for
automating construction progress monitoring using visual data captured via camera-equipped
Unmanned Aerial Vehicles (UAVs) and 4D BIM. More specifically, for data collection, formal
methods are proposed to identify monitoring goals for the UAVs using 4D BIM and to
autonomously acquire and update the necessary visual data. For analytics, several methods
are proposed to generate 3D and 4D as-built point cloud models using the collected visual data,
to integrate them with BIM, and to automatically conduct appearance-based and geometrical
reasoning about progress deviations. For reporting, a method is proposed to characterize the
analyzed and identified progress deviations using performance metrics such as the Earned
Value Analysis (EVA) or Last Planner System (LPS) concepts. These metrics are then
visualized via color-coding the BIM elements in integrated project models and presented to
project personnel via a scalable and interactive web-based environment. The validation of this
formalism is discussed based on several real-world case studies.

Proceedings of the 2015 Conference on Autonomous and Robotic Construction of Infrastructure, Ames, Iowa. © 2015
by Iowa State University. The contents of this paper reflect the views of the author(s), who are responsible for the
facts and accuracy of the information presented herein.

Key words: progress monitoring—vision-based system—unmanned aerial vehicles—BIM

INTRODUCTION
Actual and potential progress deviations during the construction of buildings and infrastructure
systems are costly but preventable. To capture construction performance deviations
accurately and to make effective and prompt decisions based on the captured performance
deviations, frequent and accurate methods are needed for tracking construction progress. If
implemented on a daily basis, such methods can effectively bridge the gap in information
sharing between daily work execution and weekly work planning/coordination, ultimately leading
to improved project efficiencies (Bosché et al. 2014; Yang et al. 2015).

Today’s construction progress tracking methods, however, do not provide the accuracy and
frequency necessary for effective project controls. This is largely due to the labor-intensive
processes involved with as-built data collection and analysis (Bae et al. 2014). It is common for
field engineers and superintendents to walk around the site, take photos, and document the
progress of ongoing operations; nevertheless, this process is time-consuming, as it needs to be
conducted for each daily task and for all associated construction elements. The ability to perform
monitoring and photographic documentation is also restricted by hard-to-reach areas. This
constraint negatively impacts the completeness of the data collection process. Once this data is
collected (whether it is complete or not), the field engineers filter, annotate, organize, and
present the collected data in comparison to 4D BIM. However, the cost and complexity
associated with the analysis and reporting operations result in sparse and infrequent monitoring
(Bae et al. 2013). Therefore, through these project-control activities, a portion of the gains in
efficiency is consumed by monitoring costs.

Over the past few years, research has focused on addressing current limitations through (1)
automatically generating 3D point cloud models from unstructured images and video sequences
(Brilakis et al. 2011; Golparvar-Fard et al. 2009), (2) aligning the resulting 3D point cloud models
with 4D BIM (Golparvar-Fard et al. 2011), and (3) automatically inferring the status of work
tasks and their relevant elements using geometry information (Golparvar-Fard et
al. 2012), appearance information (Han and Golparvar-Fard 2015), and via leveraging
formalized construction sequencing knowledge and reasoning mechanisms (Han and
Golparvar-Fard 2014a). While significant progress has been achieved, many areas of research
remain open and require further investigation.

This paper presents a formal process for automating construction progress monitoring via
images taken with camera-equipped Unmanned Aerial Vehicles (UAVs) and 4D BIM. To
address the current limitations in accuracy and completeness of the data collection, we propose
to build on the emerging practice of using camera-equipped UAVs for collecting close-range site
imagery. Different from these practices, we propose to use 4D BIM to identify monitoring goals
for visual data collection, path planning, and autonomous navigation of the UAV through both
exterior and interior construction scenes. We also present a method that leverages BIM (1) to
improve accuracy and completeness of the 3D image-based point cloud modeling via UAV
captured images, and (2) for a better alignment of the images and the resulting point cloud
models with the 4D BIM. For analytics, methods are presented to leverage geometry,
appearance, and inter-dependency information among elements together with formalized
construction sequencing knowledge to document progress for work tasks that have
corresponding physical elements. A new system architecture is also proposed for visualizing the
collected images, produced 3D point cloud models, and 4D BIM in a scalable web-based
environment. This environment allows user interactions, which are particularly important for
collecting inputs on task constraints and on tasks that do not necessarily have physical
element correspondences. As shown in Figure 1, the resulting integrated project models can be
used to minimize the gap in information communication between work task coordination and
daily task execution. It can also support the identification and removal of work constraints and
root-cause analysis in construction coordination meetings.

[Figure 1 flowchart: Master Scheduling (prepare master schedule, look-ahead schedule,
constraint analysis) feeds Weekly Work Planning and Coordination (commit to and coordinate
the weekly work plan), which feeds Daily Work Execution and Performance Monitoring and
Controls (lead and conduct work as planned, monitor work progress, check for progress
deviations and quality issues, inspect and confirm completed tasks, report performance
problems). The integrated project model (plan + as-built) supports pulling information across
these stages.]
Figure 1. The planning, execution and monitoring of weekly work plans and how
integrated project models generated via images taken from camera-equipped UAVs and
BIM can improve current work flows.

In the following, we provide an overview of the state of practice and research on using
camera-equipped UAVs and BIM for progress monitoring. Next, we propose a formalized
procedure for leveraging 4D BIM for enhanced data collection, analytics, and communication of
construction progress deviations. A discussion is also provided on how the proposed procedure
and its associated tools can improve current information sharing and enhance construction
coordination processes.

STATE OF PRACTICE AND RESEARCH ON USING CAMERA-EQUIPPED UAVS AND BIM
FOR CONSTRUCTION PROGRESS MONITORING
Data collection
Over the past two years, the application of camera-equipped UAVs for comprehensive visual
documentation of work-in-progress on construction sites has gained significant popularity
among Architecture/Engineering/Construction and Facility Management (AEC/FM)
practitioners. Many construction and owner companies have procured their own UAV platforms
for frequent data collection. Several companies have also emerged that provide close range
aerial image data collection as a service. Ideally these UAVs should operate on sites on a daily
basis such that an accurate and complete visual documentation of work in progress can be
achieved.

To operate the UAVs autonomously, operators manually place waypoints on 2D maps and
leverage GPS coordinates for path planning and navigation purposes. The reliance on GPS for
navigation limits autonomous data collection to outdoors and causes major difficulties in dense
urban areas. Also the presence of steel components on sites affect the accuracy of the
magnetometers used on these platforms for navigation purposes, potentially causing safety
issues on construction sites. Placing waypoints manually can also create potential safety
hazards as the 2D maps used for navigation do not reflect the most updated status of the
resources on site, and leave the UAV operators to approximate the 2D location and height of
the new construction and resources such as mobile cranes.

Because the locations where construction progress is expected on project sites are not known
in advance, current best practices require the UAVs to cover the entirety of a site. Since current
batteries are limited to 15-35 minutes of operation, covering an entire project site requires
multiple flights and manual supervision by the UAV operators. As the number of images
increases, the computation time necessary for processing these images, generating 3D point
cloud models, and analyzing work-in-progress also grows rapidly.

While research (Lin et al. 2015a; Siebert and Teizer 2014; Zollmann et al. 2014) has focused on
leveraging UAV-based images for progress monitoring, the aforementioned challenges remain
largely unexplored. Figure 2 (a) shows a camera-equipped UAV flying onsite, (b) an operator
using the controller to navigate the UAV for data collection, and (c) a commercially available
application (DJI Ground Station 2015) for setting the waypoints for UAV navigation.

Figure 2. (a) camera-equipped UAV; (b) operator executing a data collection task on the
construction site; (c) setting waypoints in a commercially available application (DJI Ground
Station 2015).

Visual Analytics for Progress Tracking


Today, there are two dominant practices for leveraging images collected from camera-equipped
UAVs for tracking work in progress:

(1) Generating large panoramic images of the site and superimposing these large-scale, high-
resolution images over existing maps: While these images provide excellent visuals of ongoing
operations, they lack the 3D information needed for the area-based and volume-based
measurements necessary for progress monitoring. Also, none of the current commercially
available platforms provide a mechanism to communicate who is working on which tasks at
what location.

(2) Producing 3D point cloud models: Over the past decade, the state of the art in image-based
3D modeling methods from the computer vision domain has significantly advanced. These
developments have led to several commercially available platforms that can automatically
produce 3D point cloud models from collections of overlapping images. Several AEC/FM firms
have started to leverage these platforms to produce as-built 3D point cloud models of their
project sites via images taken from camera-equipped UAVs. Nevertheless, today's practices are
mainly limited to measuring excavation work and stockpiles. This is because highly overlapping
images taken with a top-down view can produce high-quality 3D models of these operations.
However, creating complete 3D point cloud models for building and infrastructure systems also
requires the UAVs to fly around the structure to capture work in progress. Because there is no
automated mechanism for identifying the most informative viewpoints, the produced 3D point
cloud models are often incomplete. Also, the state-of-the-art Structure from Motion (SfM)
techniques for image-based 3D reconstruction, as used in (Golparvar-Fard et al. 2012;
Golparvar-Fard et al. 2011), may distort angles and distances in the generated point cloud
models. Figure 3 shows an example of a point cloud model that was generated from highly
overlapping images taken around the perimeter of a construction site. The reconstructed 3D
point cloud is only known up to scale and exhibits problems in completeness and accuracy.

Figure 3. Typical challenges in using standard SfM techniques for image-based 3D
reconstruction: (a) the UAV's flight path around the project site for data collection, where
the location and orientation of the images taken are shown with small pyramids; (b) the
produced 3D point cloud model is incomplete; (c) distortions in angle and distance; and
(d) the reconstructed 3D point cloud is only known up to scale and is unitless.

Comparing Image-based 3D Point Clouds and BIM


Once the as-built 3D point cloud models are generated and are integrated with as-planned BIM,
the resulting integrated project models can be used to identify progress deviations. The state-of-
the-art methods for identifying these deviations mainly fall into two categories:

(1) Analyzing the physical occupancy of the as-built models: Research on occupancy-based
assessment methods uses 3D point cloud models and BIM to monitor whether or not structural
elements, Mechanical/Electrical/Plumbing (MEP) components, or temporary resources such as
formwork and shoring are physically present in the as-built point cloud model. These methods
are still challenged by the lack of sufficient detail in (1) the BIM and (2) the work breakdown
structure of the schedule. The 4D BIM may not have a physical representation for all elements,
particularly for temporary resources. Hence, model-driven monitoring is not possible for all
elements. The lack of formalized construction sequencing knowledge in 4D BIM, such as the
steps in placement of concrete elements (i.e., forming, reinforcing, placing, and stripping), also
adds to the complexity of the assessments; without knowing exactly when each element is
expected to be placed, identifying the most up-to-date progress status is not possible.
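A minimal sketch of such an occupancy test, assuming the point cloud has already been registered to the BIM coordinate system and simplifying each element to an axis-aligned bounding box; the `min_points` threshold is an illustrative tuning parameter, not a value from the cited work:

```python
def element_occupied(points, bbox_min, bbox_max, min_points=50):
    """Occupancy-style progress test for one BIM element: after the point
    cloud is registered to the BIM coordinate system, an element is judged
    'present' when enough as-built points fall inside its bounding box.

    points:  iterable of (x, y, z) as-built points
    bbox_min, bbox_max: element's axis-aligned bounding box corners
    min_points: evidence threshold (site-dependent tuning parameter)
    """
    inside = 0
    for x, y, z in points:
        if (bbox_min[0] <= x <= bbox_max[0] and
                bbox_min[1] <= y <= bbox_max[1] and
                bbox_min[2] <= z <= bbox_max[2]):
            inside += 1
            if inside >= min_points:
                return True  # early exit once evidence is sufficient
    return False
```

In practice, the occupancy methods in the literature use the actual element geometry and visibility reasoning rather than a simple bounding-box count; this sketch only illustrates the point-versus-expected-volume idea.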

(2) Analyzing appearance changes in the as-built models: The latest research on
appearance-based assessment of as-built models superimposes the BIM on the 3D point
cloud models and then back-projects the BIM elements onto the images used to generate the
point cloud model. From these back-projections, several square image patches are
sampled and used to analyze the observed construction material. The material most frequently
observed in these image patches is then used to infer the most up-to-date progress status for
each element in 4D BIM (Han and Golparvar-Fard 2015).
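The patch-based inference step can be sketched as a simple majority vote over per-patch material labels; the labels and the material-to-state mapping below are illustrative assumptions, and the actual classifier in (Han and Golparvar-Fard 2015) is more involved:

```python
from collections import Counter

def infer_element_state(patch_materials, state_by_material, min_votes=3):
    """Infer an element's progress state from the materials observed in the
    image patches back-projected from that element.

    patch_materials:   material label predicted for each sampled patch,
                       e.g. ["formwork", "concrete", "concrete", ...]
    state_by_material: maps a material to the progress state it implies,
                       e.g. {"formwork": "formed", "concrete": "placed"}
    min_votes:         minimum evidence before committing to a state
    """
    # Count only materials that map to a known progress state.
    votes = Counter(m for m in patch_materials if m in state_by_material)
    if not votes:
        return "unknown"
    material, count = votes.most_common(1)[0]
    return state_by_material[material] if count >= min_votes else "unknown"
```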

While significant advancements in research have been made, applying these methods to
full-scale projects still requires (1) accounting for the lack of detail in 4D BIM, (2) addressing as-
built visibility issues, (3) creating large-scale libraries of construction materials that can be
used for appearance-based monitoring purposes, and (4) developing methods that can jointly
leverage geometry, appearance, and interdependency information in BIM for monitoring purposes.

Visualization and Information Communication


Today’s platforms for visualizing and communicating work in progress can produce 2D
panoramas and 3D point cloud models (with or without BIM) for highlighting the work in
progress and deviations from the plan. The current 2D interfaces build on panoramic images and
present a top-down view of project sites and ongoing operations. These project-controls
interfaces also provide tools for annotating the imagery for monitoring work in progress and field-
reporting purposes. However, the provided functions are not sufficient for practical progress
monitoring and information communication. For example, none of the current interfaces envision
any workflow as to how the visuals can facilitate information sharing and communication.

The 3D interfaces also have a number of challenges that currently prevent their widespread
application. These interfaces show either the BIM or the 3D point cloud models; hence, it is
difficult to differentiate and highlight the changes between the isolated as-planned and as-built
models. Also, the measurement tools in these interfaces are only capable of basic interactive
operations such as 3D volumetric measurements from point cloud models. To facilitate
information sharing and communication, the integrated project models, produced by
superimposing BIM and the 3D point cloud models, should remain accessible to all onsite and
offsite practitioners. Without access to scalable and interactive web-based systems that
can support such functionalities, it will be difficult to use the integrated project models to support
a smooth flow of information among project participants.

Overall, there is a lack of an end-to-end formalized procedure that accounts for visual
as-built data collection, progress monitoring analytics, and visualization. In the following, a
formal procedure is presented that can address current limitations. The opportunities for further
research in each step of the procedure are discussed as well.

FORMAL METHODS FOR AUTONOMOUS MONITORING OF CONSTRUCTION PROGRESS
This section proposes a procedure for autonomous vision-based monitoring of construction
progress, which consists of 1) data collection using camera-equipped UAVs, 2) vision-based
analytics and comparison with 4D BIM for reasoning about progress deviations, and 3)
performance analysis and visualization. Figure 4 illustrates these steps in detail.

Figure 4. A procedure for autonomous vision-based monitoring of construction progress

Data collection
Figure 5 shows two different procedures for collecting visual data using UAVs for construction
progress monitoring. In the first procedure, the images are directly used to produce 3D and 4D
point cloud models. In the procedure shown in Figure 5b, the 4D BIM is used to assist with the
data collection process. Since the 4D BIM contains information about where changes are
expected to happen on a construction site, it can serve as a strong basis for identifying scanning
goals, planning paths, and navigating. This strategy can also potentially address the current
limitations associated with path planning and navigation of UAVs in interior spaces and
dense urban areas. This is particularly important because current methods primarily rely on
GPS for UAV control, whereas reliable GPS is not available in indoor scenes or dense urban
areas.

To support a complete and accurate image-based reconstruction, whether BIM is used for data
collection or not, images should be taken with an overlap of at least 60-70%. This rule of thumb
guarantees detection of a sufficient number of visual features in each image, which is typically
required for standard Structure from Motion procedures. Taking images with such overlaps
requires adjusting and controlling the flight path and speed of the UAV. A BIM-guided data
collection process, as discussed in (Lin et al. 2015b), will certainly require fewer images
and can more intelligently assist with choosing the informative views necessary for progress
monitoring purposes.
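As a rough illustration, the overlap rule of thumb above can be turned into a capture-spacing and speed budget for the flight path. The sketch below assumes a nadir-pointing camera, and the field-of-view and shutter-interval values in the example are hypothetical inputs, not parameters of the systems described here:

```python
import math

def capture_spacing(altitude_m, fov_deg, overlap):
    """Distance between consecutive image captures that yields the
    requested forward overlap, for a camera looking straight down.

    altitude_m: flight height above the surface being imaged
    fov_deg:    camera field of view along the flight direction
    overlap:    desired overlap fraction between images (e.g. 0.7)
    """
    # Ground footprint of one image along the flight direction.
    footprint = 2.0 * altitude_m * math.tan(math.radians(fov_deg) / 2.0)
    # Advancing (1 - overlap) of a footprint between shots keeps the
    # requested fraction of each image shared with the next one.
    return (1.0 - overlap) * footprint

def max_speed(spacing_m, shutter_interval_s):
    """Fastest UAV ground speed that still achieves the spacing, given
    the camera's minimum interval between shots."""
    return spacing_m / shutter_interval_s

# Example: 30 m altitude, 60 degree FOV, 70% forward overlap,
# one shot every 2 seconds.
spacing = capture_spacing(30.0, 60.0, 0.70)  # about 10.4 m between shots
speed = max_speed(spacing, 2.0)              # about 5.2 m/s
```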


Figure 5. Two different alternatives for progress monitoring data collection

Data analytics
Alignment of the images and BIM is the first step in vision-based analytics for construction
progress monitoring. In early research (Golparvar-Fard et al. 2012), a pipeline of image-based
reconstruction procedures consisting of Structure from Motion and Multi-View Stereo algorithms
was used to generate point clouds and then transform them into the BIM coordinate system.
However, the 3D point cloud models generated through these standard procedures are only
known up to scale. To recover the scale, a user-driven process is required to select at least three
correspondences between the BIM and the point cloud. Utilizing these correspondences
between the 3D point cloud model and the BIM, the least-squares registration problem can be
solved for the 7 degrees of freedom (3 rotation, 3 translation, 1 uniform scale) to transform and
scale the point cloud into the BIM coordinate system. This manual process can be improved by
leveraging BIM as a prior and adopting a constraint-based procedure for image-based 3D
reconstruction, as shown in Figure 6 (Karsch et al. 2014). The results from preliminary
experiments in (Karsch et al. 2014) show that the accuracy and completeness of the image-
based 3D point clouds can be significantly improved by using BIM as a prior (Figure 7).

Figure 6. Leveraging BIM as a prior for as-built modeling purposes

Figure 7. BIM-assisted SfM: enhanced accuracy of overlaying BIM on site images.
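The least-squares registration step can be sketched with the closed-form SVD-based solution for a similarity transform, a standard approach and not necessarily the exact solver used in the cited work. The three or more correspondence pairs would come from the user-selected matches between the BIM and the point cloud:

```python
import numpy as np

def register_similarity(src, dst):
    """Least-squares 7-DOF transform (scale s, rotation R, translation t)
    mapping src points onto dst points: dst ~ s * R @ src + t.

    src, dst: (N, 3) arrays of at least 3 corresponding points
    (e.g. point-cloud points matched to BIM corners).
    """
    src, dst = np.asarray(src, float), np.asarray(dst, float)
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d

    # The SVD of the cross-covariance gives the optimal rotation.
    H = dst_c.T @ src_c
    U, S, Vt = np.linalg.svd(H)
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:  # guard against reflections
        D[2, 2] = -1.0
    R = U @ D @ Vt

    # Optimal uniform scale and translation follow in closed form.
    s = np.trace(np.diag(S) @ D) / (src_c ** 2).sum()
    t = mu_d - s * R @ mu_s
    return s, R, t

def apply_transform(points, s, R, t):
    """Apply the recovered similarity transform to an (N, 3) array."""
    return s * (np.asarray(points, float) @ R.T) + t
```

With exact correspondences this recovers the transform exactly; with noisy user clicks it returns the least-squares fit, which is why more than the minimum three correspondences are preferable.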

Geometry- and appearance-based progress monitoring analysis is the next step of data
analytics. As discussed in the previous section, current practices suffer from the challenges of
geometry- and appearance-based progress monitoring methods. To formalize the utilization of
integrated project models for progress monitoring and for facilitating information flows, the
following practical challenges should be addressed first:

(1) Lack of detail in as-planned models: For accurate progress monitoring analytics, the as-
planned model should contain a BIM with a Level of Development (LOD) of 400/450. It should
also reflect daily operational details within the work breakdown structure (WBS) of the schedule
that is to be integrated with the BIM. Nevertheless, in today's practice of project controls, daily
operation-level tasks are often not reflected in the WBS. Their corresponding elements, such as
scaffolding and shoring, are also not typically represented in BIM. Hence, using these models, it
is not easy to identify "who does which work at what location", especially for work related to
temporary structures, or to detect both geometry- and appearance-based changes. Formalizing
knowledge of construction sequencing and then enhancing the BIM with a relevant reasoning
mechanism can improve current progress monitoring methods. Figure 8 illustrates an example
of this issue: the rebar and formwork should have been present in the as-planned models for
accurate assessment of work-in-progress on placement of a concrete foundation wall.

Figure 8. The necessary LoD in BIM that can enable comparison between as-built and as-
planned models for progress monitoring (Han et al. 2015)

Figure 9. An example of how formalized knowledge of construction sequencing can
resolve limited-visibility issues

(2) Limited visibility: Although taking images from different viewpoints, assisted by the UAVs,
may reduce the challenges associated with lack of visibility of construction elements, any
progress monitoring method should still be able to reason about progress based on partial
element visibility. Formalizing knowledge of construction sequencing and integrating it into BIM
through a reasoning mechanism can address this issue to some degree. The preliminary study
conducted in (Han and Golparvar-Fard 2014b) shows that such formalized knowledge can
enhance the performance of vision-based progress monitoring methods by five to seven
percent. Figure 9 illustrates a case with a visibility issue that can be resolved by this
reasoning mechanism.
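As a simplified illustration of such a reasoning mechanism, the concrete-placement sequence mentioned earlier (forming, reinforcing, placing, stripping) can be encoded as an ordered list, so that observing a later state lets earlier, occluded states be inferred complete. This is only a sketch of the idea, not the formalism of (Han and Golparvar-Fard 2014b):

```python
# Construction-sequencing knowledge for cast-in-place concrete, expressed
# as an ordered list of states (earlier states must precede later ones).
SEQUENCE = ["formed", "reinforced", "placed", "stripped"]

def resolve_with_sequencing(observed_state, occluded_states):
    """If an element is only partially visible, states that precede the
    observed state in the construction sequence can still be inferred
    complete even though they cannot be verified directly.

    observed_state:  state confirmed from the visible portion, or None
    occluded_states: states that could not be verified visually
    returns: dict mapping each occluded state to an inferred status
    """
    if observed_state is None:
        return {s: "unknown" for s in occluded_states}
    reached = SEQUENCE.index(observed_state)
    return {
        s: ("complete" if SEQUENCE.index(s) <= reached else "unknown")
        for s in occluded_states
    }

# Example: concrete is observed as placed, so forming and reinforcing
# must already be complete even if formwork hides the rebar.
inferred = resolve_with_sequencing("placed", ["formed", "reinforced"])
```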

Visualization and Information Communication


Developing a scalable and interactive web-based platform for visualizing images, point cloud
models, and BIM, together with analytical tools, particularly considering the limited memory and
bandwidth of commodity smartphones and tablets, is challenging. The main requirements are:
(1) Interaction, presentation, and manipulation of large-scale point cloud models,
(2) Displaying integrated point cloud model with 4D BIM, and
(3) Offering quick and easy to use analytical and communication tools.

As a first step toward addressing these challenges, (Lin et al. 2015a) introduces a new web
platform that can visualize as-built versus as-planned models. This new platform allows users to
access integrated project models on smartphones and tablets. To present large-scale point
cloud models in a convenient and scalable manner, considering the limited memory available in
commodity smartphones and tablets, the point cloud models are structured in the form of nested
octrees, similar to (Scheiblauer et al. 2015). Figure 10 shows the developed web platform for
visualizing as-built point clouds generated by a camera-equipped UAV and the as-planned BIM.
Figure 10b in particular shows an example of the nested octree.

This data structure subsamples the point cloud and only shows relevant points depending on
the user's viewpoint. Also, the number of points projected to the same pixel on the screen is
reduced according to the level of detail chosen by the user. Loading a point cloud with a density
of 10 million points takes only approximately 2 seconds on a standard commodity smartphone.
The manipulation method can also be switched between first-person and third-person view
controls. The platform is built with BIMserver (Beetz et al. 2010) to integrate 4D BIM (Figure 10
g-h) for visualization and information retrieval purposes. The semantic information necessary for
progress monitoring (e.g., expected construction materials and element inter-dependency
information) can be queried from the BIM and presented through the integrated platform. To
assist with documenting work in progress for tasks that do not have any geometrical
representation, several new tools are also created that allow the point cloud model to be directly
color-coded and semantics such as "who does which work at what location" to be recorded for
various locations. This location-based method also allows the issues associated with level of
detail in BIM to be addressed, since users can now push and pull information from any user-
annotated location (with or without a BIM representation). Integrating various workflows, such
as the quality control workflow shown in Figure 1, is part of ongoing research.
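The nested-octree idea can be illustrated with a small sketch in which each node retains a fixed budget of points at its own level and pushes the remainder into child octants; drawing only the shallow levels then yields the coarse, memory-friendly subsample described above. This is a simplification of (Scheiblauer et al. 2015), not their implementation:

```python
def octree_levels(points, lo, hi, budget=4, depth=0, max_depth=4):
    """Assign each point a level-of-detail depth, nested-octree style:
    a node keeps up to `budget` points at its own depth and distributes
    the rest among its eight child octants. Drawing only points with
    depth <= k gives a coarse subsample that refines as k grows.
    Returns a list of (depth, point) pairs covering every input point.
    """
    out = [(depth, p) for p in points[:budget]]
    rest = points[budget:]
    if not rest:
        return out
    if depth == max_depth:  # stop refining; dump the remainder here
        return out + [(depth, p) for p in rest]
    mid = tuple((lo[i] + hi[i]) / 2.0 for i in range(3))
    children = {}
    for p in rest:
        # Octant index: one bit per axis, relative to the node midpoint.
        idx = sum((p[i] >= mid[i]) << i for i in range(3))
        children.setdefault(idx, []).append(p)
    for idx, child_pts in children.items():
        c_lo = tuple(mid[i] if idx >> i & 1 else lo[i] for i in range(3))
        c_hi = tuple(hi[i] if idx >> i & 1 else mid[i] for i in range(3))
        out.extend(octree_levels(child_pts, c_lo, c_hi,
                                 budget, depth + 1, max_depth))
    return out

def visible_subset(levels, lod):
    """The points a client would draw at a chosen level of detail."""
    return [p for d, p in levels if d <= lod]
```

A production viewer would additionally prune nodes against the view frustum and pick each node's level from its projected screen size, which is where the viewpoint dependence described above comes from.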

DISCUSSION
The integrated project model presented in the previous section can bridge the gap in information
sharing between downstream feedback (onsite activities) and short-term planning
(coordination meetings). By providing a near-real-time visualization of ongoing work, particularly
who does which work at what location, it allows the most up-to-date status of work in progress to
be communicated among all parties on and off site. The platform shows potential for supporting
easy and quick identification of potential performance problems. It can also support root-cause-
analysis discussions in coordination meetings, ultimately leading to a smoother flow of
production in construction. Instead of measuring and communicating retrospective progress
metrics such as the EVA or Percent Plan Complete (PPC) metrics, intuitive and data-driven
communication of work in progress can allow for measuring progress based on more
informative metrics such as task maturity, Tasks Anticipated (TA), and Tasks Made Ready
(TMR) (Hamzeh et al. 2015).
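For reference, the retrospective PPC metric and a simplified Tasks-Made-Ready ratio can be computed directly from weekly-plan records; the exact definitions in (Hamzeh et al. 2015) are richer, so treat this as an illustrative sketch:

```python
def ppc(committed_tasks, completed_tasks):
    """Percent Plan Complete: share of the tasks committed in the weekly
    work plan that were actually completed that week."""
    committed = set(committed_tasks)
    if not committed:
        return 0.0
    return 100.0 * len(committed & set(completed_tasks)) / len(committed)

def tmr(anticipated_tasks, ready_tasks):
    """Tasks Made Ready (simplified): share of the tasks anticipated in
    the look-ahead schedule whose constraints were removed in time,
    i.e. that became ready to enter the weekly work plan."""
    anticipated = set(anticipated_tasks)
    if not anticipated:
        return 0.0
    return 100.0 * len(anticipated & set(ready_tasks)) / len(anticipated)
```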

Figure 10. Web platform for visualizing as-built point clouds generated by a camera-
equipped UAV and the as-planned BIM: (a) as-built point cloud; (b) the nested octree for
pre-processing and visualizing the point cloud in a scalable manner; (c) the location and
orientation of the UAV-mounted camera when images were taken, derived from the
Structure from Motion algorithm; (d) viewing the point cloud models through one of the
camera viewpoints and texture-mapping the frontal face of the camera frustum; (e) area
measurement tool that operates directly on the as-built point cloud; (f) color-coding part
of the as-built point cloud to communicate who does which work at what location; (g)
and (h) integrated visualization of the as-built point cloud and the as-planned BIM for
two different construction projects.

CONCLUSION AND FUTURE WORK
This paper presented a formalized procedure that utilizes images taken by camera-equipped
UAVs together with BIM to generate integrated project models for construction progress
monitoring purposes. The various aspects of data collection, analysis, and communication as
they pertain to these integrated project models were discussed. These integrated project models
have potential for enhancing information flow and improving situational awareness on
construction projects. They can also support identifying and removing work constraints in
coordination meetings, and measuring proactive metrics such as task maturity, Tasks
Anticipated (TA), or Tasks Made Ready (TMR). The current limitations in data collection and
analysis were discussed as well. Ongoing research is focused on addressing these limitations,
as well as conducting pilot projects that use the proposed procedures to create integrated
project models and validate their potential for smoothing information flow and improving
situational awareness on construction projects.

ACKNOWLEDGMENTS
This research is financially supported by the National Science Foundation (NSF) Grant CPS
#1446765. Any opinions, findings, and conclusions or recommendations expressed in this
material are those of the authors and do not necessarily reflect the views of the National
Science Foundation. The technical support of the industry partners in providing access to their
sites for data collection and assisting with progress monitoring analytics is also appreciated.

REFERENCES

Bae, H., Golparvar-Fard, M., and White, J. (2013). “High-precision vision-based mobile
augmented reality system for context-aware architectural, engineering, construction and
facility management (AEC/FM) applications.” Visualization in Eng, Springer, 1(1), 1-13.

Bae, H., Golparvar-Fard, M., and White, J. (2014). “Image-Based Localization and Content
Authoring in Structure-from-Motion Point Cloud Models for Real-Time Field Reporting
Applications.” Journal of Computing in Civil Engineering, DOI:
10.1061/(ASCE)CP.1943-5487.0000392, B4014008, 637–644.

Beetz, J., van Berlo, L., de Laat, R., and van den Helm, P. (2010). “bimserver.org – An Open
Source IFC Model Server.” Proceedings of the CIB W78 2010: 27th International
Conference, Cairo, Egypt, 16–18.

Bosché, F., Ahmed, M., Turkan, Y., Haas, C. T., and Haas, R. (2014). “The value of integrating
Scan-to-BIM and Scan-vs-BIM techniques for construction monitoring using laser scanning
and BIM: The case of cylindrical MEP components.” Automation in Construction, 49, 201-
213, Elsevier.

Brilakis, I., Fathi, H., and Rashidi, A. (2011). “Progressive 3D reconstruction of infrastructure
with videogrammetry.” Automation in Construction, Elsevier, 20(7), 884–895.

DJI Ground Station (2015). http://www.dji.com/info/releases/dji-released-2-4g-bluetooth-
datalink-ipad-ground-station, last accessed 5/18/2015.

Golparvar-Fard, M., Pena-Mora, F., and Savarese, S. (2009). “D4AR- A 4-Dimensional
Augmented Reality model for automating construction progress data collection, processing
and communication.” Journal of ITCON, 14(1), 129–153.

Golparvar-Fard, M., Peña-Mora, F., and Savarese, S. (2011). “Integrated Sequential As-Built
and As-Planned Representation with Tools in Support of Decision-Making Tasks in the
AEC/FM Industry.” Journal of Construction Engineering and Management, 137(12), 1–21.

Golparvar-Fard, M., Peña-Mora, F., and Savarese, S. (2012). “Automated Progress Monitoring
Using Unordered Daily Construction Photographs and IFC-Based Building Information
Models.” Journal of Computing in Civil Engineering, 10.1061/(ASCE)CP.1943-5487.0000205.

Hamzeh, F. R., Saab, I., Tommelein, I. D., and Ballard, G. (2015). “Understanding the role of
‘tasks anticipated’ in lookahead planning through simulation.” Automation in Construction,
49, Part A(0), 18–26.

Han, K., and Golparvar-Fard, M. (2014a). “Multi-Sample Image-Based Material Recognition and
Formalized Sequencing Knowledge for Operation-Level Construction Progress Monitoring.”
Computing in Civil and Building Engineering, I. Raymond and I. Flood, eds., American
Society of Civil Engineers, 364–372.

Han, K., and Golparvar-Fard, M. (2014b). “Multi-Sample Image-Based Material Recognition and
Formalized Sequencing Knowledge for Operation-Level Construction Progress Monitoring.”
International Conference on Computing in Civil and Building Engineering, 2014, 364–372.

Han, K., and Golparvar-Fard, M. (2015). “Appearance-based material classification for
monitoring of operation-level construction progress using 4D BIM and site photologs.”
Automation in Construction, Elsevier, 53, 44–57.

Han, K., Lin, J., and Golparvar-Fard, M. (2015). “Model-driven Collection of Visual Data using
UAVs for Automated Construction Progress Monitoring.” International Conference on
Computing in Civil and Building Engineering 2015, Austin, TX, June 21–23.

Karsch, K., Golparvar-Fard, M., and Forsyth, D. (2014). “ConstructAide: analyzing and
visualizing construction sites through photographs and building models.” ACM
Transactions on Graphics (TOG), ACM, 33(6), 176.

Lin, J., Han, K., Fukuchi, Y., Eda, M., and Golparvar-Fard, M. (2015b). “Model-Based Monitoring
of Work-in-Progress via Images Taken by Camera-Equipped UAV and BIM.” 2nd
International Conference on Civil and Building Engineering Informatics, Tokyo, Japan.

Golparvar-Fard, M., Peña-Mora, F., and Savarese, S. (2011). “Monitoring changes of 3D
building elements from unordered photo collections.” Computer Vision Workshops (ICCV
Workshops), 2011 IEEE International Conference on, 249–256.

Scheiblauer, C., Zimmermann, N., and Wimmer, M. (2015). “Workflow for Creating and
Rendering Huge Point Models.” Fundamentals of Virtual Archaeology: Theory and
Practice, A K Peters/CRC Press.

Siebert, S., and Teizer, J. (2014). “Mobile 3D mapping for surveying earthwork projects using an
Unmanned Aerial Vehicle (UAV) system.” Automation in Construction, Elsevier, 41, 1–14.

Yang, J., Park, M.-W., Vela, P. A., and Golparvar-Fard, M. (2015). “Construction performance
monitoring via still images, time-lapse photos, and video streams: Now, tomorrow, and the
future.” Advanced Engineering Informatics, Elsevier, 29(2), 211–224.

Zollmann, S., Hoppe, C., Kluckner, S., Poglitsch, C., Bischof, H., and Reitmayr, G. (2014).
“Augmented Reality for Construction Site Monitoring and Documentation.” Proceedings of
the IEEE, IEEE, 102(2), 137–154.

