INTRODUCTION
Research Tasks
The following research tasks were completed:

• Define the project success criteria
• Define the ideal facility life-cycle functions
• Develop structured questionnaire
• Select projects
• Collect data from sites
• Analyze data
• Develop guidelines for CPSFs
DEFINITION OF SUCCESS
Any study of success and failure in projects must before long decide what is meant by
success: for whom? using what criteria? and over what time period? It is
in this area that most previous studies presented a limited view of project
success. If success was defined, it was often unclear from whose perspective,
or at what point in the project life cycle it was measured.
The literature has given several definitions of success, each of which is
specific to a particular project. We believe that each project participant will
have their own viewpoint of success. Success for a given project participant
is defined here as the degree to which project goals and expectations are
met. These goals and expectations may include technical, financial, educational, social, and professional aspects. These expectations may differ among project participants, and are discussed below.
Success Criteria
Success criteria, or a person's definition of success as it relates to a building, often change from project to project depending on participants, scope of services, project size, sophistication of the owner related to the design of facilities, technological implications, and a variety of other factors. On the other hand, common threads relating to success criteria often develop not only within an individual project but also across the industry as we relate success to the perceptions and expectations of the owner, designer, or contractor.
Differences in a person's definition of success are often very evident.
To orient the researchers, lists of typical success criteria for the owner,
designer, and contractor were developed. Each list was developed by the
writers' reviewing the literature and then brainstorming and discussing suc-
cess criteria for the owners, designers, and contractors represented on the
project team. An unprioritized summary of these success criteria follows.
Owner's criteria for measuring success: on schedule; on budget; function
for intended use (satisfy users and customers); end result as envisioned;
quality (workmanship, products); aesthetically pleasing; return on invest-
ment (responsiveness to audiences); building must be marketable (image
and financial); and minimize aggravation in producing a building.
Designer's criteria for measuring success: satisfied client (obtain or de-
velop the potential to obtain repeat work); quality architectural product;
met design fee and profit goal; professional staff fulfillment (gain experience,
learn new skills); met project budget and schedule; marketable product/
process (selling tool, reputation with peers and clients); minimal construc-
tion problems (easy to operate, constructible design); no "ghosts," liability,
claims (building functions as intended); socially accepted (community re-
sponse); client pays (reliability); and well defined scope of work (contract
and scope and compensation match).
Contractor's criteria for measuring success: meet schedule (preconstruc-
tion, construction, design); profit; under budget (savings obtained for owner
and/or contractor); quality specification met or exceeded; no claims (owners,
subcontractors); safety; client satisfaction (personal relationships); good
subcontractor buy out; good direct communication (expectations of all par-
ties clearly defined); and minimal or no surprises during the project.
Common Criteria
While many criteria items or viewpoints are similar, there are several
distinctions that relate directly to the parties involved and the type of busi-
ness services they provide. For example, a priority item and one that appears
in all three lists (designer, owner, and contractor) in some form is the
financial reality of doing business. The owner wants the project completed
on time and on budget, and the designer and contractor both expect to meet
certain profit or fee goals. All three viewpoints also recognize the absence
of any legal claims or proceedings on a project as a desirable outcome. In
other words, this is a major criterion for measuring success. Another common
thread among the three groups involves meeting an appropriate schedule
as a way of measuring or determining if a project was successful.
Unique Criteria
It is also evident that there are some unique factors associated with each
of the three groups. The designer for instance is looking for a project that
will increase the level of professional development and professional satis-
faction among his employees. Safety is a high-priority issue for the contractor
that would not normally be an issue with the other two groups, because
their employees are at much less risk during the design or operation of a
building than the contractor's workers are during the construction of a
building. An owner is extremely interested in knowing that the building
project functions properly for the intended use and is free from long-term
defects or lingering maintenance problems.
As one would suspect, there is some variability even within the same firm
on the same project. The factors of importance range from meeting internal
budgets to professional satisfaction and on to producing a job that will help
the firm obtain repeat business or serve as a marketing tool for similar
projects with different clients. For example, two designers working on the
same project may view success differently. An experienced designer serving
as a project engineer may be concerned about meeting internal budget
criteria as well as meeting the client's needs. A less-experienced designer
working at a lower level of responsibility may consider the opportunity to
gain valuable design experience as a success criterion and be less concerned
about meeting the internal budget.
No single list will ever be totally comprehensive when it comes to a
definition of success for a project. The criteria developed for use with the
CPSF project do give a general overall impression of each of the three groups' viewpoints. They determine the "envelope" of ideas that are used to
evaluate success.
Those few things that must go well to ensure success for a manager or
organization, and, therefore, they represent those managerial or
enterprise areas that must be given special and continual attention to
bring about high performance. Critical Success Factors include issues
vital to an organization's current operating activities and its future
success (Boynton and Zmud 1984).
They are events or circumstances that require the special attention of
management because of their significance to the corporation. They
may be internal or external and be positive or negative in their impact.
Their essential character is the presence of a need for special awareness . . .
FIG. 1. Product Control Elements and Their Relationships to Five Major Functions
in Model
FIG. 2. Process Control Elements and Their Relationships to Five Major Functions in Model
RESEARCH METHODS
Questionnaire
A questionnaire was developed to facilitate data collection by the re-
searchers and to ensure consistency in the elements examined. The ques-
tionnaire (see a sample page in Fig. 4) was separated by each major function,
viz., management, planning, design, construction, and operation. Each of
these sets of questions has two parts. The first part determines the closeness
of fit of the functions as performed on the project with the ideal or modeled
functions just defined. There are nine major categories of questions. The
second part determines the degree of success that the responding participant
experienced on the given project.
The questions are thus classified by function and not by the people who
carry them out. For each project, the people who perform these functions
will be different depending on the contract type, the phase of the project,
local regulations and so on. Therefore, the same set of questions may be
directed to more than one person for each project. The data will be collected
from the individuals who perform the five basic functions of the model.
The questions are nonleading and are structured so that interviewees are not biased and can answer freely, given the relatively short list of
questions. Each question is accompanied by a set of items that must be
examined in the answers. These items are based on the literature search
and are intended to aid in refining the replies and comparing the information
obtained from different interviewees.
Selection of Projects
To analyze the data meaningfully, the writers carefully selected eight pairs
of projects. The two projects in each pair were similar in scope and proposed
by the same sponsor. One project was successful in the eyes of the sponsor
and the second was less successful. To compare projects in a given field,
three categories were established. These categories all included projects in
which there were significant M, P, D, C, and O functions to be performed.
An ideal project fell in the range of 100,000-200,000 sq ft (9,467-18,935
m2) in area, but some projects were larger. The categories selected were:
(1) Owner-occupied, lease-equivalent space (O), e.g. office and retail build-
ings; (2) high-tech, high-reliability buildings (H), e.g. lab facilities and hos-
pitals; (3) multiphase projects in which the occupant is in the building (R),
e.g. renovation work and multiphase buildings.
A summary of the projects together with their reference numbers is as
follows.
• Sponsor 1—Projects H1, H2, R1, R2
• Sponsor 2—Projects O1, O2, H3, H4
• Sponsor 3—Projects O3, O4, H5, H6
• Sponsor 4—Projects H7, H8
• Company 5—Projects O5, O6
Note that sponsor four was only able to provide two sites. The researchers
Data-Reduction Procedure
The data collected from each project were recorded directly on the ques-
tionnaire. These data were reduced onto the data analysis sheet (Figs. 5
and 6). Given this sheet, two data descriptions were prepared.
The first type was a verbal description of the information on each project
provided under the following headings: facts, functions, and lessons learned.
The facts section defined the project background and description of the
building, the key players and contracts, and cost and schedule data. The
functions section defined how the facility team, contracts, facility experi-
ence, resources, product information, optimization information, and per-
formance information were performed or achieved. The final section listed
the lessons learned for each principal player.
A second type of data reduction was the conversion of the questionnaire
data to a numerical system. This is presented in detail later in this paper.
Sample results of the textual information follow.
Project Number O1
Facts
Background and description: New operations center, data processing cen-
ter [400,000 sq ft (37,870 m2)] built in northeastern U.S., built on a new
site. Raised floor, modular electrical wiring system, uninterruptible power
supply (UPS), emergency power, halon systems, universal wiring system,
access floor, security systems, sound masking, heat recovery.
Functions
Facility team: The owner had little experience and selected a project
developer to assist him in this task. A participative management style was
employed, which ensured a good team. The construction manager (CM)
was hired at the same time as the architect.
Contracts: The architect and design team had fixed-fee contracts. The
fee was tied to the principals of the key designers' staying with the company
for the duration of the project. The contracts all allowed the participants
to function as a team.
Experience: The developer had experience in this area, but it was the
largest such facility done by them. The mechanical/electrical designer had
had a very similar project before this one. The design team had experience
working together.
Resources: The CM's previous job suffered from insufficient staff. On
this project the project executive made sure that the staffing level was
correct.
Product information: The architect employed a team-building program
to develop a common frame of reference, program, and philosophy on the
project. The coordination using an overlay drafting system on the project
was unsatisfactory. The contractor lost momentum due to frequent changes.
Optimization information: The contractor provided good value engi-
neering (VE) input. The team had to work to change the philosophy of the
operator.
Performance information: The user's chief executive officer (CEO) was
involved in weekly reviews. Good control.
External constraints: Unusual spells of dry and wet weather and sinkholes
constrained the contractor. The owner's home office staff changed their
minds on the size of the facility. The windows that were supplied did not
meet the design specification.
Product: Good.
Success: Generally the most successful job in all the companies.
Lessons Learned
Owner: Remove weak team members as soon as possible. Clearly state
all assumptions in the design phase and ensure that the client knows and
understands what they are. Hold multiple contracts with all key designers;
don't have one contract with an architect only.
Architect: Educate the owner and community on what quality of building
they get for their funds. Take tours of similar facilities to communicate the
space and quality of finishes.
Engineers: Spend the time to get to know the project manager. Spend
time up front developing the team.
Contractor: Used the same organization and staffing levels on subsequent
projects successfully.
Project Number O2
Facts
Background and description: New corporate office headquarters [110,000
sq ft (10,414 m2)] built in the northeastern U.S. on an urban restricted site.
Structural steel frame, self-contained air conditioning (AC) with roof-mounted
dry cooler, and automated control system. Masonry walls with brick facade.
Key players: Project managed by owner and consultant project manager;
design by architect with specialty subs (negotiated lump sum); construction
manager (cost plus fixed fee with a guaranteed maximum price, or GMP); in-house operator.
Design start: 4/86.
Construction: 9/88 to 10/89.
Final construction cost at turnover: $12 million.
Functions
Facility team: The owner brought in a project manager to assist them.
The project manager had no authority, and dragged out key project deci-
sions. There was a large amount of friction between the project manager,
architect, and mechanical designer. The project management team changed
during the project. The new project manager was abusive, and the team
could not function well.
Contracts: The architect's and the mechanical designer's fees were cut
drastically by the owner. A GMP construction contract was developed with
very poor documentation.
Experience: The owner had no experience and the project manager had
very little experience. The operator had absolutely no experience, and at
the time of the site visit, the project had its third operator.
Resources: Available.
Product information: The owner had many changes of mind because the
large committee representing the owner later in the project had been left
out in the earlier phases. The architect had to revise the program done by
another architectural firm. The GMP documents were incomplete, but con-
struction documents were fairly good.
Optimization information: The owner was brought in too late to the
process and made scope changes. The constructability input by the contractor
was poor. It was hypothesized that the contractor had a vested interest in
keeping the budget low until they were required to guarantee the price. No
operability input was received.
Performance information: The absence of quality control was the only
deficiency in this category.
External constraints: The site size and height restrictions severely limited
the building. Severe changes from the owner's home office impacted the
project.
Product: Poor site and poor operations of facility.
Success: Average to poor.
Lessons Learned
Owner: Hard to work with an architect located off site. Must have a good
on-site project manager. Must have high-level people involved in the proj-
ect. A precise delegation of authority is essential.
Architect: The owner's decision-making cycle must be quick and respon-
sive. Resist a project manager and work for structured owners. Pay the fee
requested.
Engineer: Insist on user-groups involvement during the design phase.
Contractor: Evaluate subcontractors and material suppliers at the end of
the project. Panelize as much millwork as possible on projects of this type.
NUMERICAL RESULTS
The data obtained from the questions are transferred onto a data analysis
sheet (Figs. 5 and 6) for each project. The data analysis sheet requires the
researcher to evaluate the quality of each major element being performed
for each of the five project participants identified.
Thus, each subcategory, e.g. facility team, has five possible elements.
These elements represent the teams provided to manage, plan, design,
construct, and operate the facility. To rank the quality and timeliness of
the element associated with a particular project function, there must be
agreement from the parties questioned and physical evidence, where pos-
sible, regarding the existence, timeliness, and quality of the subject subele-
ment. For example, in evaluating the planning information or program, the
researcher would check the response of the planner and designer, since the
program flows from the planner to the designer.
Fig. 7 provides a numerical representation of the data collected from the
projects. The top section of the figure indicates the closeness of fit to the
ideal situation predicted by the model; the lower section represents the
participants' views of project success. The product category indicates par-
ticipants' views of the building end product as a physical item; the success
indicators define the broader project issues such as profit, meeting objec-
tives, and reputations.
The values for each subcategory in the left-hand column of the upper
section of Fig. 7 are derived as follows. The manager, planner, designer,
contractor, and operator are all ranked as either 0 or 1 for each element of
the subcategory. A 0 means that everything was performed correctly; a 1
indicates that this function was below average or not performed. The num-
bers are then added for each subcategory to give the indicated score (shown
in Fig. 7 as a single entry). A low score of 0 indicates good performance;
a high score of 5 indicates poor performance. The variable at the bottom
of the upper section, "CFIT," indicates the closeness of fit to the modeled
functions for the whole project. A low score is 0 (excellent) and the maximum score is 35 (worst case).

FIG. 7. [Tabular data for the 16 projects: closeness-of-fit scores by subcategory, with separate External and Products entries; success ratings for Manage, Plan, Design, Construct, and Operate; and the derived SUM ALL, NO ALL, SUM MDC, SALL, and SMDC values]
The calculation of CFIT did not include data from the two question
subcategories "External" (constraints) and "Products," which are included
as two separate line items. External constraints describe factors beyond the
control of the project team, not functions performed. Products describe the
satisfaction of the respective players with the facility. These two were in-
cluded in the questions to aggregate these comments and separate them
from the required subcategory data.
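To make the scoring arithmetic concrete, the following minimal sketch reproduces the aggregation just described. It is not part of the original study: the Python representation, the variable names, and the 0/1 ratings shown are illustrative assumptions; only the seven subcategory names, the five functions, and the scoring rules come from the text above.

```python
# Minimal sketch (not from the original paper) of the CFIT aggregation
# described above. Each of the five functions is rated 0 (performed
# correctly) or 1 (below average or not performed) for every function
# subcategory; the ratings below are hypothetical.

FUNCTIONS = ["manage", "plan", "design", "construct", "operate"]

# Hypothetical 0/1 ratings for the seven function subcategories.
ratings = {
    "facility team": {"manage": 0, "plan": 0, "design": 1, "construct": 0, "operate": 1},
    "contracts, changes, and obligations": {"manage": 0, "plan": 1, "design": 0, "construct": 0, "operate": 0},
    "facility experience": {"manage": 0, "plan": 0, "design": 0, "construct": 1, "operate": 1},
    "resources": {"manage": 0, "plan": 0, "design": 0, "construct": 0, "operate": 0},
    "product information": {"manage": 1, "plan": 0, "design": 1, "construct": 0, "operate": 0},
    "optimization information": {"manage": 0, "plan": 0, "design": 0, "construct": 1, "operate": 1},
    "performance information": {"manage": 0, "plan": 0, "design": 0, "construct": 0, "operate": 0},
}

# Subcategory score: 0 (all five functions performed well) to 5 (all poor).
subcategory_scores = {name: sum(row[f] for f in FUNCTIONS) for name, row in ratings.items()}

# CFIT sums the seven subcategory scores: 0 is excellent, 35 is the worst
# case. External constraints and Products are excluded, as noted above.
cfit = sum(subcategory_scores.values())

print(subcategory_scores)
print("CFIT =", cfit)  # 9 for these hypothetical ratings
```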
The lower section of Fig. 7, titled "SUCCESS," measures the participant's
success ranking of the project. The ranking scale for success is as follows:
1 = very unsuccessful; 2 = poor; 3 = average; 4 = better than average;
and 5 = outstanding. In some cases success rankings could not be collected,
and these fields in the figure are left blank.
The numbers in the row "SALL" are a weighted percentage rating derived from the success ratings. The second number, "SMDC," is developed from three equally weighted scores: a combined manage and operate score, a combined planning and design score, and a construction score. In both cases ("SALL" and "SMDC"), 1.0 indicates a very successful job, and 0 indicates a very unsuccessful project.
SALL and SMDC are calculated as follows:
$$\text{SALL} = \frac{\text{SUM ALL}}{\text{NO ALL} \times 5} \tag{1}$$

$$\text{SMDC} = \frac{\text{SUM MDC}}{3 \times 5} \tag{2}$$
where NO ALL = number of success responses used in the calculation.
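As a worked example of (1) and (2), consider the values recorded in the first data column of Fig. 7: three success responses (NO ALL = 3) summing to SUM ALL = 12, and combined MDC scores summing to SUM MDC = 12. Then

$$\text{SALL} = \frac{12}{3 \times 5} = 0.80, \qquad \text{SMDC} = \frac{12}{3 \times 5} = 0.80$$

which match the SALL and SMDC entries reported for that project.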
ANALYSIS OF RESULTS
Several types of analysis are discussed. First, projects are grouped into
pairs of similar buildings. Both projects in each pair were proposed by a common sponsor company. For each grouping of projects, the following steps were performed:
1. Rank the closeness of fit (CFIT) of each project in each group (see
"CRANK," in Fig. 8).
2. Rank the success (SMDC) of each project in each group (see "SRANK,"
in Fig. 8).
3. Compute Spearman's rho between the rankings from steps 1 and 2 (see the formula below).
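The rank-correlation formula itself is not restated in the text; in its standard rank-difference form, Spearman's rho is

$$\rho = 1 - \frac{6 \sum_{i=1}^{n} d_i^2}{n\,(n^2 - 1)}$$

where d_i is the difference between the CFIT rank and the SMDC rank of project i, and n is the number of projects in the group. As an illustrative check against Fig. 8, a four-project group whose rank differences are 1, 0, -1, and 0 gives a squared sum of 2, so rho = 1 - 12/60 = 0.80, consistent with the lower of the by-sponsor rho values reported there.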
FIG. 8. [Tabular data: CFIT rankings (CRANK) and success rankings (SRANK) of the 16 projects by pair, sponsor, building type, and overall, together with rank differences (d) and Spearman's rho values: approximately 0.8 to 0.9 by sponsor; 0.94 for type H, 0.94 for type O, and 1.0 for type R; and 0.91 for all projects]
Pairwise Analysis
In this section, projects are grouped into pairs of similar buildings. Both
projects in a pair were proposed by a common sponsor company. The results
are analyzed to determine whether projects that were closer to the modeled
structure were more successful than those that were further from the model.
The pairs selected for comparison were H1 and H2; R1 and R2; O1 and O2; H3 and H4; O3 and O4; H5 and H6; O5 and O6; and H7 and H8.
"CRANK PAIR" and "SRANK PAIR" indicate the ranking of CFIT and
SMDC, respectively, for each pair. In all cases, the project that had a higher
score for CFIT (see Fig. 8) was the poorer performer as indicated by SMDC.
The rho for each pair was 1.0, indicating perfect correlation.
Analysis by Sponsor
The following five groups of projects with the same sponsor were ana-
lyzed.
COMMENT ON RESULTS
All four methods of analysis showed that projects closer to the modeled
structure or those where the elements of the function subcategories were
properly performed enjoyed better success for all parties than those projects
further from ideal. This held true for project pairs, those proposed by the
same sponsor, those in the same industry market segment, and for all proj-
ects analyzed together. This correlation indicates that the seven function
subcategories and their 35 elements are indeed project success factors.
When examining Fig. 8, the reader can infer the relative importance of
the success factors. The facility team; contracts, changes, and obligations;
facility experience; and optimization information varied significantly be-
tween successful and unsuccessful projects. These four factors can then be
deemed the critical project success factors (CPSFs).
One factor, the product information (i.e., design documents and the
CONCLUSIONS
The initial objectives of this research were to define the critical factors
that lead to project success and provide a forecasting tool to enable parties
to rapidly assess the possibility of a successful project from their viewpoint.
These general objectives were met through the accomplishments of the research. More importantly, a list of specific factors was identified as critical
to the success of projects.
The topic of this research required a critical look at projects that were deemed less successful by their nominators, thus creating the potential for embarrassment to those involved or noncooperation with the research. The research team was able to avoid this danger, a result made possible by the excellent cooperation and support provided by the Consortium for the Advancement of Building Sciences (CABS) members. Other participants in the projects also displayed a remarkable spirit of cooperation and genuine interest in the successful outcome of the research. Under these conditions the writers were able to bring the project to a successful conclusion.
The research team set out to study CPSFs applicable to building projects,
based on established theoretical work in construction process modeling. The
research was successful in identifying a number of CPSFs based on extensive
field research on 16 selected construction projects. Clearly, some questions
await further study. Although more projects are necessary to provide con-
clusive evidence, this work provided significant insight into the essential
elements of project success.
ACKNOWLEDGMENTS
APPENDIX. REFERENCES