Assessment of Building Performance in Use: The Probe Process
by
Robert Cohen, Halcrow Gilbert
Mark Standeven, Energy for Sustainable Development
Bill Bordass, William Bordass Associates
Adrian Leaman, Building Use Studies
Introduction
In use, buildings do not always work as intended. Some features perform better, some worse,
some differently. Chronic low-level problems persist. Surprising things happen which nobody
anticipated. Many recent buildings and their services demand high levels of support and
vigilance, which are not always within the capacities of the occupiers. Others are more robust.
To improve building performance overall in a changing market, the industry and its clients need
to identify opportunities and pitfalls through rapid feedback. Unfortunately, this seldom
happens; and when it does, much of the feedback is confidential, at best made available
unattributably or anecdotally. In the UK, the Probe (Post-occupancy Review Of Buildings
and their Engineering) research project has attempted to improve the openness of this feedback by
undertaking post-occupancy evaluations (POEs) of recently-completed non-domestic buildings
and reporting them in Building Services Journal (BSJ), the journal of CIBSE, the Chartered
Institution of Building Services Engineers. The project started in 1995, and fifteen surveys
have been published so far. The results and implications of these are described elsewhere
[Probe 2000 b to f].
Probe’s main purpose was feedback to building services engineers of generic and specific
information on factors for success, and areas of difficulty and disappointment. The results
have proved of wider value, not only for designers but for example in helping commissioning
clients in preparing briefs and identifying issues which require continued review throughout the
procurement process. Building occupiers will also be interested in what to look for and ask
about in new premises; and strategies for operating and improving existing ones.
Background to Probe
The 1980s were a time of major change for buildings. In particular:
• Pressures from competition, government and clients were forcing the design and
production side of the industry to improve speed and quality and reduce costs.
• Buildings and the building industry had to adapt to the needs of information technology.
• Buildings were used more intensively and changeably, and so could also require (or at
least be felt to require) more air conditioning.
• The facilities management profession began to establish itself.
• Falling fuel prices reduced the importance of energy costs for occupiers, but
• a growing concern for all aspects of environmental performance also kept energy
efficiency on the design agenda.
Buildings in use also began to be studied more consistently, largely as single-issue studies,
particularly energy performance; internal environment; and occupant satisfaction. Major gaps
were often found between client and design expectations and achieved performance. Building-
related ill-health was studied and the term “sick building syndrome” coined.
Draft BRI paper on Probe process
In the early 1990s the industry was tackling these new agendas, for example using:
• computer models and advanced natural ventilation (ANV designed using simulation
methods and often operated automatically – at least to some extent) to avoid air-
conditioning;
• building fabric and glazing systems with better thermal and daylight performance;
• more efficient air-conditioning (AC), e.g. with displacement ventilation and static
cooling;
• mixed-mode (MM) solutions, combining natural with mechanical ventilation and cooling;
• a variety of plant efficiency and control improvements.
Clients were also broadening their interests: indoor environments, occupant health, productivity
of staff, and higher level strategic business issues. However, gaps - sometimes large -
between expectation and achieved performance were being reported. Hence the need for better
feedback to consolidate the benefits and to identify and correct any problems.
A proposal was submitted, in which the government would pay for post-occupancy
investigations of technical and energy performance and occupant satisfaction in eight recently-
completed buildings which had been featured in BSJ. This contribution was matched by BSJ’s
editorial and publication costs. Additional input to the project was obtained by in-kind
contributions of time by the Probe survey team and, of course, the time spent by the building
occupiers and their advisers in supporting the surveys.
The proposal was successful, demonstrating how this new approach to government funding
could stimulate and support innovations in real-world research and dissemination. The project
started in mid-1995, with the following team of investigators:
• Halcrow Gilbert, who were heavily involved in applying advanced design tools to the
design of low-energy buildings.
• Building Use Studies Ltd (BUS), who monitor occupant satisfaction with buildings
through questionnaire and interview techniques.
• William Bordass Associates (WBA), who undertake and manage investigations into
technical and energy performance of buildings, and use the findings in project-specific
advice to clients and design teams, and in published guidance material.
In early 1997, funding was obtained for Probe 2. This had a broader scope, including
procurement issues, water economy, and a pressure test for airtightness. Following reader
demands for faster feedback, Probe 2 aimed to look at buildings after two years of operation
(earlier than this, the buildings, their occupants and their management had not yet settled down; and a
history of energy consumption was not available). Probe 2 included another eight buildings:
three offices (one AC, one MM and one NV), a naturally-ventilated (NV) and two MM
educational buildings (one with some ANV), a MM courthouse and a NV warehouse. Table 1
lists all the buildings studied in Probes 1 and 2, with their principal characteristics and the dates
of the articles in BSJ. The results of these surveys are outlined in [Probe 1999 b to d].
Survey methods
Probe has shown that it is possible to undertake an extended series of post-occupancy surveys
of named buildings and to publish the results in a technical journal. To our knowledge, this
had not been done before, having been regarded as too difficult and too risky. In the UK,
Probe has contributed to a better understanding of how buildings perform and to
greater openness in discussing where problems are and what measures are needed. In this way
it supports current initiatives by government, industry and clients intended to bring about
substantial improvements to the performance of the building industry and its products [DETR
1998b].
To produce reports of sufficient rigour and credibility, the methods employed needed to be
standardised, and to use existing techniques and benchmarks where possible. The backbone of
Probe has been two established tools:
• The occupant survey method developed by Building Use Studies (BUS) to gauge
occupant satisfaction with the building and its internal conditions.
• A prototype of the Energy Assessment and Reporting Method’s (EARM™) Office
Assessment Method (OAM) for the analysis of energy use. The Probe team assisted its
development and testing. A spreadsheet version with accompanying guidance has
recently been published by [CIBSE 1999].
In the course of Probe, a comprehensive five-page standard pre-visit questionnaire (PVQ) was
also developed to help identify and collect initial information on the building and its services,
occupancy, usage and management. This improved the effectiveness of the first
visit, and focused the host’s mind on the Probe team’s activities and information requirements.
The occupant questionnaire was typically issued to a sample of 100-125 occupants; or to all occupants in
the smaller buildings. Sometimes a shorter secondary questionnaire was also given to specialist
user groups, typically students in an educational building. Staff focus groups and meetings
with management were also held when opportunities arose.
A relatively small core data set releases more time for managing the wider data set, making
benchmarking achievable. Nevertheless, there is still a prodigious amount of potential
information. As more buildings are added to the dataset, the burden of data management and
quality control increases. However, the larger knowledge base makes the information gained
more valuable, so early trade-offs between "must have" and "nice to know" questions become
even more important. The decision to use tried-and-tested methods to collect both energy and
occupant data, and not to alter these techniques greatly as the study progressed, was a major
reason for Probe's eventual success, especially in benchmarking all-round performance.
To provide statistical snapshots of occupant responses, Probe uses two summary indexes:
• One based on comfort, see figure 1, with scores for summer and winter temperature and
air quality, lighting, noise and overall comfort.
• One on satisfaction (not shown), using ratings for design, needs, productivity and health.
These indexes are usually the first step in presenting results on a particular building. For
example, buildings may score highly for satisfaction but less well for comfort; or well on both.
Scores based on the averages of occupant responses to a particular question in each building
can also be presented on graphs with their benchmarks (here averages based on a dataset made
up from the last 50 buildings surveyed by BUS). Confidence intervals can be shown to
emphasise that they are based on sample statistics, and so subject to variations owing to sample
size, variance of responses and random fluctuations. This permits visual checks of whether the
average perception of a building differs significantly (in a statistical sense) from the benchmark,
the scale midpoint or another building. For example, figure 2 shows ratings for glare from sun
and sky for one Probe building; and figure 3 the statistics for each of the Probe buildings,
together with the benchmarks from the BUS dataset:
• If the range for a particular building is intersected by the line for the benchmark mean,
then occupant perceptions of that variable in that building are not significantly different
from the benchmark (e.g. Building 10).
• If the scale midpoint intersects the range, then the building is not significantly different
from the scale midpoint (e.g. Building 8).
• If a mean for a particular building intersects the range for another, then the buildings are
not different from each other (e.g. Buildings 4 and 7).
• If the range is large, one needs to look at the distribution of individual scores to check why.
For example, people near windows may be significantly happier than those further away;
and top floors of buildings with open atria are often hotter.
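The significance checks in the list above can be sketched in code. The following Python fragment is purely illustrative: the scores, the benchmark value and the interval method (a simple normal approximation on the sample mean) are invented assumptions for this paper, not BUS's actual procedure.

```python
# Hedged sketch of the confidence-interval comparisons described above.
# All scores and reference values are invented; 7-point scale assumed.
import math

def mean_ci(scores, z=1.96):
    """Return (mean, half-width) of an approximate 95% confidence interval."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)  # sample variance
    return mean, z * math.sqrt(var / n)

def differs_from(scores, reference):
    """True if `reference` lies outside the interval, i.e. the building's
    average perception differs significantly (statistically) from it."""
    mean, half = mean_ci(scores)
    return abs(mean - reference) > half

glare_scores = [3, 4, 4, 5, 3, 4, 2, 4, 5, 4, 3, 4]  # invented responses
benchmark_mean = 3.0   # invented benchmark from a wider dataset
scale_midpoint = 4.0
print(differs_from(glare_scores, benchmark_mean))  # vs benchmark
print(differs_from(glare_scores, scale_midpoint))  # vs scale midpoint
```

With larger samples the interval narrows, which is one reason high response rates matter when comparing a building against its benchmark.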
Further details of the survey statistics can be examined on the website usablebuildings.co.uk.
In buildings, physical design issues and human and management issues are inextricably linked:
complete separation of influencing factors is not possible. This can be compensated for to some
extent by reporting results in ways that permit readers to identify particular contextual factors.
For example, most occupants prefer to have their own offices, because of the additional privacy
and control that this gives, so high scores are easier to attain in these contexts. Conversely,
occupants who are less happy with their work, managers and colleagues may project their
dissatisfaction onto the environment and facilities; and use complaints about the physical
environment as a risk-free way of protesting about management.
The energy survey method
Energy data had to be collected and reviewed without comprehensive monitoring; although
independently monitored data was available for three of the buildings. Probe therefore intended
to use a method which had been developed by WBA when undertaking energy surveys in
offices [EEO 1994]. This was based on WBA’s own electricity consumption estimation and
reconciliation procedure, together with the London Energy Group’s energy reporting format
[London Energy Group 1983], both implemented on Excel spreadsheets.
As Probe started, DETR’s Energy Assessment and Reporting Methodology (EARM™) project
was just finishing. Amongst its products was a paper-based prototype Office Assessment
Method (OAM) which incorporated elements of the WBA procedure. Probe 1 obtained
additional funding from DETR to test this method on the four office buildings surveyed. With
ad hoc modifications, the OAM approach was also used for other building types.
EARM-OAM is an iterative technique which allows one to get the best possible result within the
time available. It helps the user to seek the most significant items of missing data to bring the
picture into focus. It allows the energy performance of a building to be understood in terms of
building type, fabric, systems, occupancy and management; and includes a spreadsheet which
provides instant reconciliation between survey estimates and metered data.
OAM was developed to resolve problems which had been identified during research into the
energy assessment of buildings:
• Critical data, especially fuel consumption and the factors used for normalisation (most
importantly the floor area) were often of dubious quality. This is resolved by careful
definitions of critical data items, specifying preferred sources for the data, and including
simple data quality checks. The quality checks allow the user to carry on even if the
preferred data sources are not available: initial research showed that procedures which
insisted too specifically on data in one particular form before they provided any output
tended to put people off. Instead, the result is “flagged” as being of reduced accuracy. If
better data then becomes available, or if further investigation is justified, the user may
then return to make improvements.
• Many existing methods are either not precise enough, not sufficiently relevant, or too
demanding and time-consuming. The OAM therefore allows progressively detailed
assessments and helps the user to decide whether the conclusions are sufficiently reliable
and what further work may be required.
• The data can be re-visited by others at a later stage for review and updating.
The OAM makes extensive use of related energy performance indicators for offices from
Energy Consumption Guide 19 [DETR 1998a]. It also allows unusual features or usage to be
separately identified, helping one to move away from standard benchmarks to the most
important issues for a particular building.
• Stage 3 looks at the finer details. A key tool is an electrical consumption estimation
spreadsheet, which undertakes “load x hours” calculations by individual items, areas, or
subsystems, and reconciles this with metering data; including if necessary and where
available day/night, summer/winter and weekday/weekend load patterns. The results can
then be related to subsystem benchmarks, also making allowances for special cases.
At all stages the approach is also iterative.
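The "load x hours" reconciliation at the heart of Stage 3 can be sketched as follows. The fragment is a minimal illustration of the principle only: the inventory items, loads, hours and metered total are invented, not data from any Probe building or from the OAM spreadsheet itself.

```python
# Minimal sketch of a "load x hours" estimate reconciled with metered data.
# All figures are invented for illustration.

inventory = [
    # (end use, installed load in kW, estimated annual hours of use)
    ("lighting",         45.0, 3000),
    ("office equipment", 30.0, 2500),
    ("fans and pumps",   20.0, 4000),
    ("catering",         10.0, 1200),
]

estimates_kwh = {item: kw * hours for item, kw, hours in inventory}
estimated_total = sum(estimates_kwh.values())

metered_total = 310_000  # invented annual total from electricity invoices, kWh

# A large gap flags items to revisit (loads, hours, or missing end uses)
# before the next iteration of the estimate.
gap = metered_total - estimated_total
print(f"estimated {estimated_total:.0f} kWh, metered {metered_total} kWh, "
      f"gap {100 * gap / metered_total:.1f}%")
```

In practice day/night, summer/winter and weekday/weekend load patterns, where available, constrain the estimates much more tightly than an annual total alone.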
OAM proved valuable in improving the speed, accuracy and consistency of the energy
assessments. Although survey time was limited, it was essential to review performance against
benchmarks for individual end-uses, so all Probe analyses proceeded into Stage 3. The paper
version of OAM tested in Probe 1 proved too time-consuming, requiring frustrating double- or
triple-entry of some data items, and was not efficient if figures subsequently changed to the
degree that the earlier stages had to be revisited. Following the pilot tests, the Probe team
found it most efficient to jump straight to the Stage 3 spreadsheet, which allows a model of
electricity consumption by end uses to be built up and reconciled with fuel bill data.
As Probe 2 was coming to an end, the authors of CIBSE TM22 gave the Probe team evaluation
copies of the Excel Workbooks on which the whole of the OAM is now implemented. These
were much quicker and more convenient to use than the former partly paper-based version.
They:
• avoid multiple entry of input data and laborious recalculation;
• provide inbuilt checks of data quality, consistency and orders of magnitude;
• show automatic numerical and graphical comparisons with selected benchmarks;
• generate “tree diagram” benchmarks [Field et al 1997] automatically for system capacity,
efficiency, operating hours and control and management factors; and
• keep track of special features and reduced accuracy items.
The workbooks also provide a clearer and more complete record of the work undertaken. This
approach can greatly improve the language of energy target-setting, assessment and reporting at
all stages of a project from briefing onwards. The technique, together with the BUS occupant
survey has also been successfully piloted on a one-visit survey to a large office of a financial
services company in the Netherlands.
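The tree-diagram factorisation mentioned above can be illustrated with a short sketch: an end use's annual consumption is broken down into installed load, hours of use and a control and management factor. The function and all figures below are invented for illustration in the spirit of the published technique, not taken from it.

```python
# Hypothetical factorisation in the spirit of the tree-diagram benchmarks:
# annual kWh/m2 = installed load (W/m2) x annual hours of use
#                 x control & management factor / 1000
def end_use_kwh_per_m2(load_w_m2, hours, management_factor):
    """Annual consumption of one end use, per square metre."""
    return load_w_m2 * hours * management_factor / 1000

# Invented lighting example: 12 W/m2 installed, 2500 h/yr of occupancy,
# with poor switching pushing effective use 20% above occupied hours.
print(end_use_kwh_per_m2(12, 2500, 1.2))  # kWh/m2 per year
```

Separating the factors in this way shows whether a high figure stems from over-sized installed capacity, long hours, or poor control, each of which points to a different remedy.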
Some buildings have required an additional short preliminary visit to make contact, collect key
facts, check metering arrangements, and review whether a Probe is practical from both the
team’s and the host’s point of view. Others have needed an extra final visit to review findings
personally with the hosts and/or the designers; to discuss inconsistencies and uncertainties
which have arisen in preparing the final report; and to collect additional information.
Figure 6 shows the ten key stages of the process, with typical timing. It typically takes two
months from the first technical survey visit to completion of the technical report, plus another
month for editing, review and publication. The central seven stages are chronological. The
timing of the other three can vary:
• Stage 6, the occupant survey. BUS does this either at or before the second visit. If
before, it raises useful questions to be pursued at the visit. If at the same time, it permits
more fruitful discussion during the visit but reduces the time for technical survey work.
• Stage 7, the energy analysis, continues throughout the project. Results come gradually
into focus as more information is collected and questions are asked.
• Stage 8, the pressure test. BRE or BSRIA normally does this once it is clear that the
Probe will be proceeding to publication; and usually shortly after the second technical
visit, when the Probe team has confirmed arrangements with the host.
For design information, the initial BSJ articles gave valuable background descriptions, plus
tables of key statistics. For buildings without such published information, data gathering will
take longer, unless the occupier can themselves pull together a good package of information for
the survey team. An additional preliminary site visit may then be required; plus contact with the
designers to obtain design intentions, technical, cost and procurement details, and any unusual
requirements or difficulties. This may be difficult: for several Probe buildings, the individuals
involved had already become widely dispersed and information archived.
Availability of design, operational and energy information varied greatly. In some buildings,
particularly the large air-conditioned offices, the management had facts at its fingertips. Many,
however, had poorer access to information. In practice, it was rare to receive a fully-completed
PVQ, fuel bills or floor plans before the first visit. However, even a partially-completed PVQ
helped to prepare the occupier for the demands of the survey; to structure the first interview;
and to smooth the process of identifying and obtaining missing items.
Spot measurements are taken where possible. Basic equipment always carried includes a true
power meter, light meter, temperature/RH indicator, smoke pencils and a camera. Hot wire and
vane anemometers, moisture meters, and electrical demand profile recorders are also used.
Temperature and RH logging has not been undertaken: in five of the buildings good
information was available from site management or through independent monitoring. In others
BMS records were helpful but often patchy, with little systematic trend logging.
Spot measurements also provide informal opportunities for talking to staff. For example,
taking an illuminance level leads to a conversation about lighting and glare, and quickly on to
other things, perhaps erratic control of the air-conditioning or the difficulties in getting lunch on
a business park. Such discussions have often opened-up new avenues of investigation.
Hosts are constrained by the pressures of their jobs, and sometimes emergencies lead to
delayed or curtailed meetings, or escorts who run off in response to emergency call-outs. A
degree of opportunism is therefore necessary to extract the most relevant information in the time
available: for example visiting the accounts department to find fuel bills; or studying plans,
specifications and manuals when one had intended to survey the plant room.
Initially the team tried to collect data on a laptop computer. This proved unsuitable, as
information on the building, its use and its systems arrived so rapidly and from so many
directions that any attempt to force it into a standard form took longer and interrupted the free
flow of the host’s comments. However, computer data collection may now be more practical
with today’s more powerful hardware and software, including the EARM-OAM workbook.
In this stage (which typically takes four weeks including waiting time for requested data) the
text of the final report is also drafted. This draft helps to draw attention to gaps in the verbal
and numerical arguments; generate checklist items for the second visit; and provide valuable
briefing for Probe team members who have not visited the building - in particular those who
will be involved in the second site visit and the occupant survey.
The visit gives the survey team the chance to take a fresh look at the building; to discuss
preliminary findings with the host; to resolve uncertainties and inconsistencies; and to look
more deeply into specific issues.
On the energy side, a lot can be learnt from being in the building at night, to identify whether
plant switches off to programme, and what lights and office equipment are left on. Such
extended surveys have not been affordable on Probe; and increased building security over
recent years has also made night visits difficult. Indeed, there has been a general tendency for
occupants to require Probe team members to be accompanied, particularly in special areas and
plant rooms; though for general areas the requirement often relaxes as confidence builds up.
Permanent office staff are normally sampled. Sometimes shorter questionnaires are also given
to specialist groups, e.g. students, visitors or – in a court building - magistrates. To obtain
consistent survey results and high response rates (typically above 90%), it is essential that:
• The host contact obtains prior approval from the relevant managers.
• Staff to be surveyed are forewarned of the purpose and date of the survey.
• The survey forms are handed out personally by BUS and not circulated through the
internal mail or email. This allows the sampling to be controlled; gives opportunities for
personal discussion about the purpose of the survey and how to fill in the form; and
confirms the collection time.
• The forms are consistently produced, attractive to look at, and easy to fill in, with most of
the questions in seven-point "Gregory" scales with tick-boxes.
• There are spaces (but not too large!) for people to provide their personal comments on
both specific and general issues.
• The forms are distinctively coloured so they can be easily identified, both by the
respondent and by BUS when collecting them up.
• The forms are collected-up the same day. Typically they are issued mid-morning, with
the first collection before lunch and the second in the early afternoon.
Data entry takes place typically in the week after the survey, with analysis and reporting in the
week after that. High response rates are important, in order to avoid suspicions about the
statistical validity of the results and to permit analyses of sub-samples (e.g. between people
who have window seats and those who do not). Significance tests are used on all variables and
written comments are included only for items which are statistically significant.
STAGE 7. Energy analysis (ongoing throughout the project, completion Week 12)
Energy analysis proceeds throughout the survey, with usually the most intensive period
between the first and second visits. Good fuel consumption data is an essential input to the
assessment process: this came from a range of sources including monthly or quarterly invoices,
site meter readings, or directly from suppliers on paper or electronically.
For effective comparison with benchmarks, it helps to review how energy consumption varies
with weather and occupancy. Sadly, in only a few buildings could reliable profiles of monthly
gas consumption be established and the influences of climate, control and management
assessed. Gas bills often included a high proportion of estimated meter readings, and site staff
seldom read their own meters. Here it was only possible to extract an annual gas consumption
figure, estimate the climate-dependent proportion and normalise this for total annual degree
days. Occasionally some additional information was available from Transco, the company that
conveys gas around the UK but does not normally supply it to the end-user.
On the smaller sites some electricity bills were estimated too. On the larger ones, monthly
electricity bills nearly always had good consumption and load data. Full half-hourly records
are often available, but in practice it is not always easy for third parties like the Probe team to
obtain the data within the time available: and the utilities did not always accept the original
letters of authority that were sent to them.
The Probe pressure tests highlighted some apparent inconsistencies in current procedures and
reporting conventions, in particular the standard quoted test pressure (25 or 50 Pa); envelope
area measurements (in particular the treatment of ground floors); and dealing with designed
natural and mechanical ventilation openings. This suggested a need for a definitive technical
guide on pressure testing non-domestic buildings, with associated standards for measurement
and reporting.
For the most part, the tension inevitable in such a process has been resolved well. Indeed, this
very tension may have helped to create the unprecedented interest in the Probe project: it is
human nature for critiques of buildings one knows, by people and firms one may also know, to be
read more avidly than generic messages. The Probe team’s objective to provide suitable
information, messages and steers to the industry and BSJ’s need to engage the attention of its
readership supported the aims of the ultimate client, DETR.
The format and content of the Probe articles was discussed at a designer feedback seminar on
Probe 2 [BSJ 1999]. Participants felt that Probe articles were well read, in particular the key
design lessons summarised at the end. Proposed additional topics included:
• Transport use and emissions
• Capital and maintenance costs
• BREEAM implementation in the as-built design
• Embodied energy
• The relationships between any modelling predictions used in the design (dynamic thermal
computer modelling, CFD, and so on) and the performance of the completed building.
It would be difficult to include all these in a Probe at current budget levels, but future studies
will aim to include perhaps one extra topic from the list most relevant to the building concerned.
In addition, the Probe team intends to find ways of using feedback from Probes 1 and 2 to help
to assist the property industry, clients, and building occupiers in their efforts to improve
building performance; and to report on the outcomes of these interventions.
Most surveys, however, will probably be done on behalf of the occupier or the building team.
Here priorities will be different from those in Probe. For example, in a published article the
world needs to be told about factors for success (e.g. perhaps in the procurement system) so
that others can emulate them; and about the problems (e.g. unsuitable window opening
mechanisms) which need to be avoided. Occupiers will often take successes for granted (though benchmarked
comparisons can be helpful); and can be well aware of the difficulties: they may well be more
interested in priorities, low-cost solutions, and better coping strategies.
An owner or occupier might want a survey for a number of possible reasons, for
example:
• Ongoing programme of monitoring and benchmarking either by the building owners or the
design team, as part of a culture of feedback, service and continuous improvement.
• Management desire to improve environmental and energy performance.
• Response to general or specific occupant dissatisfaction or complaints, especially
concerning basic comfort, health or safety issues.
A culture of feedback
At a more general level, the Probe experience may help to bring routine feedback into the
briefing, design, construction, completion, operation, use and alteration of buildings. In spite
of clear needs, feedback is not a standard part of the design service. For example, a
feedback Stage M is included in the current version of the Royal Institute of British
Architects’ Plan of Work [RIBA 1973], but it was omitted from the Standard Form of
Agreement (SFA) for the Appointment of an Architect published in 1992, apparently due to its
potential impact on professional indemnity insurance. However, the RIBA Guide to the SFA
[RIBA 1992] includes the following statement:
“Feedback, the last stage (M) in the RIBA’s model Plan of Work, is an
important but often neglected element of a commission. Much can be gained
from revisiting completed projects, and the client may also benefit from your
findings now that the building is in use. Even if you are not appointed for
stage M services, it can be valuable to keep in touch with the project.
It is all too easy to get dragged into providing services as a matter of good
will after the project has been completed. Remember that Architects are not
obliged to provide their professional services free.”
Probe has helped to expose the market to the idea of such feedback. The techniques of
collecting and presenting information have been streamlined and energy, occupant satisfaction
and air leakage performance and benchmarks have become more familiar. Professionals are
now more likely to admit and openly discuss shortcomings in systems and in-use performance,
where they and the industry need to improve. They are also better able to make their clients
aware of issues which need more careful attention than may currently be regarded as necessary
or affordable.
Buildings often have problems in the first year of operation. In other industries, these might be
regarded as routine prototype testing. However, the legal and contractual system hangs on to
the quaint idea that buildings are functionally complete when physically complete, and this
actually gets in the way of this after-sales service. Probe indicates that “sea trials” should be
seriously considered as a means of smoothing all but the simplest and most standardised of
buildings into occupancy, and making sure that problems (which may affect not only the
building but also its management) are rapidly identified and professionally dealt with. This
should be built into briefs, contracts and terms of appointment. It could include, perhaps, a
series of regular meetings, reviews of performance and energy consumption, and a contingency
budget for swift and effective remedial action. A building team may also be particularly
interested in the success (or otherwise) of some very specific aspects of their design.
Conclusions
Probe has used the UK Government’s Partners in Innovation funding to streamline post-
occupancy survey procedures, and to raise awareness of their results and implications through
regular press publication over a four-year period. This has helped to open up discussion.
Greater awareness and more powerful and cost-effective techniques may help to make post-
occupancy evaluation more routine, and a powerful aid to continuous improvement of both
performance and the associated design and management benchmarks.
The UK government has recently embarked on a project – the Egan Initiative – to improve
buildings and their performance. This has received unexpectedly strong support from the
industry and its clients. To date, Egan has focused mainly on the more efficient and cost-effective
supply of buildings and on avoiding defects. This must soon extend to better in-use performance
in terms of occupant health, satisfaction and productivity, business and economic performance,
and sustainability.
Probe has permitted robust and insightful results to be obtained and publicised with a
limited budget. Some of the findings and their implications are discussed in the subsequent
papers. It has drawn attention to some of the factors for success, which are often related to the
processes of procuring, occupying and managing a building and the roles of all the parties
involved. The feedback has provided advance warning of some of the problems associated
with new techniques and technologies and identified issues which need more attention. It has
also highlighted some persistent, chronic, minor problems (e.g. with the interfaces to control
systems) which receive relatively little attention but constantly frustrate the achievement of
potential levels of performance. By recognising and eliminating these in a culture of high
aspirations, routine feedback and continuous improvement, rapid improvements in all-round
performance will become possible.
Figure 1 Comfort index showing Probe buildings and BUS dataset
[Chart: BUS Comfort Index (3.2 to 5.2) plotted against percentile (0 to 100) for the Probe buildings and the BUS dataset. © 1999 Building Use Studies, The Builder Group, HGA, ESD, William Bordass Associates]
Figure 2 Benchmark example for glare from sun and sky
[Chart: mean building scores for glare from sun and sky on a 1 (none) to 7 (too much) scale, scale midpoint 4, for buildings 1 to 15, each with upper and lower 95% confidence limits, shown against the benchmark mean and its 95% confidence interval. © The Probe Team 1999]
Notes
Upper and lower ninety-five per cent confidence intervals are shown for 1) individual building means; 2) Building
Use Studies dataset benchmark for 50 buildings.
A building mean is significantly different from the benchmark mean if the mean value falls outside the interval range for the benchmark mean. A building mean is significantly different from another building if its mean value falls outside the interval range for that building.
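The significance rule described in these notes can be sketched as a short calculation. This is a minimal illustration only, not part of the Probe toolkit: the sample scores are made up, the benchmark interval (4.08 to 4.37) is taken from Figure 3, and the helper names and the normal approximation (1.96 × standard error) are assumptions of this sketch.

```python
import math
import statistics

def mean_ci95(scores):
    """Return (mean, lower, upper): the mean of a set of 7-point scale
    scores with its 95% confidence interval, using the normal
    approximation (mean +/- 1.96 * standard error)."""
    m = statistics.mean(scores)
    se = statistics.stdev(scores) / math.sqrt(len(scores))
    return m, m - 1.96 * se, m + 1.96 * se

def differs_from_benchmark(building_mean, bench_lower, bench_upper):
    """A building mean is significantly different from the benchmark
    mean if it falls outside the benchmark's 95% interval."""
    return building_mean < bench_lower or building_mean > bench_upper

# Illustrative occupant ratings for one building on a 1-7 scale.
building_scores = [5, 6, 5, 7, 6, 5, 6, 4, 6, 5]
b_mean, b_lo, b_hi = mean_ci95(building_scores)

# Compare against a benchmark interval of 4.08-4.37 (as in Figure 3).
print(differs_from_benchmark(b_mean, 4.08, 4.37))
```

The same comparison applies between two buildings: each building's mean is tested against the other's interval rather than against the benchmark's.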
Figure 3 Benchmark example for glare from sun and sky (top) and overall comfort (bottom)
showing single study building and benchmarks
[Chart: overall comfort on a 1 (unsatisfactory) to 7 (satisfactory) scale; study building scores 4.09 and 5.02, against a benchmark mean of 4.22 with 95% interval 4.08 to 4.37. © Building Use Studies 1999]
Draft BRI paper on Probe process
Figure 4 The office assessment procedure
[Diagram not reproduced]
Figure 5 Flowchart of the Probe survey and reporting process
Main items studied: procurement route; design and construction; initial occupancy; occupant satisfaction; management perceptions; energy and water consumption; operation and management; maintenance and reliability; controls and controllability; design intentions; alterations made; benchmark comparisons; strengths and weaknesses; key messages.
Stage 1: Initial contact by BSJ; preliminary agreement to survey.
Stage 2: Contact by survey team; review preliminary information; issue pre-visit questionnaire (PVQ); initiate energy analysis.
Stage 3: First site visit; complete details of PVQ; walk-round survey; check on-site records; confirm energy data availability; seek approval for occupant survey, pressure test, metering etc.
Stage 4: Initial analysis; review all information; draft descriptive report; do preliminary calculations; identify outstanding items; prepare checklist for second visit (additional information requested from occupier, contractors and utilities).
Stage 8: Pressure test by BRE or BSRIA.
Stage 9: Probe final report; analysis and key messages; published article in Building Services – the CIBSE Journal.