Assessment of Building Performance in Use: The Probe Process


by
Robert Cohen, Halcrow Gilbert
Mark Standeven, Energy for Sustainable Development
Bill Bordass, William Bordass Associates
Adrian Leaman, Building Use Studies

Introduction
In use, buildings do not always work as intended. Some features perform better, some worse,
some differently. Chronic low-level problems persist. Surprising things happen which nobody
anticipated. Many recent buildings and their services demand high levels of support and
vigilance, which are not always within the capacities of the occupiers. Others are more robust.

To improve building performance overall in a changing market, the industry and its clients need
to identify opportunities and pitfalls through rapid feedback. Unfortunately, this seldom
happens; when it does, much of the feedback is confidential, and at best made available
unattributably or anecdotally. In the UK, the Probe (Post-occupancy Review Of Buildings and
their Engineering) research project has attempted to improve the openness of this feedback by
undertaking post-occupancy evaluations (POEs) of recently-completed non-domestic buildings
and reporting them in Building Services Journal (BSJ), the journal of CIBSE, the Chartered
Institution of Building Services Engineers. The project started in 1995, and fifteen surveys
have been published so far. The results and implications of these are described elsewhere
[Probe 2000 b to f].

Probe’s main purpose was feedback to building services engineers of generic and specific
information on factors for success, and areas of difficulty and disappointment. The results
have proved of wider value, not only for designers but for example in helping commissioning
clients in preparing briefs and identifying issues which require continued review throughout the
procurement process. Building occupiers will also be interested in what to look for and ask
about in new premises; and strategies for operating and improving existing ones.

Ultimately we hope that post-occupancy evaluation and benchmarking will be routinely
undertaken in new buildings, before and after refurbishments, and in facilities management.
Such feedback will help to make buildings better for their occupiers, individual users, and the
environment; and provide a continuous stream of information for benchmarking. This paper
outlines the techniques used; how they have been honed to obtain robust results under the
constraints of limited resources, strict publication deadlines, and public exposure of the results;
and their relevance to future surveys of this kind.

Background to Probe
The 1980s were a time of major change for buildings. In particular:
• Pressures from competition, government and clients were forcing the design and
production side of the industry to improve speed and quality and reduce costs.
• Buildings and the building industry had to adapt to the needs of information technology.
• Buildings were used more intensively and changeably, and so could also require (or at
least be felt to require) more air conditioning.
• The facilities management profession began to establish itself.
• Falling fuel prices reduced the importance of energy costs for occupiers, but a growing
concern for all aspects of environmental performance kept energy efficiency on the design
agenda.
Buildings in use also began to be studied more consistently, largely as single-issue studies,
particularly energy performance; internal environment; and occupant satisfaction. Major gaps
were often found between client and design expectations and achieved performance. Building-
related ill-health was studied and the term “sick building syndrome” coined.
Draft BRI paper on Probe process

In the early 1990s the industry was tackling these new agendas, for example using:
• computer models and advanced natural ventilation (ANV, designed using simulation
methods and often operated automatically, at least to some extent) to avoid
air-conditioning;
• building fabric and glazing systems with better thermal and daylight performance;
• more efficient air-conditioning (AC), e.g. with displacement ventilation and static
cooling;
• mixed-mode (MM) solutions, combining natural with mechanical ventilation and cooling;
• a variety of plant efficiency and control improvements.

Clients were also broadening their interests: indoor environments, occupant health, productivity
of staff, and higher level strategic business issues. However, gaps - sometimes large -
between expectation and achieved performance were being reported. Hence the need for better
feedback to consolidate the benefits and to identify and correct any problems.

The Probe initiative


New buildings incorporating some of these innovations were regularly reported in BSJ. In
1994, its editorial review panel suggested reporting on how
these worked in practice. How could a magazine attract funding for the detailed investigations
required? BSJ saw an opportunity in the Department of the Environment’s (now DETR’s)
newly-announced Partners in Technology (now Partners in Innovation) scheme. This invited
bids from industry and researchers for collaborative projects which would attract up to 50%
government funding.

A proposal was submitted, in which the government would pay for post-occupancy
investigations of technical and energy performance and occupant satisfaction in eight recently-
completed buildings which had been featured in BSJ. This contribution was matched by BSJ’s
editorial and publication costs. Additional input to the project was obtained by in-kind
contributions of time by the Probe survey team and, of course, the time spent by the building
occupiers and their advisers in supporting the surveys.

The proposal was successful, demonstrating how this new approach to government funding
could stimulate and support innovations in real-world research and dissemination. The project
started in mid-1995, with the following team of investigators:
• Halcrow Gilbert, who were heavily involved in applying advanced design tools to the
design of low-energy buildings.
• Building Use Studies Ltd (BUS), who monitor occupant satisfaction with buildings
through questionnaire and interview techniques.
• William Bordass Associates (WBA), who undertake and manage investigations into
technical and energy performance of buildings, and use the findings in project-specific
advice to clients and design teams, and in published guidance material.

The buildings investigated


Probe 1 investigated four AC offices, three educational buildings with ANV, and a low-energy
medical centre. Each building had been reviewed in BSJ at the time of completion and had
been occupied for not more than five years. Each was reviewed for technical performance,
energy performance, and occupant and management satisfaction. The confidential technical
reports on each building were converted by BSJ into articles of typically five or six pages.

In early 1997, funding was obtained for Probe 2. This had a broader scope, including
procurement issues, water economy, and a pressure test for airtightness. Following reader
demands for faster feedback, Probe 2 aimed to look at buildings after two years of operation
(earlier than this, buildings, their occupants and their management have not yet settled down,
and a history of energy consumption is not available). Probe 2 included another eight buildings:
three offices (one AC, one MM and one NV), a naturally-ventilated (NV) and two MM
educational buildings (one with some ANV), an MM courthouse and an NV warehouse. Table 1
lists all the buildings studied in Probes 1 and 2, with their principal characteristics and the dates
of the articles in BSJ. The results of these surveys are outlined in [Probe 1999 b to d].

Survey methods
Probe has shown that it is possible to undertake an extended series of post-occupancy surveys
of named buildings and to publish the results in a technical journal. To our knowledge, this
had not been done before, having been regarded as too difficult and too risky. In the UK,
Probe has helped to contribute to a better understanding of how buildings perform and to
greater openness in discussing where problems are and what measures are needed. In this way
it supports current initiatives by government, industry and clients intended to bring about
substantial improvements to the performance of the building industry and its products [DETR
1998b].

To produce reports of sufficient rigour and credibility, the methods employed needed to be
standardised, and to use existing techniques and benchmarks where possible. The backbone of
Probe has been two established tools:
• The occupant survey method developed by Building Use Studies (BUS) to gauge
occupant satisfaction with the building and its internal conditions.
• A prototype of the Energy Assessment and Reporting Method’s (EARM™) Office
Assessment Method (OAM) for the analysis of energy use. The Probe team assisted its
development and testing. A spreadsheet version with accompanying guidance has
recently been published [CIBSE 1999].

In the course of Probe, a comprehensive five-page standard pre-visit questionnaire (PVQ) was
also developed to help identify and collect initial information on the building and its services,
occupancy, usage and management. This improved the effectiveness of the first visit, and
focused the host’s mind on the Probe team’s activities and information requirements.

The occupant survey method


The BUS occupant survey method evolved from studies undertaken to investigate sick building
syndrome during the 1980s. It was subsequently developed with and adopted by BRE [Raw
1995]. To make the self-completion questionnaire quick, easy and attractive to use in Probe, it
was neatly and appealingly designed on two A4 pages – a contrast to the original twelve-page
version, but including what experience had shown to be the most significant questions. A
shorter questionnaire was also essential to make the analysis and reporting possible within the
very limited budget for a Probe study. In practice, it has been highly successful, as it avoids
questionnaire fatigue by the recipient and information overload in the analysis, see also below.
[The survey is available under licence from Building Use Studies Ltd, +44 1904 67128].

Information was collected from permanent staff on 49 variables, in twelve groups:


• background (age, sex etc);
• the building overall (its design and how well it meets perceived needs);
• personal control (over heating, cooling, lighting etc, plus its speed of response);
• speed and effectiveness of response after complaints have been made to the management;
• temperature;
• air movement;
• air quality (each of these three rated for both summer and winter);
• lighting;
• noise;
• overall comfort;
• health;
• productivity at work.
A few buildings also had a handful of supplementary questions on topics of special interest to
the study team or the occupier.

The questionnaire was issued typically to a sample of 100-125 occupants; or to all occupants in
the smaller buildings. Sometimes a shorter secondary questionnaire was also given to specialist
user groups, typically students in an educational building. Staff focus groups and meetings
with management were also held when opportunities arose.

The BUS/Probe questionnaire is an economical compromise between the needs of respondents,
data management, data analysis, statistical validity and question-answering ability. It uses a
relatively small core set of key performance indicators (KPIs), which remain essentially the
same across all building types studied. This parsimony helps overcome the “data bloat”
problem that faces many surveys: too much data and not enough time to analyse the resulting
information properly. Where supplementary questions are used, the layout is altered so that the
questionnaire is never more than two pages long. Questions are enhanced or changed only
when absolutely necessary, as this can have potentially serious implications for cost, quality,
consistency and comparability. For example, reference information from large numbers of
previous surveys may lose its value if a question is changed or abandoned; and benchmarks
may cease to be available or need to be reworked. Changes are therefore treated circumspectly!

A relatively small core data set releases more time for managing the wider data set, making
benchmarking achievable. Nevertheless, there is still a prodigious amount of potential
information. As more buildings are added to the dataset, the burden of data management and
quality control increases. However, the larger knowledge base makes the information gained
more valuable, so early trade-offs between "must have" and "nice to know" questions become
even more important. The decision to use tried-and-tested methods to collect both energy and
occupant data, and not greatly altering these techniques as the study progressed, is a major
reason for Probe's eventual success, especially in benchmarking all-round performance.

To provide statistical snapshots of occupant responses, Probe uses two summary indexes:
• One based on comfort, see figure 1, with scores for summer and winter temperature and
air quality, lighting, noise and overall comfort.
• One on satisfaction (not shown), using ratings for design, needs, productivity and health.
These indexes are usually the first step in presenting results on a particular building. For
example, buildings may score highly for satisfaction but less well for comfort; or well on both.
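
The exact weighting BUS applies when combining ratings into these indexes is not given here; a plain mean of the component scores is assumed for illustration. The variable names and figures below are hypothetical, not taken from any Probe building:

```python
from statistics import mean

# Hypothetical per-building average ratings on the seven-point scales.
# A simple unweighted mean is ASSUMED for each index; the published BUS
# method may weight or transform the components differently.

def comfort_index(scores: dict) -> float:
    """Average of the comfort-related variables (1 = worst, 7 = best)."""
    keys = ["temp_summer", "temp_winter", "air_quality_summer",
            "air_quality_winter", "lighting", "noise", "overall_comfort"]
    return mean(scores[k] for k in keys)

def satisfaction_index(scores: dict) -> float:
    """Average of the design, needs, productivity and health ratings."""
    return mean(scores[k] for k in ["design", "needs", "productivity", "health"])

building = {
    "temp_summer": 4.2, "temp_winter": 4.8, "air_quality_summer": 3.9,
    "air_quality_winter": 4.1, "lighting": 5.0, "noise": 3.6,
    "overall_comfort": 4.5,
    "design": 5.1, "needs": 4.9, "productivity": 4.4, "health": 4.0,
}
print(round(comfort_index(building), 2))       # mean of the 7 comfort scores
print(round(satisfaction_index(building), 2))  # mean of the 4 satisfaction scores
```

A building with a high satisfaction index but a middling comfort index would then prompt closer examination of the individual comfort variables.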

Scores based on the averages of occupant responses to a particular question in each building
can also be presented on graphs with their benchmarks (here averages based on a dataset made
up from the last 50 buildings surveyed by BUS). Confidence intervals can be shown to
emphasise that they are based on sample statistics, and so subject to variations owing to sample
size, variance of responses and random fluctuations. This permits visual checks of whether the
average perception of a building differs significantly (in a statistical sense) from the benchmark,
the scale midpoint or another building. For example, figure 2 shows ratings for glare from sun
and sky for one Probe building; and figure 3 the statistics for each of the Probe buildings,
together with the benchmarks from the BUS dataset:
• If the range for a particular building is intersected by the line for the benchmark mean,
then occupant perceptions of that variable in that building are not significantly different
from the benchmark (e.g. Building 10).
• If the scale midpoint intersects the range, then the building is not significantly different
from the scale midpoint (e.g. Building 8).
• If a mean for a particular building intersects the range for another, then the buildings are
not different from each other (e.g. Buildings 4 and 7).
• If the range is large, one needs to look at the distribution of individual scores to check why.
For example, people near windows may be significantly happier than those further away;
and top floors of buildings with open atria are often hotter.
Further details of the survey statistics can be examined on the website usablebuildings.co.uk.
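
The interval-overlap checks above can be sketched as follows. This is a minimal version assuming a normal approximation (roughly 1.96 standard errors for a 95% interval); the glare ratings and benchmark value are invented for illustration:

```python
from math import sqrt
from statistics import mean, stdev

def ci95(responses):
    """95% confidence interval for the mean rating
    (normal approximation: mean +/- 1.96 standard errors)."""
    m = mean(responses)
    se = stdev(responses) / sqrt(len(responses))
    return m - 1.96 * se, m + 1.96 * se

def differs_from(responses, reference: float) -> bool:
    """True if the reference value lies outside the interval, i.e. the
    building mean differs significantly (statistically) from it."""
    lo, hi = ci95(responses)
    return not (lo <= reference <= hi)

# Hypothetical glare ratings (seven-point scale) from one building's occupants.
glare = [3, 4, 4, 5, 3, 4, 5, 4, 3, 4, 5, 4, 4, 3, 5, 4, 4, 5, 3, 4]

print(differs_from(glare, 4.6))  # vs a benchmark mean outside the interval: True
print(differs_from(glare, 4.0))  # vs the scale midpoint inside the interval: False
```

The same check applied between two buildings (does one mean fall inside the other's interval?) reproduces the pairwise comparisons described above.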

In making comparisons, it is desirable to have statistically structured samples with requisite
"random" elements of choice. In buildings this is almost impossible, owing to the difficulty of
defining the sampling frame; and hesitant managers not allowing study teams in. The Probe
buildings are not a random statistical sample. Nor is the BUS benchmark sample, which is
based on buildings in which post-occupancy evaluations have been commissioned. These will
be self-selecting to some extent: managers who are prepared to commit resources to POEs will
also be interested in improvement, and so are likely to have better buildings already.

In buildings, physical design issues and human and management issues are inextricably linked:
complete separation of influencing factors is not possible. This can be compensated to some
extent by reporting results in ways that permit readers to identify particular contextual factors.
For example, most occupants prefer to have their own offices, because of the additional privacy
and control that this gives, so high scores are easier to attain in these contexts. Conversely,
occupants who are less happy with their work, managers and colleagues may project their
dissatisfaction onto the environment and facilities; and use complaints about the physical
environment as a risk-free way of protesting about management.

The energy survey method
Energy data had to be collected and reviewed without comprehensive monitoring; although
independently monitored data was available for three of the buildings. Probe therefore intended
to use a method which had been developed by WBA when undertaking energy surveys in
offices [EEO 1994]. This was based on WBA’s own electricity consumption estimation and
reconciliation procedure, together with the London Energy Group’s energy reporting format
[London Energy Group 1983], both implemented on Excel spreadsheets.

As Probe started, DETR’s Energy Assessment and Reporting Methodology (EARM™) project
was just finishing. Amongst its products was a paper-based prototype Office Assessment
Method (OAM) which incorporated elements of the WBA procedure. Probe 1 obtained
additional funding from DETR to test this method on the four office buildings surveyed. With
ad hoc modifications, the OAM approach was also used for other building types.

EARM-OAM is an iterative technique which allows one to get the best possible result within the
time available. It helps the user to seek the most significant items of missing data to bring the
picture into focus. It allows the energy performance of a building to be understood in terms of
building type, fabric, systems, occupancy and management; and includes a spreadsheet which
provides instant reconciliation between survey estimates and metered data.

OAM was developed to resolve problems which had been identified during research into the
energy assessment of buildings:
• Critical data, especially fuel consumption and the factors used for normalisation (most
importantly the floor area) were often of dubious quality. This is resolved by careful
definitions of critical data items, specifying preferred sources for the data, and including
simple data quality checks. The quality checks allow the user to carry on even if the
preferred data sources are not available: initial research showed that procedures which
insisted too specifically on data in one particular form before they provided any output
tended to put people off. Instead, the result is “flagged” as being of reduced accuracy. If
better data then becomes available, or if further investigation is justified, the user may
then return to make improvements.
• Many existing methods are either not precise enough, not sufficiently relevant, or too
demanding and time-consuming. The OAM therefore allows progressively detailed
assessments and helps the user to decide whether the conclusions are sufficiently reliable
and what further work may be required.
• The data can be re-visited by others at a later stage for review and updating.
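
The "flagged" reduced-accuracy idea can be sketched in a few lines. The field names and the floor-area example are invented; the point is only that an estimate is accepted, recorded with its source, and marked for possible revisiting:

```python
# Sketch of the OAM data-quality idea: the assessment carries on with
# whatever figure is available, but a reduced-accuracy flag is recorded so
# the item can be revisited if better data appears. Names are hypothetical.

from dataclasses import dataclass

@dataclass
class DataItem:
    name: str
    value: float
    source: str            # e.g. "measured", "estimated", "default"
    flagged: bool = False  # True = reduced accuracy; revisit if possible

def get_floor_area(measured=None, estimated=None) -> DataItem:
    """Prefer a measured floor area; fall back to an estimate with a flag."""
    if measured is not None:
        return DataItem("treated floor area (m2)", measured, "measured")
    return DataItem("treated floor area (m2)", estimated, "estimated", flagged=True)

area = get_floor_area(estimated=2500.0)
print(area.flagged)  # True: the result is usable, but marked as reduced accuracy
```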

The OAM makes extensive use of the energy performance indicators for offices from
Energy Consumption Guide 19 [DETR 1998a]. It also allows unusual features or usage to be
separately identified, helping one to move away from standard benchmarks to the most
important issues for a particular building.

Figure 5, from CIBSE TM22, is a flowchart of the process:


• Stage 1 is a broad-brush analysis against simple annual consumption indices, by fuel.
• Stage 2 considers special influencing factors, for example extended occupancy, extreme
weather, or unusual energy end-uses (in offices particularly computer rooms, which can
use a high proportion of annual electricity consumption but are seldom sub-metered).

• Stage 3 looks at the finer details. A key tool is an electrical consumption estimation
spreadsheet, which undertakes “load x hours” calculations by individual items, areas, or
subsystems, and reconciles this with metering data; including if necessary and where
available day/night, summer/winter and weekday/weekend load patterns. The results can
then be related to subsystem benchmarks, also making allowances for special cases.
At all stages the approach is also iterative.
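
The Stage 3 "load x hours" reconciliation can be sketched as below. All subsystem names, loads, hours and the metered total are illustrative, not figures from a Probe survey; the real spreadsheet also handles day/night and seasonal load patterns, which are omitted here:

```python
# Minimal "load x hours" reconciliation in the spirit of the OAM Stage 3
# spreadsheet: estimate annual electricity use per subsystem from installed
# load and annual operating hours, then compare the total with metered data.

subsystems = {
    # name: (installed load in kW, annual operating hours) -- all invented
    "lighting":         (45.0, 3000),
    "office equipment": (30.0, 3500),
    "fans and pumps":   (25.0, 4000),
    "computer room":    (12.0, 8760),   # continuous, and seldom sub-metered
}

estimates = {name: kw * hours for name, (kw, hours) in subsystems.items()}
estimated_total = sum(estimates.values())  # kWh/year from the survey model
metered_total = 480_000.0                  # kWh/year from fuel bills (invented)

discrepancy = (estimated_total - metered_total) / metered_total
for name, kwh in estimates.items():
    print(f"{name:18s} {kwh:>10,.0f} kWh")
print(f"reconciliation gap: {discrepancy:+.1%}")
```

In practice the gap drives the iteration: loads, hours, or missing end-uses are re-examined until the estimate and the metered total agree acceptably, and the subsystem figures can then be compared with their benchmarks.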

OAM proved valuable in improving the speed, accuracy and consistency of the energy
assessments. Although survey time was limited, it was essential to review performance against
benchmarks for individual end-uses, so all Probe analyses proceeded into Stage 3. The paper
version of OAM tested in Probe 1 proved too time-consuming, requiring frustrating double- or
triple-entry of some data items, and was not efficient if figures subsequently changed to the
degree that the earlier stages had to be revisited. Following the pilot tests, the Probe team
found it most efficient to jump straight to the Stage 3 spreadsheet, which allows a model of
electricity consumption by end uses to be built up and reconciled with fuel bill data.

As Probe 2 was coming to an end, the authors of CIBSE TM22 gave the Probe team evaluation
copies of the Excel Workbooks on which the whole of the OAM is now implemented. These
were much quicker and more convenient to use than the former partly paper-based version.
They:
• avoid multiple entry of input data and laborious recalculation;
• provide inbuilt checks of data quality, consistency and orders of magnitude;
• show automatic numerical and graphical comparisons with selected benchmarks;
• generate “tree diagram” benchmarks [Field et al 1997] automatically for system capacity,
efficiency, operating hours and control and management factors; and
• keep track of special features and reduced accuracy items.
The workbooks also provide a clearer and more complete record of the work undertaken. This
approach can greatly improve the language of energy target-setting, assessment and reporting at
all stages of a project from briefing onwards. The technique, together with the BUS occupant
survey, has also been successfully piloted in a one-visit survey of a large office of a financial
services company in the Netherlands.

Key stages of a Probe survey


With careful preparation, Probe has often been able to collect the required information in just
two site technical visits. Both one-day visits are undertaken by two experienced surveyors
who know what leads to follow, which searching questions to ask, and the priorities to set if
time is short. The second visit is by the principal technical surveyor for the job (who was also
on the first visit) and – for quality control - a second surveyor who was not. Having a new
person also makes it easier to re-visit anything which was not entirely clear on the first visit.

Some buildings have required an additional short preliminary visit to make contact, collect key
facts, check metering arrangements, and review whether a Probe is practical from both the
team’s and the host’s point of view. Others have needed an extra final visit to review findings
personally with the hosts and/or the designers; to discuss inconsistencies and uncertainties
which have arisen in preparing the final report; and to collect additional information.

Figure 6 shows the ten key stages of the process, with typical timing. It takes typically two
months from the first technical survey visit to completion of the technical report, plus another
month for editing, review and publication. The central seven stages are chronological. The
timing of the other three can vary:
• Stage 6, the occupant survey. BUS does this either at or before the second visit. If
before, it raises useful questions to be pursued at the visit. If at the same time, it permits
more fruitful discussion during the visit but reduces the time for technical survey work.
• Stage 7, the energy analysis, continues throughout the project. Results come gradually
into focus as more information is collected and questions are asked.
• Stage 8, the pressure test. BRE or BSRIA normally does this once it is clear that the
Probe will be proceeding to publication; and usually shortly after the second technical
visit, when the Probe team has confirmed arrangements with the host.

STAGE 1. Agreement to undertake a Probe study


Probe’s initial contact is always with the occupier, who sometimes has to consult landlords,
corporate management, and property, estates and maintenance departments. Getting initial
permission was easier than independent researchers usually find: a particular surprise, given
that the results were to be openly published. The reasons appeared to have been the
experienced survey team; references from managers of buildings which had already had Probe
surveys; and most importantly the rapport developed between the editor of BSJ and the
occupier while the original reviews of the building were being researched and written.

All site visits are arranged with the occupier, not the designers. The designers also do not
come on any visits, as they would inevitably present their own perspectives and reduce
opportunities for frank discussions with users. Direct contact is also essential to extract all the
operational information required; and to allow the occupant to gain confidence in the team and
to approve and support somewhat intrusive measurements, surveys and pressure tests.

STAGE 2. PVQ: Pre-visit questionnaire (issued in Week 0)


To make efficient use of the first visit, the surveyor needs good advance information: some
understanding of the project, A3 or A4 copies of floor plans for making notes on, and
background on the operation and use of the building and its level of energy consumption.

For design information, the initial BSJ articles gave valuable background descriptions, plus
tables of key statistics. For buildings without such published information, data gathering will
take longer, unless the occupier can themselves pull together a good package of information for
the survey team. An additional preliminary site visit may then be required; plus contact with the
designers to obtain design intentions, technical, cost and procurement details, and any unusual
requirements or difficulties. This may be difficult: for several Probe buildings, the individuals
involved had already become widely dispersed and information archived.

Availability of design, operational and energy information varied greatly. In some buildings,
particularly the large air-conditioned offices, the management had facts at its fingertips. Many,
however, had poorer access to information. In practice, it was rare to receive a fully-completed
PVQ, fuel bills or floor plans before the first visit. However, even a partially-completed PVQ
helped to prepare the occupier for the demands of the survey; to structure the first interview;
and to smooth the process of identifying and obtaining missing items.

STAGE 3. The first site visit (Week 4)


This is when the work starts in earnest. The team has a lot to do in one day or less.
Information about the building, its operation and any problems experienced is obtained from:
• an initial interview with the host (often the senior manager responsible for procuring the
building and the facilities and/or engineering manager) using the PVQ;
• an accompanied walk-round of the building and its plant;
• informal discussion with management, operations and other staff;
• a review of specifications, drawings, O&M manuals and commissioning records;
• BMS time schedules, BMS trend logs, submeter readings; and
• office equipment inventories.

Spot measurements are taken where possible. Basic equipment always carried includes a true
power meter, light meter, temperature/RH indicator, smoke pencils and a camera. Hot wire and
vane anemometers, moisture meters, and electrical demand profile recorders are also used.
Temperature and RH logging has not been undertaken: in five of the buildings good
information was available from site management or through independent monitoring. In others
BMS records were helpful but often patchy, with little systematic trend logging.

Spot measurements also provide informal opportunities for talking to staff. For example,
taking an illuminance level leads to a conversation about lighting and glare, and quickly on to
other things, perhaps erratic control of the air-conditioning or the difficulties in getting lunch on
a business park. Such discussions have often opened-up new avenues of investigation.

Hosts are constrained by the pressures of their jobs, and sometimes emergencies lead to
delayed or curtailed meetings, or escorts who run off in response to emergency call-outs. A
degree of opportunism is therefore necessary to extract the most relevant information in the time
available: for example visiting the accounts department to find fuel bills; or studying plans,
specifications and manuals when one had intended to survey the plant room.

Initially the team tried to collect data on a laptop computer. This proved unsuitable, as
information on the building, its use and its systems arrived so rapidly and from so many
directions that any attempt to force it into a standard form took longer and interrupted the free
flow of the host’s comments. However, computer data collection may now be more practical
with today’s more powerful hardware and software, including the EARM-OAM workbook.

STAGE 4. Initial analysis and draft report (Weeks 4 to 7)


After the visit, all the notes and other material are digested and the energy-related data analysed.
This includes schedules of equipment loads and usage, which are reconciled with fuel
consumption data using the EARM technique, see also Stage 7.

In this stage (which takes typically four weeks including waiting time for requested data) the
text of the final report is also drafted. This draft helps to draw attention to gaps in the verbal
and numerical arguments; generate checklist items for the second visit; and provide valuable
briefing for Probe team members who have not visited the building - in particular those who
will be involved in the second site visit and the occupant survey.

STAGE 5. The second site visit (Week 8)


Before the second visit, the survey team draws up a comprehensive checklist of actions to be
taken and questions to be resolved, and makes appointments with the host staff who can
provide the information required. The team often needs help from contractors, for example an
electrician to permit meter readings to be undertaken safely; or the maintenance contractor to
answer queries on the function and performance of the BMS.

The visit gives the survey team the chance to take a fresh look at the building; to discuss
preliminary findings with the host; to resolve uncertainties and inconsistencies; and to look
more deeply into specific issues.
On the energy side, a lot can be learnt from being in the building at night, to identify whether
plant switches off to programme, and what lights and office equipment are left on. Such
extended surveys have not been affordable on Probe; and increased building security over
recent years has also made night visits difficult. Indeed, there has been a general tendency for
occupants to require Probe team members to be accompanied, particularly in special areas and
plant rooms; though for general areas the requirement often relaxes as confidence builds up.

STAGE 6. The BUS Occupant Survey (variable, typically Weeks 7-9)


Arrangements for the occupant survey (e.g. timing, permissions required, sampling, and
possible site-specific questions) are discussed with the host on the first technical visit. BUS
then makes direct contact to agree details of the procedure, sends the host a draft
questionnaire for comment and approval, and finally agrees the survey procedure and the date.

Permanent office staff are normally sampled. Sometimes shorter questionnaires are also given
to specialist groups, e.g. students, visitors or – in a court building - magistrates. To obtain
consistent survey results and high response rates (typically above 90%), it is essential that:
• The host contact obtains prior approval from the relevant managers.
• Staff to be surveyed are forewarned of the purpose and date of the survey.
• The survey forms are handed out personally by BUS and not circulated through the
internal mail or email. This allows the sampling to be controlled, gives opportunities for
personal discussion about the purpose of the survey and how to fill in the form, and
confirmation of the collect-up time.
Draft BRI paper on Probe process

• The forms are consistently produced, attractive to look at, and easy to fill in, with most of
the questions in seven-point "Gregory" scales with tick-boxes.
• There are spaces (but not too large!) for people to provide their personal comments on
both specific and general issues.
• The forms are distinctively coloured so they can be easily identified, both by the
respondent and by BUS when collecting them up.
• The forms are collected-up the same day. Typically they are issued mid-morning, with
the first collection before lunch and the second in the early afternoon.

Data entry takes place typically in the week after the survey, with analysis and reporting in the
week after that. High response rates are important, in order to avoid suspicions about the
statistical validity of the results and to permit analyses of sub-samples (e.g. between people
who have window seats and those who do not). Significance tests are used on all variables and
written comments are included only for items which are statistically significant.
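The flavour of these checks can be sketched as follows (the response data are invented, and the criterion used here, a building mean falling outside the benchmark's 95% interval, mirrors the convention in the Probe benchmark charts; BUS's actual statistical procedures are more sophisticated):

```python
# Hedged sketch: is a building's mean score on a seven-point scale
# significantly different from the benchmark mean? Numbers are invented.
import math

def ci95(scores):
    """Mean and approximate 95% confidence interval (normal approximation)."""
    n = len(scores)
    mean = sum(scores) / n
    var = sum((x - mean) ** 2 for x in scores) / (n - 1)
    half = 1.96 * math.sqrt(var / n)
    return mean, mean - half, mean + half

building = [5, 4, 6, 5, 4, 5, 6, 5, 4, 5, 6, 5]    # one building's responses
bmk_mean, bmk_lo, bmk_hi = 3.87, 3.46, 4.29        # benchmark mean and interval

b_mean, b_lo, b_hi = ci95(building)
significant = not (bmk_lo <= b_mean <= bmk_hi)
print(f"building mean {b_mean:.2f} (95% CI {b_lo:.2f}-{b_hi:.2f}); "
      f"differs from benchmark: {significant}")
```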

STAGE 7. Energy analysis (ongoing throughout the project, completion Week 12)
Energy analysis proceeds throughout the survey, with usually the most intensive period
between the first and second visits. Good fuel consumption data is an essential input to the
assessment process; in Probe it came from a range of sources, including monthly or quarterly
invoices, site meter readings, and data obtained directly from suppliers on paper or electronically.

For effective comparison with benchmarks, it helps to review how energy consumption varies
with weather and occupancy. Sadly, in only a few buildings could reliable profiles of monthly
gas consumption be established and the influences of climate, control and management
assessed. Gas bills often included a high proportion of estimated meter readings, and site staff
seldom read their own meters. In such cases it was only possible to extract an annual gas consumption
figure, estimate the climate-dependent proportion and normalise this for total annual degree
days. Occasionally some additional information was available from Transco, the company that
conveys gas about the UK but does not normally supply it to the end-user.
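This fallback calculation can be illustrated as follows (all figures are invented; the long-term degree-day total used for normalisation is an assumption here and should be taken from published tables for the relevant region and base temperature):

```python
# Hedged sketch of the fallback analysis described above: split annual gas
# into a weather-dependent part and a base load, then scale the weather-
# dependent part to a standard degree-day year. All numbers are illustrative;
# the long-term degree-day figure is an assumption, not a quoted standard.

annual_gas_kwh = 500_000      # from invoices (may be partly estimated)
heating_fraction = 0.80       # estimated climate-dependent proportion
site_degree_days = 2100       # heating degree days for the billing year
standard_degree_days = 2462   # assumed long-term average for normalisation

heating = annual_gas_kwh * heating_fraction
base = annual_gas_kwh - heating
normalised = base + heating * (standard_degree_days / site_degree_days)
print(f"weather-normalised gas: {normalised:,.0f} kWh/year")
```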

On the smaller sites some electricity bills were estimated too. On the larger ones, monthly
electricity bills nearly always had good consumption and load data. Full half-hourly records
are often available, but in practice it is not always easy for third parties like the Probe team to
obtain the data within the time available; and the utilities did not always accept the original
letters of authority that were sent to them.

STAGE 8. The Pressure Test (variable, typically Week 9 to 11)


The air leakage test is an aspect of building quality that can be measured objectively. This can
be salutary for architects not accustomed to their designs being subject to direct quantitative
analysis – particularly as (in common with most UK buildings) many results were poor,
sometimes even for buildings which claimed to be tight. The tests proved to be popular with
most occupiers, maybe because of their novelty. However, the test is expensive and time
consuming and has to be done out of hours, often on a Sunday. It was undertaken by BRE or
BSRIA, normally using a large fan towed behind, and powered from, a Land-Rover vehicle.

The Probe pressure tests highlighted some apparent inconsistencies in current procedures and
reporting conventions, in particular the standard quoted test pressure (25 or 50 Pa); envelope
area measurements (in particular the treatment of ground floors); and dealing with designed
natural and mechanical ventilation openings. This suggested a need for a definitive technical
guide on pressure testing non-domestic buildings, with associated standards for measurement
and reporting.
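The practical effect of the quoted test pressure can be sketched with the power-law leakage model commonly fitted to fan pressurisation results (the flow exponent and leakage figures below are illustrative assumptions, not Probe data):

```python
# Hedged sketch of why the quoted test pressure matters. Pressurisation
# results are commonly fitted to a power law Q = C * dP**n, with the flow
# exponent n typically around 0.6-0.7; a leakage figure quoted at one
# reference pressure can then be rescaled to another. Values are illustrative.

def rescale_leakage(q_at_p1, p1, p2, n=0.65):
    """Convert an air leakage rate quoted at pressure p1 (Pa) to pressure p2."""
    return q_at_p1 * (p2 / p1) ** n

q50 = 12.0  # m3/h per m2 of envelope at 50 Pa (illustrative)
q25 = rescale_leakage(q50, 50, 25, n=0.65)
print(f"{q50} m3/(h.m2) at 50 Pa rescales to {q25:.1f} m3/(h.m2) at 25 Pa")
```

Two buildings quoted at different reference pressures therefore cannot be compared directly without such a conversion, which is one reason consistent reporting conventions matter.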

STAGE 9. The Probe final report (Weeks 8 to 11)


The final report is a lengthy compilation of the findings, comprising:
• A main report of typically 10,000 words, with sections on introduction, building
assessment, air infiltration, operation, maintenance and energy management, energy and water
consumption, occupant survey, and overview.
• Energy information including EARM spreadsheets and graphics.
• A comprehensive occupant survey report.
• The pressure test report.

This technical report has three objectives:


• To provide all the information that BSJ needs to draw upon for the Probe article.
• To enable the host to check for any errors or misinterpretations.
• To serve as a record underpinning the information and opinions expressed in the Probe articles.
The report is not itself intended for publication, for example because it contains frank (but
unattributed) comments from individual occupants, and is confidential to the team and to the
occupier of the building.

STAGE 10. The BSJ Probe article (Week 12 to 14)


BSJ produces an article based on the final report. The penultimate draft is issued
simultaneously to the Probe Team, building host and building M&E designers for comment.
The final article also includes a box with comments from the designers, usually the services
engineers but occasionally the architects. Inevitably a balance has to be struck between the
Probe survey team’s desire for rigour and technical quality, and the BSJ’s need for a good,
readable article meeting their space limitations, production schedule and audience preferences.
Areas of difficulty have included:

• Editing-down the final report to an article of less than 4000 words.


• Maintaining fairness and balance in the process.
• Selective extracts by other journalists which have sensationalised Probe’s findings.

For the most part, the tension inevitable in such a process has been resolved well. Indeed, this
very tension may have helped to create the unprecedented interest in the Probe project: it is
human nature for critiques of buildings one knows, by people and firms one may also know, to
be read more avidly than generic messages. The Probe team’s objective to provide suitable
information, messages and steers to the industry and BSJ’s need to engage the attention of its
readership supported the aims of the ultimate client, DETR.

Next steps for Probe


As the project proceeded, the techniques became more streamlined and the team more
experienced. However, the surveys did not get any easier, because more items were included
(e.g. pressure tests) and new results needed to be reviewed against a growing database of
reference material. Given the experience gained, however, further streamlining and
standardisation is possible.

The format and content of the Probe articles was discussed at a designer feedback seminar on
Probe 2 [BSJ 1999]. Participants felt that Probe articles were well read, in particular the key
design lessons summarised at the end. Proposed additional topics included:
• Transport use and emissions
• Capital and maintenance costs
• BREEAM implementation in the as-built design
• Embodied energy
• The relationships between any modelling predictions used in the design (dynamic thermal
computer modelling, CFD, and so on) and the performance of the completed building.
It would be difficult to include all these in a Probe at current budget levels, but future studies
will aim to include perhaps one extra topic from the list most relevant to the building concerned.

In addition, the Probe team intends to find ways of using feedback from Probes 1 and 2 to
help the property industry, clients, and building occupiers in their efforts to improve
building performance; and to report on the outcomes of these interventions.

Can the Probe approach be applied more widely?


Is Probe a unique product of very particular circumstances, or has it helped to pave the way for
wider use of post-occupancy surveys and benchmarking? On the technical side:
• Techniques for occupant and energy surveys have been streamlined, so one can now
obtain better information more quickly, cheaply and consistently.
• Benchmarks exist, and are largely published or available under licence.

For surveys being undertaken for publication:


• Hosts must not feel that their time is being wasted. Here experienced surveyors are
probably essential: they can not only work more quickly and effectively, but can also
give back useful information (e.g. on performance, benchmarks and results of other
studies), even as the survey is taking place. Significant training would be required in
order to reproduce this, including detailed apprentice-type involvement in at least three
surveys.
• Without the rapport established between the occupier and the editor of BSJ at the time the
initial descriptive articles were prepared, the success rate would have been much lower.
• The BSJ’s high level of insight, commitment, support and objectivity was unusual.

Most surveys, however, will probably be done on behalf of the occupier or the building team.
Here priorities will be different from those in Probe. For example, in a published article the
world needs to be told about factors for success (e.g. perhaps in the procurement system) so that
others can emulate them; and about the problems (e.g. unsuitable window opening mechanisms) which
need to be avoided. Occupiers will often take successes for granted (though benchmarked
comparisons can be helpful); and can be well aware of the difficulties: they may well be more
interested in priorities, low-cost solutions, and better coping strategies.

An owner or occupier might want a survey for a number of possible reasons, for
example:
• Ongoing programme of monitoring and benchmarking either by the building owners or the
design team, as part of a culture of feedback, service and continuous improvement.
• Management desire to improve environmental and energy performance.
• Response to general or specific occupant dissatisfaction or complaints, especially
concerning basic comfort, health or safety issues.
• Worries about performance in use from a facility management perspective, perhaps
including space efficiency, functional performance and occupant satisfaction.
• Concerns about the building not meeting its original brief or performance specification.
• Assessment of the existing situation, together with performance indicators, perceptions and
priorities in advance of alteration, refurbishment or new construction.

A culture of feedback
At a more general level, the Probe experience may help to bring routine feedback into the
briefing, design, construction, completion, operation, use and alteration of buildings. In spite
of clear needs, feedback is not a standard part of the design service. For example, a feedback
Stage M is included in the current version of the Royal Institute of British Architects’ Plan of
Work [RIBA 1973], but it was omitted from the Standard Form of Agreement (SFA) for the
Appointment of an Architect published in 1992, apparently due to its potential impact on
professional indemnity insurance. However, the RIBA Guide to the SFA
[RIBA 1992] includes the following statement:

“Feedback, the last stage (M) in the RIBA’s model Plan of Work, is an
important but often neglected element of a commission. Much can be gained
from revisiting completed projects, and the client may also benefit from your
findings now that the building is in use. Even if you are not appointed for
stage M services, it can be valuable to keep in touch with the project.

It is all too easy to get dragged into providing services as a matter of good
will after the project has been completed. Remember that Architects are not
obliged to provide their professional services free.”

Probe has helped to expose the market to the idea of such feedback. The techniques of
collecting and presenting information have been streamlined and energy, occupant satisfaction
and air leakage performance and benchmarks have become more familiar. Professionals are
now more likely to admit and openly discuss shortcomings in systems and in-use performance,
where they and the industry need to improve. They are also better able to make their clients
aware of issues which need more careful attention than may currently be regarded as necessary
or affordable.

Routine PoEs for new projects


If the building team and their clients set out to undertake post-occupancy benchmarking as a
normal part of the follow-through on a project, procedures could be simplified and time saved.
For example:
• Occupant surveys could become routine QA measures. In this case, industry-standard
questions and published benchmarks would be particularly useful.
• The energy survey could be much quicker, since the designers would already have much
of the relevant input information. Indeed, if the same EARM methods were used to
summarise and benchmark energy consumption estimates at the design stage and to
review them afterwards, feedback of operational experience into design estimation could also
be greatly improved.
• Systems could also be established to permit this and other feedback data (attributed or
unattributed) to be widely available, so assisting rapid continuous improvement of
building performance.

Buildings often have problems in the first year of operation. In other industries, these might be
regarded as routine prototype testing. However, the legal and contractual system hangs on to
the quaint idea that buildings are functionally complete when physically complete, and this
actually gets in the way of after-sales service. Probe indicates that “sea trials” should be
seriously considered as a means of smoothing all but the simplest and most standardised of
buildings into occupancy, and making sure that problems (which may affect not only the
building but also its management) are rapidly identified and professionally dealt with. This
should be built into briefs, contracts and terms of appointment. It could include, perhaps, a
series of regular meetings, reviews of performance and energy consumption, and a contingency
budget for swift and effective remedial action. A building team may also be particularly
interested in the success (or otherwise) of some very specific aspects of their design.

PoEs for occupiers


Occupiers (and others) may also wish to have PoEs undertaken, from a single-day site visit
through to a comprehensive survey, with the detail achievable from such input dependent on
the size and complexity of the buildings being studied. The particular reason will determine its
specific focus, but some of the procedures could be similar to the Probe approach. Some
parallels are outlined in Table 2.

Conclusions
Probe has used the UK Government’s Partners in Innovation funding to streamline post-
occupancy survey procedures, and to raise awareness of their results and implications through
regular press publication over a four-year period. This has helped to open up discussion.
Greater awareness and more powerful and cost-effective techniques may help to make post-
occupancy evaluation more routine, and a powerful aid to continuous improvement of both
performance and the associated design and management benchmarks.

The UK government has recently embarked on a project, the Egan Initiative, to improve
buildings and their performance. This has received unexpectedly strong support from the
industry and its clients. To date, Egan has focused on the more efficient and cost-effective
supply of buildings and on avoiding defects. This must soon extend to better in-use performance
in terms of occupant health, satisfaction and productivity, business and economic performance,
and sustainability.

Probe has permitted robust and insightful results to be obtained and publicised with a
limited budget. Some of the findings and their implications are discussed in the subsequent
papers. It has drawn attention to some of the factors for success, which are often related to the
processes of procuring, occupying and managing a building and the roles of all the parties
involved. The feedback has provided advance warning of some of the problems associated
with new techniques and technologies and identified issues which need more attention. It has
also highlighted some persistent, chronic, minor problems (e.g. with the interfaces to control
systems) which receive relatively little attention but constantly frustrate the achievement of
potential levels of performance. By recognising and eliminating these in a culture of high
aspirations, routine feedback and continuous improvement, rapid improvements in all-round
performance will become possible.
Figure 1 Comfort index showing Probe buildings and BUS dataset

[Scatter plot: BUS Comfort Index (y-axis, 3.2 to 5.2) plotted against percentile position in the
BUS dataset (x-axis, 0 to 100) for the Probe buildings. The index is based on seven variables,
using the scale 1=Uncomfortable; 7=Comfortable.]

Comfort index scores: 1 FRY 5.12; 2 TAN 4.73; 3 C&G 4.66; 4 RMC 4.59; 5 MBO 4.44;
6 WMC 4.36; 7 HFS 4.22; 8 CAB 4.20; 9 POR 4.17; 10 CRS 4.08; 11 ALD 4.00;
12 Benchmark 3.96; 13 DMQ 3.81; 14 CAF 3.64; 15 APU 3.51; 16 C&W 3.27.

© 1999 BUILDING USE STUDIES, THE BUILDER GROUP, HGA, ESD, WILLIAM BORDASS ASSOCIATES
Figure 2 Benchmark example for glare from sun and sky

[Chart: building mean scores for "Glare from sun and sky" on a seven-point scale (1=None;
7=Too much), each plotted with its 95% confidence interval against the benchmark mean and
its upper and lower 95% confidence limits.]

Key to buildings: 1 CRS; 2 ALD; 3 WMC; 4 HFS; 5 TAN; 6 C&G; 7 FRY; 8 RMC; 9 DMQ;
10 POR; 11 MBO; 12 CAF; 13 CAB; 14 C&W; 15 APU.

Notes
Upper and lower ninety-five per cent confidence intervals are shown for (1) individual building
means and (2) the Building Use Studies dataset benchmark for 50 buildings. A building mean is
significantly different from the benchmark mean if the mean value falls outside the interval range
for the benchmark mean. A building mean is significantly different from another building if its
mean value falls outside the interval range for that building.

© 1999 BUILDING USE STUDIES, THE BUILDER GROUP, HGA, ESD, WILLIAM BORDASS ASSOCIATES
Figure 3 Benchmark example for glare from sun and sky (top) and overall comfort (bottom)
showing single study building and benchmarks

[Chart, top: Glare from sun and sky on a seven-point scale (1=No glare; 7=Too much glare).
Study building mean 4.14 (95% interval 3.53 to 4.75); benchmark mean 3.87 (95% interval
3.46 to 4.29). Probability that the study building is not different from the benchmark: 0.3835.
Probability that the study building mean is not different from the scale mid-point: 0.6545.]

[Chart, bottom: Overall comfort on a seven-point scale (1=Unsatisfactory; 7=Satisfactory).
Study building mean 4.56 (95% interval 4.09 to 5.02); benchmark mean 4.22 (95% interval
4.08 to 4.37).]

© 1999 BUILDING USE STUDIES, THE BUILDER GROUP, HGA, ESD, WILLIAM BORDASS ASSOCIATES

FIGURE 4:
THE OFFICE ASSESSMENT PROCEDURE

SOURCE: CIBSE TM22, figure 1, page 2



FIGURE 5: FLOWCHART OF THE PROBE SURVEY AND REPORTING PROCESS

STAGE 1. Initial contact by BSJ; preliminary agreement to survey.
STAGE 2. Contact by survey team; review preliminary information; issue PVQ (pre-visit
questionnaire); initiate energy analysis.
STAGE 3. First site visit; complete details of PVQ; walk-round survey; check on-site records;
confirm energy data availability; seek approval for the occupant survey, pressure test,
metering etc.
STAGE 4. Initial analysis; review all information; draft descriptive report; do preliminary
calculations; identify outstanding items; prepare checklist for second visit. Additional
information requested from occupier, contractors and utilities.
STAGE 5. Second site visit; confirm messages and details.
STAGE 6. BUS occupant survey; questionnaires and interviews.
STAGE 7. EARM™ energy analysis plus benchmark comparison.
STAGE 8. Pressure test by BRE or BSRIA.
STAGE 9. Probe final report; analysis and key messages.
STAGE 10. Article for publication, including BSJ graphics, with comments from the design
team and the building occupier and final comments from the Probe team; published article in
Building Services - the CIBSE Journal.

MAIN ITEMS STUDIED: procurement route; design and construction; initial occupancy;
occupant satisfaction; management perceptions; energy and water consumption; operation and
management; maintenance and reliability; controls and controllability; design intentions;
alterations made; benchmark comparisons; strengths and weaknesses; key messages.

OUTCOMES: reference data on achieved performance for benchmarking etc.; improved
industry practice and building performance; agenda items for clients, occupiers, professionals,
research and government.

TABLE 1: THE BUILDINGS INVESTIGATED IN PROBE

PROBE 1 Buildings Investigated

Sequence  Full name  Location  Site  Short name  3-letter  Type  Gp  HVAC  Article  No
1 Tanfield House Edinburgh IC Tanfield TAN Large administrative centre O AC/(MM) Sep-95 1
2 1 Aldermanbury Square London CC Aldermanbury ALD UK head office (speculative) O AC Dec-95 2
3 Cheltenham & Gloucester Gloucester BP C&G C&G Large head office O AC Feb-96 3
4 de Montfort Queens Building Leicester IC de Montfort DMQ University teaching E ANV Apr-96 4
5 Cable & Wireless Coventry BP C&W C&W Company training college M ANV/NV Jun-96 5
6 Woodhouse Medical Centre Sheffield IC Woodhouse WMC Medical surgeries M NV/(MM) Aug-96 6
7 HFS Gardner House Harrogate BP HFS HFS Principal office O AC Oct-96 7
8 APU Queens Building Chelmsford IC APU APU Learning Resource Centre E ANV Dec-96 8

PROBE 2 Buildings Investigated

Sequence  Full name  Location  Site  Short name  3-letter  Type  Gp  HVAC  Article  No
9 John Cabot CTC Bristol IC Cabot CAB Secondary education E NV/ANV Oct-97 11
10 Rotherham Magistrates Courts Rotherham IC RMC RMC Courtrooms and offices M MM Dec-97 12
11 Charities Aid Foundation Kent BP CAF CAF Principal office (pre-let) O MM Feb-98 13
12 Elizabeth Fry Building Norwich UC Elizabeth Fry FRY University teaching E MM Apr-98 14
13 Marston Books Office Abingdon BP MB Office MBO Principal office (pre-let) O NV/(ANV) Aug-98 16
14 Marston Books Warehouse Abingdon BP MB Warehouse MBW Warehouse (pre-let) M NV Aug-98 16
15 Co-operative Retail Services Rochdale BP CRS CRS Large head office O AC/(MM) Oct-98 17
16 The Portland Building Portsmouth IC Portland POR University teaching E ANV/MM Jan-99 18
Site: BP=Business Park or similar; CC=City Centre; IC=Inner City; UC=University Campus © THE PROBE TEAM 1999
Group: E= Educational; M=Miscellaneous; O=Office
HVAC: AC=Air Conditioned; NV=Naturally ventilated; ANV=Advanced NV; MM=Mixed Mode (bracketed if minor influence)

TABLE 2: RELEVANCE OF PROBE TECHNIQUES TO A POST-OCCUPANCY
EVALUATION FOR A BUILDING OWNER OR OCCUPIER
Step 1: Seek and obtain access for study. Hosts may have difficulty dealing with unwelcome
findings. They may also wish to include space utilisation, cost and some aspects of aesthetic
performance. These can significantly increase the difficulty of carrying out the project
successfully.

Step 2: PVQ. A detailed POE may need a preliminary visit to collect the information sought
by the PVQ, particularly for a large or complex building. For a minimal POE, it will be
essential to receive a completed PVQ with energy data and floor plans prior to the sole site
visit.

Step 3: First site visit. As Probe, but if the building has not been well written-up, more work
may well be required to confirm its detailed nature, its services and how it is operated. For
minimal POEs, this may be the only site visit, so it is crucial for all required host staff to be
available to the POE team. Comprehensive checklists will be required to ensure that essential
minimum information is not overlooked.

Step 4: Draft report. Occupants may be looking for immediate advice on how they stand in
relation to peers and benchmarks; whether they have specific problems or benefits; and what
they have scope to do.

Step 5: Second site visit. As Probe, but with more emphasis on looking for measures to be
recommended, rather than compliance with design intent.

Step 6: Occupant survey. Sometimes perceived as an optional extra, but in practice this may
often be the driver. Extra questions may be needed on workplace performance (e.g. cleaning,
fitness for purpose), costs-in-use and space utilisation. Occupant surveys can try to cover too
much, often leading to bloated questionnaires, low response rates and lengthy data analysis
times. It may be difficult to interpret the results of questions for which there are no available
benchmarks. Focus groups are a good supplement to a basic questionnaire, especially if
retrospective work is being carried out on briefing, design process and design quality. The
constraints of Probe have encouraged a wide-ranging occupant survey to be undertaken,
analysed, reported and benchmarked much more quickly and cheaply than was possible
beforehand.

Step 7: Energy survey. The extent of EARM analysis will range from, for minimal POEs,
Stage 1 with perhaps some initial tree diagram analysis for typical systems (for example office
lighting and main air conditioning fans), to a full Stage 3 analysis for a comprehensive POE.

Step 8: Pressure test. Optional extra. Few occupiers would probably want to pay for a routine
pressure test, but if an airtightness problem was identified (through complaints or high fuel
consumption) then it would become a higher priority. However, much air leakage can be
readily identified using hand-held smoke pencils.

Step 9: Report. The length and detail of the report will depend on the extent of the survey. It
should include benchmark comparisons and then move to site-specific measures that can be
undertaken, typically split into no, low and medium capital cost categories, and with
approximate estimates of the costs and benefits of each measure. Detailed specification of
measures would tend to be a separate exercise following decisions by the client.

Step 10: Probe article. Generally not applicable, though public domain reporting (attributed or
not) in relation to benchmarks could be very valuable, provided the context was clearly
defined.
