Reservoir Characterization



Using Expert Knowledge, Data and Statistics


Prepared with assistance from:
Peter Corvi, Head of Effective Properties
Kes Heffer, Head of Reservoir Description
Peter King, Senior Reservoir Engineer,
Effective Properties
Stephen Tyson, Computing Consultant
Georges Verly, Senior Geostatistician,
Effective Properties
BP Research
Sunbury-on-Thames, England
Christine Ehlig-Economides, Reservoir
Dynamics Program Leader
Isabelle Le Nir, Geologist
Shuki Ronen, Geophysics Integration
Program Leader
Phil Schultz, Department Head
Etudes et Productions Schlumberger
Reservoir Characterization Department
Clamart, France
Patrick Corbett, Research Associate
Jonathan Lewis, Conoco Lecturer in Geology
Gillian Pickup, Research Associate
Phillip Ringrose, Senior Research Associate
Heriot-Watt University
Department of Petroleum Engineering
Edinburgh, Scotland
Dominique Guérillot, Sr. Research Engineer,
Reservoir Engineering Division
Lucien Montadert, Director of Exploration
and Production
Christian Ravenne, Principal Research Associate,
Geology and Chemistry Division
Institut Français du Pétrole
Rueil-Malmaison, France
Helge Haldorsen, Assistant Director, International
Field Developments
Norsk Hydro A.S.
Oslo, Norway
Thomas Hewett, Professor of Petroleum
Engineering
Stanford University
California, USA

January 1992

Accurately simulating field performance requires knowing properties like porosity and permeability throughout the reservoir. Yet wells providing the majority of data may occupy only a billionth of the total reservoir volume. Transforming this paucity of data into a geologic model for simulating fluid flow remains an industry priority.
The simplest representation of a reservoir is a homogeneous tank with uniform properties throughout. Characterizing such a simplified model requires merely enough data to define its volume and basic petrophysical properties. This information can then be used to help forecast reservoir behavior and compare possible production scenarios.

But efficient exploitation of reserves requires a more sophisticated approach. Reservoirs have been created by complex sedimentary and diagenetic processes, and modified by a history of tectonic change. Rather than resembling simple tanks, reservoirs are heterogeneous structures at every scale. Accurate characterization is therefore vital for oil company economic planning (next page, top).

In the past, geologists have used expert knowledge to interpolate between wells and produce basic reservoir models. During the last ten years, however, this effort has been transformed by statistical modeling, which offers insight into the effects of heterogeneity. This article details how reservoir descriptions are developed in three stages (page 27):

- Defining the reservoir's large-scale structure using deterministic data
- Defining the small-scale structure using statistical techniques (geostatistics)
- Rescaling the detailed geologic model to be suitable input for a fluid-flow simulator.

Cray is a mark of Cray Research Inc. MicroVAX is a mark of Digital Equipment Corp.
1. Weber KJ: "How Heterogeneity Affects Oil Recovery," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 487-544.

Defining the Large-Scale Structure

The first stage aims to construct as detailed a geologic description of the reservoir as measurements will allow. The available information includes a structural interpretation from seismic data, static data from cores and logs, and dynamic production data from testing. To these, the geologist adds expert knowledge of geologic structures, obtained from field experience and outcrop studies (see "Gathering Outcrop Data," page 28).

The first step is to recognize depositional units and then, if possible, correlate them between wells. It helps to know the general shape and order of deposition of the units, and ideally the relationships between width, thickness and length for each unit.1 Well-to-well correlation is made using traditional hard-copy maps and logs or, today, interactive computer workstations. Workstations allowing simultaneous visualization of seismic, log and other data enable easier recognition of subtle correlations.

In most cases, this large-scale characterization results in a conventional layer-cake model: a stack of subhomogeneous layers. There is no doubt that many reservoirs should be characterized by more complex arrangements of depositional units, such as the jigsaw and labyrinth models (next page, bottom).2 The ability to use these more complex models routinely remains an industry goal (see "Trends in Reservoir Management," page 8).
2. Weber KJ and van Geuns LC: "Framework For Constructing Clastic Reservoir Simulation Models," Journal of Petroleum Technology 42 (October 1990): 1248-1253, 1296-1297.


Defining the Small-Scale Structure Using Geostatistical Modeling

[Figure: hypothetical well data for heterogeneity cases A and B, lithology columns with a shale barrier, and a graph of predicted oil recovery, pore volume (PV) recovered versus PV injected.]

How heterogeneity affects productivity. In this example developed at BP Research, Sunbury-on-Thames, England, two different models have been interpolated between a pair of hypothetical wells. The lithologies are identical: only their spatial correlation has been altered. The graph shows how the pore volume recovered versus pore volume injected varies for the two models. After one pore volume has been injected, the difference in recovery between the two models is about 0.1 of a pore volume.

[Figure labels: transgressive deposit, barrier bar, barrier foot, tidal channel, crevasse splay, distributary channel fill.]

Once the shape of the reservoir and its large-scale structure have been described, the next stage focuses on defining heterogeneity within each depositional unit. This requires more detailed interwell measurements than are currently available. Borehole seismic data typically cannot span wells. Although surface and cross-well seismic data do reach everywhere, there is insufficient knowledge relating seismic data to physical rock properties, and exploration seismic data have insufficient resolution. In addition, well testing data often lack directional information. Therefore nondeterministic, or geostatistical, methods are required.3

Large-scale reservoir models typically comprise a few thousand grid blocks, each measuring about 100 m square and more than 1 m thick [about 330 ft square by 3 ft]. But models that take into account small-scale heterogeneity use smaller grid blocks; 1 million or more are often needed. The sheer quantity of input data required to fill so many blocks also favors geostatistics.

Geostatistical techniques designed to provide missing data can be classified in two ways: by the allowable variation of the property at a given point and by the way data are organized. There are two classes of property variation, continuous and discrete. Continuous models are suited to properties like permeability, porosity, residual saturation and seismic velocity, which can take any value. Discrete models are suited to geologic properties, like lithology, that can be represented by one of a few possibilities.4

Data organization has developed along two paths, depending on how the models are built up. One is grid based: all the properties are represented as numbers on a grid, which is then used for fluid-flow simulation. The second is object based: reservoir features such as shales or sands are generated in space and a grid then superimposed on them.
Grid-Based Modeling

Defining reservoir type. Current large-scale characterization tends to result in a simple layer-cake model. However, efforts are being made to use more complex models like the jigsaw and labyrinth models shown.


One of the key geostatistical tools used in grid-based modeling is kriging, named after a pioneer of rock-property estimation, D. G. Krige, who made a series of empirical studies in the South African goldfields.5 Kriging is designed to honor measured data and produce spatially correlated heterogeneity. The heart of the technique is a two-point statistical function called a variogram that describes the increasing difference (or decreasing correlation) between sample values as separation between them increases.
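As an illustration of this two-point statistic, here is a minimal sketch of an experimental semivariogram computed from regularly spaced samples. The synthetic "porosity" profile and all parameters are invented for the example; they are not taken from the article.

```python
import numpy as np

def experimental_variogram(values, max_lag):
    """Semivariogram gamma(h) = 0.5 * mean((z(x+h) - z(x))^2) for regularly
    spaced samples along a line; a teaching sketch of the two-point
    statistic, not any specific commercial implementation."""
    gammas = []
    for h in range(1, max_lag + 1):
        diffs = values[h:] - values[:-h]          # all pairs separated by lag h
        gammas.append(0.5 * np.mean(diffs ** 2))  # half the mean squared difference
    return np.array(gammas)

# Synthetic, spatially correlated "porosity" profile: a smooth trend plus noise.
rng = np.random.default_rng(0)
x = np.linspace(0.0, 10.0, 200)
porosity = 0.20 + 0.03 * np.sin(x) + 0.005 * rng.standard_normal(200)

gamma = experimental_variogram(porosity, max_lag=50)
# gamma rises with lag for a spatially correlated property: gamma[0] << gamma[30]
```

The rising limb of gamma(h), and the lag at which it levels off (the range), are what the geostatistician reads off to quantify spatial correlation.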

Oilfield Review

Kriging minimizes the error between the interpolated value and the actual (but unknown) value of the property.6

For a given property, a variogram is constructed from pairs of data generally measured in wells. In the oilfield, criticism of kriging centers on the low density of well data compared to mining. To overcome this problem, variograms can alternatively be constructed from data measured on outcrops or in mature fields where better sampling is available.

Although variograms are commonly used to infer the spatial continuity of a single variable, the same approach can be used to study the cross-continuity of several different variables; for example, the porosity at one location can be compared to seismic transit time. Once constructed, this cross-correlation can be used in a multivariate regression known as cokriging. In this way, a fieldwide map of porosity can be computed using not only porosity data, but also the more abundant seismic data.7

Kriging and cokriging deal with quantitative, or hard, data. A third method, called soft kriging, combines expert information, also called soft data, with the quantitative data. The expert data are encoded in the form of inequalities or probability distributions. For example, in mapping a gas/oil contact (GOC), if the contact is not reached in a certain well, soft kriging uses the inequality: GOC is greater than well total depth. At any given point where there is no well, an expert may define the probability of finding the GOC within a certain depth interval. Soft kriging will use this probability.8
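The kriging system itself can be sketched compactly. The exponential variogram model, its sill and range, and the well data below are all made-up illustrations; production geostatistics packages add search neighborhoods, anisotropy and variogram-model fitting.

```python
import numpy as np

def gamma_exp(h, sill=1.0, a=5.0):
    # Exponential variogram model with an effective range a (illustrative values).
    return sill * (1.0 - np.exp(-3.0 * np.abs(h) / a))

def ordinary_krige(xs, zs, x0):
    """Ordinary kriging of one point from scattered 1D data: solve for the
    weights that minimize estimation variance subject to sum(w) = 1.
    A textbook sketch, not a production routine."""
    n = len(xs)
    A = np.zeros((n + 1, n + 1))
    A[:n, :n] = gamma_exp(xs[:, None] - xs[None, :])  # variogram between data
    A[:n, n] = 1.0                                    # Lagrange multiplier column
    A[n, :n] = 1.0                                    # unbiasedness constraint
    b = np.append(gamma_exp(xs - x0), 1.0)            # variogram data-to-target
    w = np.linalg.solve(A, b)[:n]
    return float(w @ zs), w

xs = np.array([0.0, 2.0, 7.0, 10.0])      # hypothetical well locations
zs = np.array([0.21, 0.19, 0.25, 0.23])   # hypothetical porosity measurements
est, w = ordinary_krige(xs, zs, 4.0)      # interpolate between the wells
```

Two properties worth noting: the weights sum to one, and kriging is exact, returning the measured value when the target coincides with a well.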



3. Haldorsen HH and Damsleth E: "Stochastic Modeling," Journal of Petroleum Technology 42 (April 1990): 404-412. Haldorsen HH and Damsleth E: "Challenges in Reservoir Characterization Research," presented at Advances in Reservoir Technology, Characterization, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
4. Journel AG and Alabert GF: "New Method For Reservoir Mapping," Journal of Petroleum Technology 42 (February 1990): 212-218.
5. Krige DG: "A Statistical Analysis of Some of the Borehole Values of the Orange Free State Goldfield," Journal of the Chemical, Metallurgical and Mining Society of South Africa 53 (1952): 47-70.
6. Journel AG: "Geostatistics for Reservoir Characterization," paper SPE 20750, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
7. Doyen PM: "Porosity From Seismic Data: A Geostatistical Approach," Geophysics 53 (October 1988): 1263-1275.
8. Kostov C and Journel AG: "Coding and Extrapolating Expert Information for Reservoir Description," in Lake LW and Carroll HB (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 249-264.

[Figure: building a reservoir model in three stages with measured data, expert knowledge and statistics. Inputs include core plugs, whole core, well logs, borehole geophysics, outcrop studies, surface seismics, well testing and the geologist's expert knowledge. Stage 1: defining large-scale structure. Stage 2: defining small-scale structure (1 million grid blocks). Stage 3: scaling up to a fluid-flow simulation model (30,000 grid blocks).]


Gathering Outcrop Data

Every reservoir is unique. Yet recurring depositional conditions create families of reservoirs
that are broadly similar. If reservoir rock outcrops
at the surface, it presents a golden opportunity to
gather data which can be used to help characterize related subsurface formations.1
Outcrop studies allow sampling at scales that
match interwell spacing. Small-scale permeability can also be measured and compared with
depositional characteristics. Then the statistical
and depositional information measured at the
outcrop can be incorporated into the stochastic
modeling of the subsurface analog.2
Generally, there are two kinds of outcrop study.3 Bed or bed-set scale genetic models (GEMs) concentrate on reservoir heterogeneity as a function of the sedimentary process. To date, most outcrop studies have been of this type. The second type, a field analog model (FAM), looks at heterogeneity caused not only by depositional processes but also by diagenesis. FAMs present two difficulties. First, it is hard, given the paucity of reservoir data, to decide which diagenetic processes are relevant. Second, finding outcrops that have undergone specific diagenetic processes can be difficult.

[Figure: Photomosaic of the Tensleep Sandstone in the Big Horn basin, Wyoming, USA. This Pennsylvanian-age sandstone is being studied because of its similarity to the Rotliegendes Sandstone, an important gas reservoir unit found in the Southern Basin of the North Sea. Like most outcrops, it is irregularly shaped. Therefore, to obtain the mosaic with constant scale and no distortion, overlapping photographs were taken from points in a plane parallel to a major depositional surface on the outcrop. With a helicopter positioned in this plane, laser range finders were used to maintain a fixed distance from the outcrop to ensure constant scale. This was double-checked by positioning scale bars at regular intervals on the outcrop. The mosaic was prepared during studies by Jon Lewis and Kjell Rosvoll at Imperial College, University of London, England, and Heriot-Watt University, Edinburgh, Scotland, with the support of Conoco (UK) Ltd.]

Although the study of outcrops is a traditional skill learned by most geologists, in the past they have concentrated on such goals as describing the sedimentological environment and the geometries of the major depositional units. For reservoir characterization, a much more detailed and quantitative picture of the heterogeneity within depositional sequences is required.

Therefore, in addition to the geologist's traditional mapping skills, new analytical techniques have been devised. Analyzing photographs of the outcrop helps clarify the lithofacies distribution.4 Clearly, more than one photograph is needed, and this technique can succeed only if all the photographs are at the same scale and taken at the same distance from the rock. A helicopter-mounted camera with a laser range finder is one way of achieving this. Successive photographs are taken with a 30 to 40% overlap and then combined to construct a photomosaic of the whole outcrop (above). The mosaic can be digitized to quantify lithofacies patterns. It can also be used to plan and execute small-scale petrophysical measurements on the outcrop.

Gamma ray logging of outcrops is used to help describe the composition of the lithofacies. A standard truck-mounted sonde can be lowered down the face of an outcrop and gamma ray measurements continuously recorded. Although washouts are known to affect borehole gamma ray logging, tests show that the sonde can be as much as 0.6 m [2 ft] away from the rock before the reading is compromised.5

An alternative technique employs a lightweight, portable gamma ray spectrometer to measure either total radiation or individual radioelement concentrations. Total radiation measurements are typically made at 0.65-m [2.1-ft] intervals. Selected intervals may be logged in greater detail if required. Because the tool measures an area of the rock face with a diameter of about 0.3 m [1 ft], readings taken for thin beds will be slightly influenced by adjacent strata.

One of the principal pitfalls in reservoir characterization stems from the use of unrepresentative permeability data measured from cores and core plugs taken from only a few wells. Densely sampled outcrop permeability data are of key importance. These can be gathered by performing laboratory-based flow tests on samples gathered in the field. However, a comparatively recent development has seen the introduction of portable electronic minipermeameters, also known as probe permeameters (next page, below).6 These evaluate permeability, from 0.5 millidarcies (md) to 15 darcies, by injecting nitrogen into a prepared location on the outcrop. Thousands of nondestructive measurements can be made in situ and the results compared to traditional laboratory tests for quality control. This permits examination of permeability changes on a very small scale, on the order of millimeters. Through such measurements, permeability has been shown to vary dramatically within a depositional unit.7

A typical study, carried out by the University of Texas, Bureau of Economic Geology, Austin, Texas, USA, centers on a Ferron Sandstone outcrop in central Utah, USA. Ferron is a wave-modified, deltaic sandstone with permeability distribution strongly related to lithofacies type and grain size. It is believed to be an analog for Gulf Coast deltaic reservoirs, which account for 64% of Texas Gulf Coast gas production.8

First, the depositional framework of the outcrop was established using color photomosaics to map the distribution and interrelations of the sandstone components. This involves determining lithofacies architecture, delineating permeable zones and their continuity, identifying flow barriers and baffles and establishing permeability trends.

On the basis of this framework, more than 4000 minipermeameter readings were made, all on


vertical exposures of rock to minimize the effects of weathering. Two sampling strategies were employed. First, more than 100 vertical strips were mapped every 30 to 60 m [100 to 200 ft] on three channel complexes. Permeability measurements were taken at 0.3-m spacing on each strip. Second, two sampling grids were constructed to examine the small-scale spatial variability of permeability within lithofacies. One was 12-m [40-ft] square with 0.6-m subdivisions. The second, inside the first, measured 1.8-m [6-ft] square with 0.1-m [0.3-ft] subdivisions.

The study successfully identified patterns of heterogeneity. Delta-front sandstones were found to be vertically heterogeneous but laterally continuous over a typical well spacing. However, the sand belts of the distributary system, composed of amalgamated, lateral-accretion point-bar sandstones, were found to have both lateral and vertical heterogeneity.

Variogram analysis confirmed that the heterogeneity of the permeability was structured in a way that depended on the rock type sampled and the sampling scale. For instance, at sampling intervals of 0.6 m, the range over which permeability shows correlation was 5.5 m [18 ft] for trough-cross-bedded sandstone and 1.2 m [4 ft] for contorted beds. Planar cross-bedded strata exhibited a range of 3.6 m [12 ft] for a 0.6-m sampling interval and 0.6 m for the 0.1-m interval. According to the University of Texas researchers, this suggests that the permeability structure may be fractal.

This is just one example of work underway. Over the past five years, there has been an explosion of outcrop studies, with a hundred or more literature references (above, right). And many projects are afoot to bring these studies together and create outcrop data bases. Key information needed to condition geostatistical modeling will become readily available.

[Figure: Going underground. Outcrops are not always on the surface. A three-dimensional (3D) study of permeability variation was undertaken in a subsurface silica sand mine in Morvern, Scotland. Some 8000 permeability measurements have been collected over a 1-km [0.6-mile] square section of the mine. The work was carried out by Jon Lewis and Ben Lowden at Imperial College, University of London, England, and supported by Den Norske Stats Oljeselskap A.S. (Statoil).]

[Figure: minipermeameter schematic, showing the nitrogen gas supply, four-way valves, flow units, injection pressure measurement system, pressure transducer (0 to 20 psi) and gas injection probe applied to the rock.]

Minipermeameter in detail. The probe is applied to the cleaned surface of the rock with enough pressure to ensure a good seal. Nitrogen is injected into the rock and the pressure and flow rate measured. These values, along with the area of the probe, are then used to calculate permeability. To measure the range of permeabilities found at the outcrop, four different flow elements are required.
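The calculation described in the caption can be sketched as a Darcy-type formula. The geometric-factor form and the value G0 = 5.1 (a commonly quoted half-space figure), as well as every number below, are assumptions for illustration only; real instruments are calibrated and also correct for gas compressibility.

```python
def probe_permeability_darcy(q_cc_per_s, visc_cp, dp_atm, r0_cm, g0=5.1):
    """Illustrative Darcy-type probe-permeameter calculation:
    k = q * mu / (G0 * r0 * dp), with flow rate q in cm^3/s, gas
    viscosity in cP, pressure drop in atm and probe tip inner radius
    in cm, giving k in darcies. G0 is a dimensionless geometric
    factor; 5.1 is an assumed half-space value, not a calibration."""
    return q_cc_per_s * visc_cp / (g0 * r0_cm * dp_atm)

# Nitrogen (viscosity roughly 0.0175 cP) injected at 1 cm^3/s through a
# 0.3-cm radius tip with a 0.5-atm pressure drop:
k_darcy = probe_permeability_darcy(1.0, 0.0175, 0.5, 0.3)
k_md = k_darcy * 1000.0  # convert darcies to millidarcies
```

With these invented inputs the result lands comfortably inside the 0.5-md-to-15-darcy range the text quotes for the instrument.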

1. Weber KJ and van Geuns LC: "Framework for Constructing Clastic Reservoir Simulation Models," Journal of Petroleum Technology 42 (October 1990): 1248-1253, 1296-1297.
2. North CP: "Inter-Well Geological Modelling of Continental Sediments: Lessons From the Outcrop," presented at Advances in Reservoir Technology, Characterisation, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
3. Lewis JJM: "A Methodology for the Development of Spatial Reservoir Parameter Databases at Outcrop," presented at Minipermeametry in Reservoir Studies, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, June 27, 1991.
4. A lithofacies is a mappable subdivision of a stratigraphic unit distinguished by its lithology.
5. Jordan DW, Slatt RM, D'Agostino A and Gillespie RH: "Outcrop Gamma Ray Logging: Truck-Mounted and Hand-Held Scintillometer Methods Are Useful for Exploration, Development, and Training Purposes," paper SPE 22747, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.
6. Lewis JJM: "Outcrop-Derived Quantitative Models of Permeability Heterogeneity for Genetically Different Sand Bodies," paper SPE 18153, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988. Jensen JL and Corbett PWM: "A Stochastic Model for Comparing Probe Permeameter and Core Plug Measurements," paper 3RC-24, presented at the 3rd International Reservoir Characterization Technical Conference of the National Institute for Petroleum and Energy Research and the US Department of Energy, Tulsa, Oklahoma, USA, November 3-5, 1991.
7. Kittridge MG, Lake LW, Lucia FJ and Fogg GE: "Outcrop/Subsurface Comparisons of Heterogeneity in the San Andres Formation," SPE Formation Evaluation 5 (September 1990): 233-240.
8. Tyler N, Barton MD and Finley RJ: "Outcrop Characterization of Flow Unit and Seal Properties and Geometries, Ferron Sandstone, Utah," paper SPE 22670, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.


[Figure: measured data, actual data, kriged interpolation and two simulations, plotted as value versus distance.] Imitating reality's variability. The kriging technique yields a smooth interpolation, ignoring nature's true small-scale heterogeneity. By adding noise to a kriged interpolation, stochastic modeling produces more lifelike realizations.

These kriging methods yield smooth interpolations, but do not describe small-scale heterogeneity. More realistic pictures emerge from a process called stochastic modeling, which superimposes correlated noise onto smooth interpolations. A probability distribution determines how this noise is generated. A number of pictures, or realizations, will usually be created, each with different noise sampled from the same distribution (top, right). By analyzing many realizations, the extent to which geologic uncertainty affects reservoir performance can be studied (middle, right).

An example is sequential Gaussian simulation (SGS). The data are constructed grid block by grid block. At the first selected block, the smooth interpolated value is calculated by kriging using the available measured data. The interpolated value and its variance, also calculated by kriging, define a Gaussian distribution function from which a noise value is randomly drawn and added to the interpolated value. For the next point, selected at random, the process is repeated, using as a base this newly derived data point together with the measured data. As the grid blocks are filled, all previously calculated values contribute to computing the next in the sequence.

The technique depends on the reservoir property being normally distributed, so a property like permeability, which has a skewed distribution, requires transformation to normality. The reverse transformation has to be performed once all the grid blocks are filled.
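The SGS recipe above can be sketched for a 1D grid. This is a teaching sketch under stated assumptions: simple kriging with zero mean, an exponential covariance model, and hypothetical conditioning values already transformed to normal scores; production codes add search neighborhoods and the normal-score back-transformation.

```python
import numpy as np

rng = np.random.default_rng(42)

def cov(h, sill=1.0, a=10.0):
    # Exponential covariance model (an assumption for this sketch).
    return sill * np.exp(-3.0 * np.abs(h) / a)

def sgs_1d(n, cond, sill=1.0):
    """Sequential Gaussian simulation on an n-node 1D grid: visit
    unsimulated nodes in random order, compute a kriged mean and variance
    from all previously known values (simple kriging, zero mean), then
    draw the node value from that Gaussian and add it to the data set."""
    known_x = [float(x) for x in cond]            # conditioning locations
    known_z = [float(z) for z in cond.values()]   # conditioning values
    field = np.full(n, np.nan)
    for x, z in cond.items():
        field[x] = z
    path = [i for i in rng.permutation(n) if np.isnan(field[i])]
    for i in path:
        xs, zs = np.array(known_x), np.array(known_z)
        C = cov(xs[:, None] - xs[None, :])        # covariance among known data
        c0 = cov(xs - i)                          # covariance data-to-node
        w = np.linalg.solve(C, c0)
        mean = float(w @ zs)                      # simple-kriging mean
        var = max(sill - float(w @ c0), 0.0)      # simple-kriging variance
        field[i] = rng.normal(mean, np.sqrt(var)) # draw noise around the mean
        known_x.append(float(i))
        known_z.append(field[i])                  # simulated value joins the data
    return field

cond = {0: 0.5, 25: -0.8, 49: 0.2}  # hypothetical well values (normal-score scale)
field = sgs_1d(50, cond)
```

Because each simulated value rejoins the conditioning set, every later node sees all previously calculated values, exactly as the text describes.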
An increasingly popular method for generating noise for stochastic modeling uses the concept of fractals, a statistical technique that produces remarkably realistic imitations of nature. Fractal objects exhibit similar variations at all scales of observation. Every attempt to divide fractal objects into smaller regions results in ever more similarly structured detail (right).9 This simplifies stochastic modeling. The variogram is defined from a single number, the fractal dimension, calculated from measured data in the reservoir or outcrops. And because fractals are self-similar, the variance of the noise need be determined only at a single scale. Fractal modeling has been used to predict the production performance of several large fields. Recent studies have also applied fractal models to investigate the performance of miscible gas floods.10

[Figure: two stochastic realizations, A and B, of reservoir permeability, plotted as depth (ft) versus distance (ft), showing how different noise added to the same kriged interpolation gives quite different realizations. In both the horizontal and vertical directions, realization A has relatively high continuity whereas realization B has relatively low continuity. Continuity is gauged by measuring the size of groups of contiguous blocks with permeability of 100 millidarcies (md) or more. After Fogg GE, Lucia FJ and Senger RK: "Stochastic Simulation of Interwell-Scale Heterogeneity for Improved Prediction of Sweep Efficiency in a Carbonate Reservoir," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 355-381.]

The Sierpinski Gasket. This self-similar fractal structure was devised by the Polish mathematician W. Sierpinski about 90 years ago. It is formed by dividing the largest triangle into smaller triangles with sides half as long as the original. In three of the resulting four triangles the process is repeated, and so on at progressively finer scales. At every scale the same patterns can be found. Random fractals, generated by a similar process but with added stochastic variations, produce remarkably realistic simulations of geologic variability.
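One way to see how a single number can control variability at every scale is random midpoint displacement, a standard construction for fractal profiles. The Hurst exponent H used here is illustrative; for a 1D trace it relates to the fractal dimension as D = 2 - H.

```python
import numpy as np

def midpoint_displacement(levels, hurst=0.8, seed=1):
    """Generate a 1D fractional-Brownian-motion-like profile by random
    midpoint displacement: halve every interval and perturb each new
    midpoint with Gaussian noise whose standard deviation shrinks by
    2**-H per level. One parameter (H) thus fixes the variability
    structure at all scales. Illustrative sketch only."""
    rng = np.random.default_rng(seed)
    pts = np.array([0.0, 0.0])   # fixed endpoints
    std = 1.0
    for _ in range(levels):
        mids = 0.5 * (pts[:-1] + pts[1:]) + rng.normal(0.0, std, len(pts) - 1)
        out = np.empty(2 * len(pts) - 1)
        out[0::2] = pts          # keep existing points
        out[1::2] = mids         # insert displaced midpoints
        pts = out
        std *= 2.0 ** (-hurst)   # self-similar scaling of the noise
    return pts

profile = midpoint_displacement(levels=8)  # 2**8 + 1 = 257 points
```

Smaller H gives rougher, less continuous profiles; larger H gives smoother ones, which is exactly the kind of knob the fractal-dimension measurement described in the text provides.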
Discrete data require a special interpolation technique called indicator kriging. This can be combined with stochastic modeling to form a process called sequential indicator simulation (SIS). Take a model where either sand or shale must be chosen; there is no middle ground. At the same time, a random element must be introduced to obtain multiple realizations. Consider building a model where sand is 1 and shale 0. Indicator kriging will assign every grid block some value between 0 and 1, giving an indication of the likely lithology. For example, a value of 0.7 indicates a 70% chance of sand. In the stochastic stage, this percentage is then used to weight the random choice of sand or shale for the grid block.11
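The stochastic stage just described reduces to a weighted coin flip per grid block (the sequential kriging of the indicator probabilities is omitted here, and the kriged probabilities below are invented for illustration):

```python
import numpy as np

def draw_lithology(p_sand, seed):
    """Weight the random choice of sand (1) or shale (0) by the
    indicator-kriged probability of sand in each grid block."""
    rng = np.random.default_rng(seed)
    return (rng.random(len(p_sand)) < p_sand).astype(int)

# Hypothetical indicator-kriged values: e.g. 0.7 means a 70% chance of sand.
p_sand = np.array([0.9, 0.7, 0.5, 0.2, 0.05])

# Multiple realizations, each an equally plausible sand/shale pattern.
realizations = [draw_lithology(p_sand, seed) for seed in range(1000)]
sand_fraction = np.mean(realizations, axis=0)
# Across many realizations, the sand frequency per block approaches p_sand.
```

This is why many realizations are generated: any one realization is a hard sand/shale map, but only the ensemble reproduces the kriged probabilities.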
Object-Based Modeling

In object-based modeling, a picture of the reservoir is built from generic units, for example, sand bodies or shale barriers, each unit having uniform internal properties. These models are built by starting with a uniform background matrix and adding units with a contrasting property: either a high-permeability background to which shale barriers are added, or a low-permeability background to which sand bodies are added.

When starting with a high-permeability background, the vertical distribution of shales may be inferred from cores and logs, particularly gamma ray logs. But unless the well spacing is extremely dense, nothing is revealed about the shales' lateral dimensions. Some shales correlate from well to well, but most do not, and their lateral extent must be generated statistically.

These stochastic shales are drawn at random from a size distribution and placed randomly until the precalculated shale density has been achieved. Shale density is estimated from the cumulative feet of shale measured in wells compared with the gross pay, and is then assumed to represent the reservoir volume under study. To ensure that the realizations honor the deterministic data, shales observed in wells are placed in the model first, their lateral extent still determined randomly. Subsequent, randomly placed shales that intersect a well are rejected (below).12

To generate these models, the key statistics specified by the geologist are shale length and width, together with some guidance as to their interdependence. Traditionally, the geologist describes shales qualitatively as wide, extensive or lenticular, terms too vague for building realistic stochastic models. Instead, ranges of length and width are required, along with gross pay thickness and average shale density. The depositional environment has great bearing on these values, and the primary sources of this information are outcrop studies.13
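The shale-placement procedure above can be sketched on a 2D cross section. The grid dimensions, well positions, pick depths, target density and shale-length distribution are all invented for illustration; a real model would take them from well statistics and outcrop analogs.

```python
import numpy as np

rng = np.random.default_rng(7)

NX, NZ = 200, 50      # cross-section grid (columns x rows)
WELL_X = [20, 120]    # well locations (grid columns)
TARGET = 0.10         # target shale density, assumed from well statistics

grid = np.zeros((NZ, NX), dtype=int)  # 0 = sand background, 1 = shale

def place(z, x, length):
    grid[z, max(0, x):min(NX, x + length)] = 1

# 1. Honor the deterministic data: shales seen in wells go in first,
#    with lateral extents still drawn at random.
for z, xw in [(10, 20), (30, 120)]:           # hypothetical well picks (row, well column)
    length = int(rng.integers(20, 60))
    place(z, xw - length // 2, length)

# 2. Add stochastic shales until the target density is reached, rejecting
#    any that would cut through a well where no shale was observed.
while grid.mean() < TARGET:
    z = int(rng.integers(NZ))
    x = int(rng.integers(NX))
    length = int(rng.integers(20, 60))
    cells = range(max(0, x), min(NX, x + length))
    if any(w in cells and grid[z, w] == 0 for w in WELL_X):
        continue  # would contradict the well observations
    place(z, x, length)

shale_density = grid.mean()
```

The rejection step is what keeps every realization consistent with the wells: shale appears at a well only where a well actually penetrated it.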

9. Hewett TA: "Fractal Distributions of Reservoir Heterogeneity and Their Influence on Fluid Transport," paper SPE 15386, presented at the 61st SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, October 5-8, 1986. Hewett TA and Behrens RA: "Conditional Simulation of Reservoir Heterogeneity With Fractals," paper SPE 18326, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988. "Fractals and Rocks," The Technical Review 36, no. 1 (January 1988): 32-36. Crane SD and Tubman KM: "Reservoir Variability and Modeling With Fractals," paper SPE 20606, presented at the 65th SPE Annual Technical Conference and Exhibition, New Orleans, Louisiana, USA, September 23-26, 1990.
10. Perez G and Chopra AK: "Evaluation of Fractal Models to Describe Reservoir Heterogeneity and Performance," paper SPE 22694, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991. Payne DV, Edwards KA and Emanuel AS: "Examples of Reservoir Simulation Studies Utilizing Geostatistical Models of Reservoir Heterogeneity," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 497-523.
11. Journel AG and Alabert FG: "Focusing on Spatial Connectivity of Extreme-Valued Attributes: Stochastic Indicator Models of Reservoir Heterogeneities," paper SPE 18324, presented at the 63rd SPE Annual Technical Conference and Exhibition, Houston, Texas, USA, October 2-5, 1988.
12. Haldorsen HH and Chang DM: "Notes on Stochastic Shales; From Outcrop to Simulation Model," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 445-486.
13. Geehan GW, Lawton TF, Sakurai S, Klob H, Clifton TR, Inman KF and Nitzberg KE: "Geologic Prediction of Shale Continuity, Prudhoe Bay Field," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 63-82.

Stochastic shales in object-based modeling. The location of each stochastic shale is random and independent of other shales. The units are drawn randomly from a size distribution function and placed at random locations until the required shale density has been achieved.

January 1992

31
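A minimal sketch of this placement loop, assuming rectangular shale units on a 2-D grid and Gaussian size distributions (both illustrative simplifications, not the actual method of reference 12):

```python
import random

def place_stochastic_shales(nx, nz, target_fraction, length_dist, thickness_dist,
                            rng=random.Random(0)):
    """Drop rectangular shale units at random, independent locations,
    sampling each unit's size from the input distributions, until the
    required shale density is reached. Returns a 0/1 grid (1 = shale)."""
    grid = [[0] * nx for _ in range(nz)]
    cells, shale_cells = nx * nz, 0
    while shale_cells / cells < target_fraction:
        length = max(1, int(rng.gauss(*length_dist)))      # sample unit size
        thick = max(1, int(rng.gauss(*thickness_dist)))
        x0, z0 = rng.randrange(nx), rng.randrange(nz)      # random location
        for z in range(z0, min(z0 + thick, nz)):
            for x in range(x0, min(x0 + length, nx)):
                if grid[z][x] == 0:
                    grid[z][x] = 1
                    shale_cells += 1
    return grid

grid = place_stochastic_shales(100, 50, 0.2, (20, 5), (2, 1))
fraction = sum(map(sum, grid)) / (100 * 50)
```

The loop overshoots the target density by at most one unit, which mirrors the caption's stopping rule.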

Building an object-based model using SIRCH, a software package developed at BP Exploration. In this model, two different types of fluvial channel belt have been generated in a three-stage process. First, one type of channel belt is generated (yellow). Then a second set (purple) is added to the first. This second type has associated overbank deposits (red), which are thinner, poorer quality sands that nevertheless improve connectivity. Finally, the combined channel belts are faulted.
Each channel belt is created using about 50 properties. Some are constants, some are picked at random from input distributions and some are calculated from other properties. Properties that determine channel shape include width, thickness, reach length and angle, azimuth and depth. Flow properties, like permeability and porosity, are assigned either by lithology or by using a random sample. All well data are honored. Net-to-gross ratio is used to control the quantity of sand generated.

In theory, a similar method could be applied to generate sandstone bodies in a low-permeability background. However, BP Research, Sunbury-on-Thames, England, has developed SIRCH (system for integrated reservoir characterization) to generate realistic reservoir heterogeneities using a library of depositional shapes. Each shape has a set of attributes defined by sedimentology and based on observations of depositional systems. These attributes determine the size and position in the reservoir model of each shape and its relations with other shapes.
For example, channel belts are constructed as continuous links of a specified shape.14 As a belt is constructed, input values for the attributes are repeatedly sampled from the library, and the generated belts may be conditioned to honor well data. The model can also be faulted, structured to account for regional dip and clipped back using the mapped top and bottom of the reservoir to give it a volume indicated by structural maps (left).
SIRCH is still stochastic, so several realizations are required to estimate reservoir characteristics. The realizations can be used to estimate the volume of sandstone within a specified interval, reservoir connectivity and hydrocarbons in place for a given radius around a well. This process has proved valuable in indicating sensitivity of results to the input parameters. Influential inputs can then be targeted for extra study.
Individually, all these techniques paint a portion of the total picture. In most cases, gaining a full view of the reservoir requires a hybrid (see "The HERESIM Approach to Reservoir Characterization," next page). Object-based, discrete modeling may be used to describe the large-scale heterogeneities in the reservoir, the sedimentological units, while different continuous models may describe the spatial variations of properties within each unit. For example, for fluvial environments, BP Research is planning to use SIRCH to generate channels, SIS for the facies within them and SGS for permeability within the facies.
(continued on page 36)

14. Channels are the depositing rivers; channel belts are the resulting sandstone bodies.

The HERESIM Approach to Reservoir Characterization

An example of a software package that combines traditional geologic analysis with geostatistical techniques is HERESIM, developed jointly by the Institut Français du Pétrole, Rueil-Malmaison, France, and the Centre de Géostatistiques, École Nationale Supérieure des Mines de Paris, Fontainebleau, France. This allows sedimentological and geostatistical structural analysis to be carried out on interactive workstations, followed by geostatistical conditional simulations using small-scale grid blocks. Each simulation creates an image of the reservoir geology, consistent with well data. Petrophysical data, like permeability and porosity, are then mapped between wells using deterministic or stochastic algorithms. Finally, the petrophysical information is scaled up for fluid-flow simulation (right).1

Organization and use of HERESIM, a geostatistical reservoir characterization package. The workflow runs from stratigraphic studies, through geostatistical analysis (proportion curves, variograms), simulation of lithofacies, simulation of petrophysical data and scaling up of petrophysical data, to fluid-flow modeling and enhanced oil recovery studies, with reinterpretation of the results in terms of sequential stratigraphy, connectivity studies, optimization of well spacing and swept volume estimation.

First, a sedimentological study is carried out to subdivide the area under investigation, which can include formations bordering the reservoir as well as the reservoir itself. The largest divisions are depositional units that include sets of genetically linked strata, bounded by major sedimentary discontinuities or sudden variations in the sedimentological environment. Distinction of units is usually achieved by correlating features between wells. The next step is to identify lithofacies within each depositional unit, collect statistical information about them at the wells and then generate stochastic realizations of lithofacies between wells.

Statistical information obtained from well data, seismic interpretation and geologic knowledge includes:
• Proportion curves: the percentage occurrence of each lithofacies in a depositional unit (right)
• Experimental variograms: to quantify the spatial continuity of each lithofacies in the reservoir (far right).2

Horizontal proportion curve. Proportion curves provide information on the relative frequency of each lithofacies in a depositional unit. Here, four lithofacies have been proportioned using data from 20 wells.

Variograms quantify the spatial continuity of each lithofacies in the reservoir. The sill equals the variance of the data, while the range indicates the separation beyond which two points are uncorrelated.

With this information and a form of indicator kriging employing a Gaussian random function, numerous realizations of lithofacies distribution can be readily obtained for each unit. However, the algorithm works only in rectangular formats, so it is necessary to first geometrically transform each unit into a rectangle. After stochastic realizations have been generated, the real shape of the unit is restored. HERESIM offers two types of transform, depending on whether the prevailing environment since deposition has been dominated by erosion or differential subsidence,

1. Eschard R, Doligez B, Rahon D, Ravenne C and Leloch G: "A New Approach for Reservoirs Description and Simulation Using Geostatistical Methods," presented at Advances in Reservoir Technology, Characterization, Modelling & Management, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, February 21-22, 1991.
2. Matheron G, Beucher H, de Fouquet C, Galli A, Guérillot D and Ravenne C: "Conditional Simulation of the Geometry of Fluvio-Deltaic Reservoirs," paper SPE 16753, presented at the 62nd SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, September 27-30, 1987.
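An experimental variogram of the kind described above can be sketched in a few lines. The synthetic transect is an illustrative assumption: because its samples are uncorrelated, the variogram should sit at the sill (the data variance) at every lag:

```python
import random

def experimental_variogram(values, max_lag):
    """Semivariance gamma(h) = mean of 0.5*(z(x) - z(x+h))^2 over all
    pairs separated by lag h, for a regularly sampled 1-D transect."""
    gammas = {}
    for h in range(1, max_lag + 1):
        diffs = [(values[i] - values[i + h]) ** 2 for i in range(len(values) - h)]
        gammas[h] = 0.5 * sum(diffs) / len(diffs)
    return gammas

# Synthetic transect: uncorrelated Gaussian noise (range effectively zero).
rng = random.Random(1)
z = [rng.gauss(0.0, 1.0) for _ in range(5000)]
mean = sum(z) / len(z)
variance = sum((v - mean) ** 2 for v in z) / len(z)
gamma = experimental_variogram(z, 5)
```

For correlated field data, gamma(h) would instead climb from near zero toward the sill, reaching it at the range.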

Erosion transformation: fill in eroded strata to create a rectangle, then perform the stochastic realization; erode the stochastic realization to recreate the structure.
Subsidence transformation: reverse the effect of subsidence to create a rectangle, then perform the stochastic realization; subside the stochastic realization to recreate the structure.

A step in HERESIM processing: creating rectangles to allow generation of stochastic realizations. Before the stochastic realization can be generated, the depositional units have to be transformed into rectangles. HERESIM has two ways of doing this. One assumes that the unit has been eroded and that all the correlation lines within the unit are parallel. To create a rectangle, these correlations are extrapolated across the eroded sections of the unit. Then the rest of the structure can be stochastically generated and the resulting realization re-eroded to recreate the original structure. The alternative technique assumes that differential subsidence has occurred, causing the correlation lines to diverge. To create a rectangle, the effects of the subsidence are reversed. Then the stochastic realization is created and resubsided.

Stochastic realization generated using HERESIM.

although some forms of fracturing can also be accommodated (previous page).
Having selected the most likely realizations of lithofacies, geologists and reservoir engineers assign porosity and permeability values. HERESIM assumes that petrophysical data are strongly related to lithofacies. For each lithofacies, either constant data values can be assigned or a Monte Carlo-type distribution employed.
The geologic model describes a reservoir at a decametric scale (on average 50-m [165-ft] square or smaller with a thickness of 0.5 m [1.5 ft]). As a result, simulated grids are huge (millions of cells), too big for fluid-flow simulation (above). Scaling up in HERESIM is performed using a fast algorithm that calculates absolute permeabilities.3 First, average permeability is estimated from the root of the product of the arithmetic mean and harmonic mean of the grid-block permeabilities. The quality of this value can be judged by computing the log of the difference between the two means. The larger the difference, the less satisfactory the simple average. Grid blocks that fail this test are subjected to a more accurate and time-consuming fluid-flow simulation.

3. Guérillot D, Rudkiewicz JL, Ravenne C, Renard G and Galli A: "An Integrated Model for Computer Aided Reservoir Description: From Outcrop Study to Fluid Flow Simulations," presented at the 5th European Symposium on Improved Oil Recovery, Budapest, Hungary, April 25-27, 1989. Reprinted in Revue de l'Institut Français du Pétrole 45, no. 1 (January-February 1990): 71-77.
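The fast average can be sketched as follows. The flagging threshold is a hypothetical illustration, not HERESIM's published criterion:

```python
import math

def fast_average_permeability(perms):
    """Estimate block permeability as the square root of the product of
    the arithmetic and harmonic means of the fine-grid permeabilities."""
    n = len(perms)
    k_arith = sum(perms) / n
    k_harm = n / sum(1.0 / k for k in perms)
    return math.sqrt(k_arith * k_harm), k_arith, k_harm

def needs_flow_simulation(k_arith, k_harm, threshold=1.0):
    """Judge the average by the log of the difference between the two
    means: the larger the gap, the less satisfactory the simple average.
    Blocks over the (illustrative) threshold get a full flow simulation."""
    if k_arith <= k_harm:       # homogeneous block: average is exact
        return False
    return math.log10(k_arith - k_harm) > threshold

# One low-permeability cell drags the harmonic mean down sharply,
# so this block should be flagged for detailed simulation.
k_eff, ka, kh = fast_average_permeability([100.0, 120.0, 80.0, 0.01])
flagged = needs_flow_simulation(ka, kh)
```

The estimate always lies between the harmonic and arithmetic means, which are the classic lower and upper bounds on effective permeability.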

Modeling the North Sea Rannoch formation. This reservoir sandstone is characterized by hummocky cross-stratified bedforms. The Department of Petroleum Engineering at Heriot-Watt University determined the flow performance of this depositional structure using the two deterministic simulations shown here. These two images represent a slice of rock measuring 2.4 m [7.9 ft] by 14.4 m [47.2 ft]. In the first (top), saturation is at initial conditions. Low-permeability, rippled crests have the lowest oil saturations (green). The second (bottom) shows saturation distribution after 0.2 pore volumes of water have been injected at a low advance rate of 0.25 m/d. Saturation is reduced most rapidly in the low-permeability, rippled crests (dark blue).

Rescaling the Geologic Model For Fluid-Flow Simulation

Geologic modeling can describe a reservoir with millions of grid blocks. For the computer, this mainly presents problems of storage. But the next major operation is fluid-flow simulation, which involves complex numerical calculation, a monumentally large task given a million grid blocks (see "Simulating Fluid Flow," page 38).
Today, fluid-flow simulators can cope with up to about 50,000 grid blocks; the greater the number of grid blocks, the more expensive the fluid-flow simulation is in computation. So before the simulation is run, the geologic model, with its high-resolution petrophysical data, must be enshrined in a smaller number of larger grid blocks, a process called scaling up. Reorganizing so much data may introduce error, masking heterogeneities in the geologic model that directly affect simulated production.
The key to achieving scale up is deriving a single set of reservoir properties, called pseudos, for the larger blocks so that the fluid-flow simulation after scale up is as close as possible to what it would have been with the small-scale data.
Over the years, a variety of methods has been developed to do this. These include calculations based on simple averaging procedures like the oft-quoted Kyte and Berry method and more complex simulations of parts of the reservoir using small-scale grid blocks, measuring about 1 m square and 0.5 m thick.15
The impact of small-scale effects on two-phase flow performance is revealed in a study of the North Sea Rannoch formation, carried out at Heriot-Watt University, Edinburgh, Scotland. Rannoch subfacies are strongly laminated and rippled. The study utilized newly acquired minipermeameter data at a millimetric scale, much denser than that conventionally obtained from core analysis.
These data were combined with rock capillary pressure curves to define pseudos for each lamina subfacies. A geologic model for hummocky cross-stratification combined the subfacies at the bedform scale (left).16 The simulated flow performance of this model was then compared with simpler flow simulations calculated using arithmetic and harmonic average permeabilities for horizontal and vertical directions, respectively.
The flow behavior of the detailed model was anisotropic, whereas the simpler approach yielded more isotropic behavior and significantly higher recoveries for vertical displacement (right). So, small-scale heterogeneity affects reservoir producibility, and in many cases, variations in capillary pressure with rock type and small-scale sedimentary structure cannot be ignored.
A predominant theme in scaling up is the identification of a number of discrete scales of heterogeneity. To preserve the effects of heterogeneity at all scales, a series of scale-up operations can be performed, each dealing with heterogeneity larger than the previous operation. At each stage in the process, the size of the scale up should be as large as possible without introducing the next level of heterogeneity. If a reservoir is divided into depositional units, lithofacies, beds and small-scale heterogeneity within a bed, the scale-up process will have four stages. For example, lithofacies become the basic building blocks at the depositional unit scale.17

Rannoch formation's flooding characteristics. Using densely sampled permeability data, capillary pressures measured from appropriate rock types and knowledge of the depositional structure to construct a detailed geologic model, the Department of Petroleum Engineering at Heriot-Watt University simulated both horizontal and vertical flooding. Both plots show recovery of original oil-in-place against total pore volume injected; the horizontal flood compares the detailed geologic model with an arithmetic average and rock curves, the vertical flood with a harmonic average and rock curves. The results using the detailed geologic model are significantly different from those derived using traditional techniques employing average permeabilities and rock curves, particularly for the vertical flood.

15. Kyte JR and Berry DW: "New Pseudo Functions to Control Numerical Dispersion," Society of Petroleum Engineers Journal 15 (August 1975): 269-276.
Tompang R and Kelkar BG: "Prediction of Waterflood Performance in Stratified Reservoirs," paper SPE 17289, presented at the SPE Permian Basin Oil and Gas Recovery Conference, Midland, Texas, USA, March 10-11, 1988.

Heterogeneity can be assessed by measuring the average value of a reservoir property within a growing volume. When the volume is very small, the average value will fluctuate because of microscopic heterogeneities. But as the averaging volume increases, more of these heterogeneities will be encompassed and the average value will stabilize. Further growth may result in capturing larger-scale heterogeneity and the value will again fluctuate (right).18 Since this behavior varies among properties, the size of the smallest grid block should be selected so that all key properties are stable.19
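The growing-volume experiment can be mimicked with a running average over synthetic samples; the Gaussian porosity model is purely illustrative:

```python
import random

def running_averages(samples):
    """Average of the first n samples, for n = 1..len(samples). The curve
    fluctuates at small volumes and flattens once enough microscopic
    heterogeneity has been averaged in (the REV plateau)."""
    total, out = 0.0, []
    for n, s in enumerate(samples, 1):
        total += s
        out.append(total / n)
    return out

# Synthetic porosity samples: mean 0.25 with microscopic scatter.
rng = random.Random(2)
phi = [rng.gauss(0.25, 0.05) for _ in range(10000)]
avg = running_averages(phi)
spread_small = max(avg[:10]) - min(avg[:10])        # early fluctuation
spread_large = max(avg[-1000:]) - min(avg[-1000:])  # stabilized plateau
```

A second fluctuation at still larger volumes, as in the figure, would require adding a larger-scale trend to the synthetic samples.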
No matter how many scales are used, existing methods of calculating pseudos have their drawbacks. The elementary methods, using averaged properties, are quick but not accurate enough. The simulation methods are computationally expensive. And if sensitivity studies are required using a number of stochastic realizations, the time and cost of scaling up can be prohibitive. Recently, BP Research developed a new algorithm, real-space renormalization, that promises speedy yet accurate calculation of pseudos.
The method starts with a group of eight grid blocks (2 × 2 × 2) with permeabilities distributed to represent the original data. A new relative permeability is calculated and adopted for a new single block the size of the eight original blocks; it becomes the starting point for the next renormalization, when the process is repeated with seven other similarly generated blocks. The new block's effective relative permeability is calculated using a semianalytical technique that depends on an analogy between Darcy's law and Ohm's law: block permeabilities are modeled by an equivalent resistor network. Through a series of transformations, the resistor network is reduced to a single resis-

How sample volume affects property averages. At small sample volumes, property averages fluctuate because of microscopic heterogeneity. As sample volume is increased, heterogeneities will average out and properties stabilize; this gives the representative elementary volume (REV). This stability continues until larger-scale, megascopic heterogeneity begins to take effect. Different properties, here fluid density, porosity and permeability, have different volumes where heterogeneity is stable. The ideal sample volume should fall within the ranges of REV for all key properties.
16. Corbett PWM and Jensen JL: "An Application of Small Scale Permeability Measurements: Prediction of Flow Performance in a Rannoch Facies, Lower Brent Group, North Sea," presented at Minipermeametry in Reservoir Studies, organized by the Petroleum Science and Technology Institute, Edinburgh, Scotland, June 27, 1991.
17. Lasseter TJ, Waggoner JR and Lake LW: "Reservoir Heterogeneities and Their Influence on Ultimate Recovery," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 545-559.
18. Any volume for which the property is stable is called a representative elementary volume (REV).
19. Haldorsen HH: "Simulator Parameter Assignment and the Problem of Scale in Reservoir Engineering," in Lake LW and Carroll HB Jr (eds): Reservoir Characterization. Orlando, Florida, USA: Academic Press Inc. (1986): 293-340.
Kossack CA, Aasen JO and Opdal ST: "Scaling Up Heterogeneities With Pseudofunctions," SPE Formation Evaluation 5 (September 1990): 226-232.
Norris RJ and Lewis JJM: "The Geological Modeling of Effective Permeability in Complex Heterolithic Facies," paper SPE 22692, presented at the 66th SPE Annual Technical Conference and Exhibition, Dallas, Texas, USA, October 6-9, 1991.

Simulating Fluid Flow

Today, most simulators solve fluid-flow problems by segmenting a portion of the reservoir into a series of grid blocks, either two- or three-dimensional. The fluid phases in each block are modeled with finite difference equations similar to the conventional volumetric material balance equations. Darcy's law is then used to describe fluid flow between grid blocks.

Early efforts in reservoir simulation, in the late 1940s, began with simple material balance calculations. Throughout the 1960s, black oil simulations dominated the scene. These describe nonvolatile oil as a single component containing gas in solution as a second component. There are three compressible, immiscible phases: gas, water and oil. Recovery methods simulated included pressure depletion and some forms of pressure maintenance.1

The advent of enhanced recovery techniques, such as chemical flooding, steamflooding and in-situ combustion, ushered in more sophisticated simulators. Most describe the hydrocarbon in terms of a number of components and solve thermodynamic equilibrium equations to determine the distribution of components between the liquid and vapor phases in the reservoir.2

A common strategy for fluid-flow simulation is to use history matching to adjust the geologic model. Adjustments should be consistent with measured data and improve upon postulated variations in reservoir properties that cannot be or have not been measured. The results of the simulation can be extremely sensitive to variations in the reservoir description. Models for reservoir management are routinely updated as new production data or reservoir information are acquired. This allows for reexamination of current and future production scenarios. But history matching always results in nonunique combinations of variables. And it could be said that the only fully simulated reservoir is a fully depleted one.

Simulator technology is by no means mature. Some current developments center on improving the ability to cope with different production scenarios, for example, naturally fractured formations and horizontal wells.

During production by gravity drainage in naturally fractured reservoirs, oil drains from matrix blocks into the fracture system. From there it can pass either into other blocks below or along fractures. A fractured reservoir is modeled using two interacting continua, one for the matrix and one for the fractures: the so-called dual porosity concept. The matrix contains most of the oil and the fractures most of the conductivity. So transmissibility between matrix blocks is usually ignored. But to correctly model ultimate recovery, the capillary contacts between matrix blocks should be considered.

Work at Koninklijke/Shell Exploratie en Produktie Laboratorium, Rijswijk, The Netherlands, has shown that a vertical stack of oil-saturated matrix blocks surrounded by gas does not drain independently.3 Oil from one block passing into the fracture system will be absorbed by the underlying block, slowing production. Therefore, a flow simulation that ignores this effect will overestimate initial production rates. To combat this, Shell has devised a simulator that allows flow in all directions between neighboring blocks.

Fluid-flow simulation for horizontal wells is currently being approached in at least two ways. The productivity of a horizontal well can be approximated using numerical solutions of simplified equations, and the approximation used in a conventional reservoir simulator. Alternatively, for single wells in which short-term performance is being studied, fine-scale black oil simulations have been used. In this approach, pseudos can be derived for a coarser, field-sized simulation.

1. Mattax CC and Dalton RL: "Reservoir Simulation," Journal of Petroleum Technology 42 (June 1990): 692-695.
Breitenbach EA: "Reservoir Simulation: State of the Art," Journal of Petroleum Technology 43 (September 1991): 1033-1036.
2. Cheshire IM and Pollard RK: "Advanced Numerical Techniques for Reservoir Simulation and Their Use on Vector and Parallel Processors," in Edwards SF and King PR (eds): Mathematics in Oil Production. Oxford Science Publications. Oxford, England: Clarendon Press (1988): 253-268.
3. Por GJ, Boerrigter P, Maas JG and de Vries A: "A Fractured Reservoir Simulator Capable of Modeling Block-Block Interaction," paper SPE 19807, presented at the 64th SPE Annual Technical Conference and Exhibition, San Antonio, Texas, USA, October 8-11, 1989.
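The grid-block formulation described in the sidebar can be illustrated with a toy solver for steady, single-phase, one-dimensional flow, using Darcy's law between neighboring blocks (unit block size and viscosity assumed; no commercial simulator works this simply):

```python
def solve_1d_pressure(perms, p_left, p_right, sweeps=20000):
    """Steady, incompressible single-phase flow through a row of grid
    blocks with fixed pressures at the two ends. Darcy's law gives the
    interblock rate q = T * (p[i] - p[i+1]); the mass balance on each
    block is solved by Gauss-Seidel iteration."""
    n = len(perms)
    # Face transmissibilities: harmonic average between neighbors,
    # plus half-block values at the two boundaries.
    t = [2.0 * perms[0]]
    t += [2.0 * ka * kb / (ka + kb) for ka, kb in zip(perms, perms[1:])]
    t += [2.0 * perms[-1]]
    p = [0.5 * (p_left + p_right)] * n
    for _ in range(sweeps):
        for i in range(n):
            pl = p_left if i == 0 else p[i - 1]
            pr = p_right if i == n - 1 else p[i + 1]
            # Zero net flow into block i fixes its pressure.
            p[i] = (t[i] * pl + t[i + 1] * pr) / (t[i] + t[i + 1])
    return p

# Homogeneous case: the profile approaches the linear drop
# [9, 7, 5, 3, 1] between inlet (10) and outlet (0).
p = solve_1d_pressure([100.0] * 5, p_left=10.0, p_right=0.0)
```

Real simulators solve the same balance equations implicitly for multiple compressible phases, but the block-to-block Darcy coupling is the common core.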

tance, equivalent to the new block's effective relative permeability.
The effect of each stage of renormalization is to produce larger blocks whose permeability approaches that of the whole. Results of renormalization have been compared with direct numerical simulation, and the maximum error was found to be only 7%.20 The main difference was in computational effort. Renormalization speeds are up to two orders of magnitude faster than those of direct numerical simulation. The largest problem yet tackled totalled 540 million grid blocks.
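The coarsening loop can be sketched as below, in 2-D for brevity. The series-parallel combination used here is a crude stand-in for King's semianalytical resistor-network reduction, which handles cross-flow between cells more carefully:

```python
import random

def renormalize_once(k, combine):
    """Coarsen a 2-D permeability grid by merging each 2x2 group of
    cells into one block via the supplied combination rule."""
    n = len(k) // 2
    return [[combine(k[2*i][2*j], k[2*i][2*j+1],
                     k[2*i+1][2*j], k[2*i+1][2*j+1])
             for j in range(n)] for i in range(n)]

def series_parallel(k00, k01, k10, k11):
    # Harmonic average along the flow direction (resistors in series),
    # then arithmetic average across it (resistors in parallel).
    row0 = 2.0 * k00 * k01 / (k00 + k01)
    row1 = 2.0 * k10 * k11 / (k10 + k11)
    return 0.5 * (row0 + row1)

rng = random.Random(3)
grid = [[rng.lognormvariate(0.0, 1.0) for _ in range(8)] for _ in range(8)]
vals = [v for row in grid for v in row]
while len(grid) > 1:                      # 8x8 -> 4x4 -> 2x2 -> 1x1
    grid = renormalize_once(grid, series_parallel)
k_eff = grid[0][0]
```

Each pass shrinks the grid by a factor of two per axis, so the cost grows only linearly with the number of fine cells; that is the source of the speed advantage over direct simulation.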
Renormalization has drawbacks: it is difficult to represent contorted flow paths with small cells, and the estimation of effective permeability suffers. This is seen in very shaly reservoirs, where there are high contrasts between neighboring permeabilities.
This problem may be resolved by another method that short-circuits the need for scale up and requires a different kind of reservoir characterization. This is the streamtube approach, championed by Chevron Oilfield Research Co., La Habra, California, USA, among others.21 The plan view of the reservoir is simulated as a network of tubes that carry the flow between injection and production wells (right). The technique takes into account well placement, areal heterogeneity and the relative flow rates of wells. The key to the technique's success is a preliminary flow simulation on two-dimensional vertical slices between wells. The observed scaling behavior is then mapped onto individual streamtubes and their contributions summed to get a three-dimensional prediction of reservoir performance.
Because of the relatively greater arithmetic complexity of fluid-flow simulation compared with model building, computers
Cutting simulation time using streamtubes, a network of conduits conveying fluid from injection to production wells. This field-scale model incorporates five solvent injection wells (pink) and seven producers (yellow). Using streamtubes, simulating the injection of 2.5 pore volumes of solvent took just 30 seconds on a MicroVAX 3200 workstation. Using conventional fluid-flow simulation for a coarse grid and a Cray X-MP/48, the calculation required 630 seconds. The two calculations predicted similar solvent breakthrough. (After Hewett and Behrens, reference 21.)
may always be able to generate more complex geologic models than can be simulated. Scaling up will therefore remain a key element in reservoir characterization.
The fluid-flow simulation bottleneck will have to be addressed, either by developing new, more rapid processing hardware like parallel processing,22 or by smarter algorithms, or probably both. Then, the capability for cost-effective and repeated stochastic simulation will be realized. The impact of this capability on reservoir management, and ultimately on improved recovery, remains to be felt.
CF

20. King PR: "The Use of Renormalization for Calculating Effective Permeability," Transport in Porous Media 4 (July 1989): 37-58.
King PR, Muggeridge AH and Price WG: "Renormalization Calculations of Immiscible Flow," submitted to Transport in Porous Media, August 1991.
21. Emanuel AS, Alameda GK, Behrens RA and Hewett TA: "Reservoir Performance Prediction Methods Based on Fractal Geostatistics," SPE Reservoir Engineering 4 (August 1989): 311-318.
Hewett TA and Behrens RA: "Scaling Laws in Reservoir Simulation and Their Use in a Hybrid Finite Difference/Streamtube Approach to Simulating the Effects of Permeability Heterogeneity," in Lake LW, Carroll HB Jr and Wesson TC (eds): Reservoir Characterization II. San Diego, California, USA: Academic Press Inc. (1991): 402-441.
22. Mayer DF: "Application of Reservoir Simulation Models to a New Parallel Computing System," paper SPE 19121, presented at the SPE Petroleum Computer Conference, San Antonio, Texas, USA, June 26-28, 1990.
