Application of Petrophysics in Seismic Interpretation: BY RAJDEEP BURAGOHAIN (R280308025) B.Tech (GSE)
Introduction
Well logs are sometimes viewed by geophysicists as "hard data" and are not subjected to the same level of scrutiny as the seismic data. This can be a mistake, because well logs are susceptible to errors from a number of sources. In this project we will examine some of the processes and procedures that allow well logs to be used correctly in seismic reservoir characterization. The basic steps in a seismic petrophysics analysis are:
1. Collect and organize the input data.
2. Perform geophysical log interpretation for mineral volumes, porosity, and fluids.
3. Determine fluid properties (oil API, brine salinity, etc.) and reservoir pressure-temperature conditions.
4. Perturb reservoir properties using rock-physics effective-medium models (pseudo-well modeling).
5. Compute synthetic seismic traces (sketched below).
6. Generate trend curves and crossplots.
7. Create graphics and digital output files.
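As an illustration of the synthetic-trace step above, the short Python sketch below builds a normal-incidence reflectivity series from hypothetical impedance logs and convolves it with a Ricker wavelet. This is a minimal sketch: all log values and the 30 Hz wavelet frequency are assumed for illustration only.

import numpy as np

def ricker(f, dt, length=0.128):
    # Zero-phase Ricker wavelet with peak frequency f (Hz)
    t = np.arange(-length / 2, length / 2, dt)
    a = (np.pi * f * t) ** 2
    return (1.0 - 2.0 * a) * np.exp(-a)

dt = 0.002                                   # 2 ms two-way-time sampling
vp = np.array([2.40, 2.40, 2.60, 2.20, 2.20, 2.90, 2.90, 3.00])    # km/s (assumed)
rho = np.array([2.30, 2.30, 2.35, 2.10, 2.10, 2.45, 2.45, 2.50])   # g/cc (assumed)

ai = vp * rho                                # acoustic impedance
rc = np.diff(ai) / (ai[1:] + ai[:-1])        # normal-incidence reflectivity

trace = np.zeros(256)
trace[100:100 + rc.size] = rc                # embed the reflectivity in a longer trace
synthetic = np.convolve(trace, ricker(30.0, dt), mode="same")
print(f"{synthetic.size} samples, peak amplitude {np.abs(synthetic).max():.3f}")

The same convolutional model, run with logs corrected for borehole effects, is what is compared against the real seismic data at the well location.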
Where log data are bad or missing, a combination of theoretical, empirical, and heuristic models can be applied to attempt to repair them. A common example is the problem of mud filtrate invasion (Walls et al., 2001; Vasquez et al., 2004). Mud filtrate invasion occurs during drilling under over-balanced mud weight conditions. The positive pressure gradient between the wellbore and the formation causes some of the mud liquids to penetrate into the permeable zones, displacing the original fluids near the borehole wall. The severity of this condition varies greatly depending on permeability, mud weight, mud type, and original fluid saturation. The implications for reservoir geophysics concern primarily the density and sonic logs, because these two logs sample rock properties close to the borehole wall and are therefore likely to be sampling the invaded zone. In a gas sand drilled with water-based mud, the invaded zone will have a higher water saturation than the un-invaded gas sand reservoir. If synthetic seismograms are made from the uncorrected sonic and density logs, the results will not match the seismic data.
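The standard repair for invasion effects of this kind is Gassmann fluid substitution: invert Gassmann's equation for the dry-rock bulk modulus using the invaded-zone (filtrate-saturated) logs, then re-saturate the rock with the original reservoir fluid. The Python sketch below shows the idea; the moduli, densities, and 25% porosity are assumed illustrative values, not data from this project.

import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    # Saturated bulk modulus from the dry-rock modulus (Gassmann's equation)
    b = (1.0 - k_dry / k_min) ** 2
    d = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + b / d

def gassmann_kdry(k_sat, k_min, k_fl, phi):
    # Invert Gassmann for the dry-rock bulk modulus
    num = k_sat * (phi * k_min / k_fl + 1.0 - phi) - k_min
    den = phi * k_min / k_fl + k_sat / k_min - 1.0 - phi
    return num / den

# Measured (invaded-zone) logs: pores near the borehole hold mud filtrate (brine)
vp1, vs1, rho1, phi = 3.10, 1.75, 2.23, 0.25     # km/s, km/s, g/cc, fraction (assumed)
k_min = 36.6                                     # quartz bulk modulus, GPa
k_brine, rho_brine = 2.8, 1.05                   # filtrate properties, GPa, g/cc (assumed)
k_gas, rho_gas = 0.04, 0.15                      # reservoir gas properties (assumed)

mu = rho1 * vs1 ** 2                             # shear modulus, unaffected by pore fluid
k_sat1 = rho1 * vp1 ** 2 - (4.0 / 3.0) * mu
k_dry = gassmann_kdry(k_sat1, k_min, k_brine, phi)

# Substitute: brine-filled invaded zone -> gas-filled virgin zone
k_sat2 = gassmann_ksat(k_dry, k_min, k_gas, phi)
rho2 = rho1 + phi * (rho_gas - rho_brine)
vp2 = np.sqrt((k_sat2 + (4.0 / 3.0) * mu) / rho2)
print(f"corrected Vp: {vp2:.2f} km/s, corrected density: {rho2:.2f} g/cc")

Synthetics computed from the corrected Vp and density will then represent the un-invaded gas sand rather than the filtrate-flushed zone.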
These rock-physics models allow us to make a much improved interpretation of acoustic and elastic impedance inversion. For example, we can say with confidence that acoustic impedance inversion alone will not be enough to discriminate oil from wet sand. However, a negative seismic Poisson's ratio anomaly will be indicative of oil saturation, while the wet sand will show almost no Poisson's ratio anomaly. A worked example of this discrimination is given below.
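The hypothetical sands in the sketch below are constructed so that their acoustic impedances are nearly identical, yet the oil sand shows a clearly negative Poisson's ratio anomaly relative to the overlying shale while the wet sand shows almost none. All velocities and densities are assumed for illustration.

import numpy as np

def poisson(vp, vs):
    # Poisson's ratio from P- and S-wave velocities
    return (vp ** 2 - 2.0 * vs ** 2) / (2.0 * (vp ** 2 - vs ** 2))

# (Vp km/s, Vs km/s, rho g/cc) -- assumed illustrative values
shale = (2.90, 1.40, 2.45)      # overlying seal, reference for the anomaly
wet_sand = (3.00, 1.47, 2.20)
oil_sand = (2.93, 1.62, 2.25)

nu_shale = poisson(shale[0], shale[1])
for name, (vp, vs, rho) in [("wet sand", wet_sand), ("oil sand", oil_sand)]:
    print(f"{name}: AI = {vp * rho:.2f} (km/s x g/cc), "
          f"Poisson anomaly vs shale = {poisson(vp, vs) - nu_shale:+.3f}")

Here both sands have an acoustic impedance near 6.6, so an impedance inversion alone cannot separate them, while the Poisson's ratio anomaly is about -0.07 for the oil sand and essentially zero for the wet sand.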
Survey Design:
Once a field has been discovered, developed, and placed under production for some time, quite a bit of information is available to the geophysicist to design a geophysical survey in such a manner as to maximize the likelihood that the data collected will optimize the interpretation. That is, if the goal of the survey is to define the structural limits of the field, a 3-D seismic survey can be designed with that in mind. If, however, the goal of the survey is to define the extent of a gas zone, the geophysicist may be able to use log data, seismic petrophysical modeling, and old (legacy) seismic data to determine whether a certain offset range is required to differentiate between the water and gas zones. If highly accurate well ties or wavelet-phase control are needed, an appropriately placed vertical seismic profile (VSP) may be designed. Or, if an acquisition footprint had been observed in a previously acquired seismic data set and that footprint obscured the attributes used to define the reservoir target, the geophysicist can design the new survey to eliminate the troublesome artifacts. In short, the fact that the target is well known gives the reservoir geophysicist a distinct advantage over the exploration geophysicist by allowing the survey to be designed in a more enlightened manner than a typical exploration survey ever can be. It is often easier to justify the expense of a properly conducted seismic survey for reservoir characterization purposes, because the financial impact of the survey can be calculated with greater confidence and the financial returns realized more quickly than is typically the case for exploration seismic surveys.
3-D Seismic:
Most reservoir geophysics is based on reflection seismic data, although a wide variety of other techniques are employed regularly on specific projects. Almost all seismic data collected for reservoir studies are high-fold 3-D vertical-receiver data; however, the use of converted-wave data with multicomponent geophones on land and on the sea floor, and multicomponent sources (on land), is increasing. In particular, converted waves are now being used to image below gas clouds that obscure P-wave imaging of reservoirs, and the technology for obtaining multicomponent data from the ocean bottom is continually improving. The importance of fractures in many reservoir development schemes has led to a number of experimental programs for multicomponent sources and receivers in an effort to identify shear-wave splitting (and other features) associated with high fracture density. Some of these techniques will find continually increasing application in the future, but at present, most surface seismic studies designed to characterize existing reservoirs are high-quality 3-D vertical-component receiver surveys. Many good case histories of the use of 3-D seismic data for reservoir development purposes can be found in the collection by Weimer and Davis (1996). Case histories using 3-D seismic for unconventional reservoir characterization purposes include MacBeth and Li (1999) and Lynn et al. (1999). A current example of the use of converted waves in ocean-bottom surveys over a poor-data area (the result of a gas chimney) is provided by Thomsen et al. (1997).
Attributes:
In most exploration and reservoir seismic surveys, the main objectives are (in order) to correctly image the structure in time and depth and to correctly characterize the amplitudes of the reflections in both the stacked and prestack domains. From these data, a host of additional features can be derived and used in interpretation. Collectively, these features are referred to as seismic attributes (Taner et al., 1979). The simplest attribute, and the one most widely used, is seismic amplitude, usually reported as the maximum (positive or negative) amplitude value at each common midpoint (CMP) along a horizon picked from a 3-D volume. It is fortunate that, in many cases, the amplitude of a reflection corresponds directly to the porosity of the underlying formation, or perhaps to the density (and compressibility) of the fluid occupying pore spaces in that formation. The assumption is that amplitude is proportional to R0, the normal-incidence reflection coefficient, and the simple convolutional model is often appropriate for interpretation of the data in such cases. But it isn't always this simple, and many mistakes of interpretation have occurred by making this assumption. For one thing, the convolutional model may not be appropriate in many instances, particularly if the offset dependence of a reflection is important in its interpretation. Likewise, the interpretation of porosity or fluid properties as the cause of a true impedance change is often overly optimistic, especially in sands containing clays or in rocks with fractures. The use of seismic attributes extends well beyond simple amplitudes. Most of the original seismic attributes were based on the Hilbert transform and consisted of the instantaneous amplitude (or amplitude of the wave envelope), the instantaneous phase (most useful for accurate time picking), and the instantaneous frequency (probably most often associated with thin-bed reverberations, but often interpreted, perhaps incorrectly, as resulting from attenuation due to gas bubbles). Variations on these attributes evolved, and other classes of attributes came into use. For example, coherence is the attribute of waveform similarity among neighboring traces and is often used to identify fractures (Marfurt et al., 1998). Dip and azimuth describe the direction of trace offset for maximum similarity and can yield finely detailed images of bed surfaces. There are now over two hundred attributes in use in some geophysical processing or interpretation software packages (Chen and Sidney, 1997); many of these attributes result from slightly differing approaches to determining a specific property, such as frequency or amplitude. Care must be taken in applying traditional attribute analysis in thin-bed areas, where interference from the thin beds themselves can obscure the traditional attribute interpretations (see the section on ultra-thin beds below for more details).
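For reference, the classic Hilbert-transform attributes named above can be computed in a few lines; this is a minimal sketch using a synthetic trace, whereas on real data the same computation would be applied trace by trace through the 3-D volume.

import numpy as np
from scipy.signal import hilbert

dt = 0.002                                       # 2 ms sampling (assumed)
t = np.arange(0.0, 1.0, dt)
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) ** 2) / 0.02)

analytic = hilbert(trace)                        # complex (analytic) trace
inst_amp = np.abs(analytic)                      # instantaneous amplitude (envelope)
inst_phase = np.unwrap(np.angle(analytic))       # instantaneous phase, radians
inst_freq = np.diff(inst_phase) / (2 * np.pi * dt)   # instantaneous frequency, Hz

print(f"peak envelope: {inst_amp.max():.2f}")
print(f"median instantaneous frequency: {np.median(inst_freq):.1f} Hz")

For this narrowband 30 Hz test signal the median instantaneous frequency comes out at about 30 Hz, as expected.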
Well Calibration:
With so many attributes available to choose from, it is vital that the reservoir geophysicist make careful use of calibration at wellbores, using the log data, core data, and borehole seismic information available to test the correlation of attributes with rock properties. Again, the reservoir geophysicist enjoys significant advantages over the exploration geophysicist, who cannot always tie the seismic data and its character (attributes) to properties of the formation as evidenced by the well data. It is important that the reservoir geophysicist make use of all the information and expertise available within the asset team to provide the tightest possible calibration; otherwise, the advantage of performing reservoir geophysical studies is lost. It is simple to correlate the attribute of interest with the well-log (or log-derived) data of interest; a strong correlation between, say, seismic amplitude and porosity is often enough to convince many workers that the correlation is meaningful and that seismic amplitude can be used as a proxy for porosity in reservoir characterization. There are many potential pitfalls in this approach, as one may imagine (Kalkomey, 1997; Hirsche et al., 1998). Statistical tests should be performed on the well correlations, and geologic inference should be brought in to test the reasonableness of the results and, most importantly, the physical basis for the behavior of an observed attribute.
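A minimal sketch of such a statistical test is given below, assuming a Pearson correlation between horizon amplitude and log-derived porosity at a handful of wells; the well values are invented for illustration (cf. Kalkomey, 1997).

import numpy as np
from scipy.stats import pearsonr

# Horizon amplitude extracted at each well vs. log-derived porosity (invented values)
amplitude = np.array([0.21, 0.35, 0.28, 0.44, 0.31, 0.52, 0.26, 0.39])
porosity = np.array([0.12, 0.18, 0.15, 0.22, 0.14, 0.25, 0.16, 0.19])

r, p_value = pearsonr(amplitude, porosity)
print(f"r = {r:.2f}, p = {p_value:.4f} (n = {amplitude.size} wells)")

With only a few wells, a high correlation coefficient can easily arise by chance, so a large p-value should discourage the use of the attribute as a porosity proxy, however attractive the crossplot may look.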
Geostatistics:
In reservoir characterization, the asset team usually has a number of wells at its disposal from which to draw inferences about the reservoir in general. With the availability of these wells comes a dilemma: how does one make use of the spatial distribution of the data at hand? Simple averaging between wells can easily lead to misleading results, and a technique called kriging was developed for use when features can be observed to correlate over certain distances. The technique has been refined to include the use of data that provide additional soft evidence between the hard data locations at wells, and seismic data often provide that soft evidence. Essentially, if a statistical (and physically meaningful) correlation is found to exist between formation parameters observed at wells and some seismic attribute observed throughout the study area, geostatistical techniques are available that allow the hard data at the wells to be honored and interpolated (generally using kriging techniques) between the wells, while honoring the seismic interpretation to a greater or lesser degree. In the absence of seismic data, various realizations of the possible interwell regions can be generated using advanced geostatistical techniques, each realization being just as likely to occur as any other. But in the presence of seismic data with reliable predictive capabilities, the range of such models can be greatly reduced. The problem of reservoir characterization then becomes less stochastic and more deterministic, although the correlations are never perfect, and a range of likely models should always be considered. A number of good references exist from which one can learn geostatistical approaches, including Dubrule (1998), Jensen et al. (1997), and Isaaks and Srivastava (1989). A good collection of case histories is presented by Yarus and Chambers (1995).
Ultra-thin Beds:
In recent years, a couple of techniques in particular have been developed that appear to help the interpreter identify properties of extremely thin beds, well below what has traditionally been considered the quarter-wavelength resolution of seismic data. These techniques make use of the various frequency components within a band-limited seismic wavelet; one operates in the frequency domain, and the other in the time domain. The frequency-domain approach, called spectral decomposition (see, for example, Partyka et al., 1999), looks for notches in the frequency band representing a sort of ghost signal from the interference of the reflections from the top and bottom of the thin bed. The notches repeat at a frequency interval equal to the inverse of the (two-way) time thickness of the bed (see the sketch at the end of this section). Because the seismic wavelet contains frequencies well above the predominant frequency, spectral notches can be indicative of extremely thin beds. The thinning out of a channel or shoreline, for example, can be observed by mapping the locations of successively higher-frequency notches in the spectrum. The time-domain approach involves matching wavelet character, often using a neural-network technique (Poupon et al., 1999); the wavelet along a given horizon can be classified into several different wavelets, perhaps differing from each other only in subtle ways. The resulting map of classified wavelets can often resemble a map of the geologic feature being sought. The classification tends to compare relative amplitudes (side lobes versus main lobes, for example), shoulders on a main peak or trough, or slight changes in period, and therefore often responds to interference from features below wavelet resolution. Both of these techniques run the risk of leading to incorrect interpretations if seismic petrophysical modeling is not performed to direct the analysis and interpretation or to confirm the results. It is becoming increasingly easy for a reservoir geophysicist to use advanced computer programs as black boxes that provide a pretty picture and thereby be lulled into a false sense of security in the interpretation. Fortunately, most software packages currently available include the modeling capabilities required to test the results, but the tests are only as complete as the reservoir geophysicist is able to make them.
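The notch-thickness relationship invoked above can be checked with a short sketch: a thin bed with opposite-polarity top and base reflections notches the amplitude spectrum at multiples of 1/t, where t is the two-way time thickness. The 8 ms thickness below is an assumed value.

import numpy as np

dt = 0.001                                   # 1 ms sampling
n = 1024
t_bed = 0.008                                # two-way time thickness, 8 ms (assumed)
rc = np.zeros(n)
rc[100] = 1.0                                # reflection from the top of the bed
rc[100 + int(t_bed / dt)] = -1.0             # opposite-polarity reflection from the base

spec = np.abs(np.fft.rfft(rc))               # |spectrum| ~ |sin(pi * f * t_bed)|
freqs = np.fft.rfftfreq(n, dt)
notch = freqs[np.argmin(spec[1:200]) + 1]    # first notch above DC
print(f"first spectral notch near {notch:.0f} Hz -> thickness ~ {1000 / notch:.0f} ms")

In this example the first notch falls at 125 Hz, the inverse of the 8 ms thickness; mapping such notches across a horizon is the essence of the thin-bed application of spectral decomposition.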
Focused Approaches:
Because the good reservoir geophysicist has analyzed the target of the study, has calibrated legacy seismic data to wells, and has investigated the seismic petrophysical responses of the various scenarios anticipated in the reservoir, there is an opportunity to collect that data, and only that data, which will be required to observe the features of interest. For example, one could collect, say, only far-offset seismic data if one were convinced that the far offsets contained all the information that was essential to the study (Houston and Kinsland, 1998). It is not clear that such highly focused approaches are being used, however, probably because the cost savings do not warrant the added risk of missing an important piece of data. There may also be a natural aversion to collecting, purposefully, data that are not as good or complete as conventionally acquired seismic data, even though this approach would be a good marriage of the scientific method (collect data that is designed to support or disprove a hypothesis) and engineering pragmatism (get the job done, and produce hydrocarbons in a timely and efficient manner).
Borehole Geophysics:
The reservoir geophysicist not only has the advantage of using well data for correlation; the advantage also extends to using those wells for the collection of novel geophysical data, from below the noisy surface or weathered zone and very close to the target itself. New techniques for acquisition of seismic data from within wellbores are available and may become important tools in the arsenal of the reservoir geophysicist in the near future. The seismic sources and/or receivers can be placed in one well, in neighboring wells, or on the surface, and the object of the analysis can be either the velocity field or the detailed reflection image near the wells. In order to qualify as borehole geophysics, at least either the source or the receiver must be in a wellbore; beyond that, almost as many geometrical arrangements as can be imagined have been tested or seriously proposed.
In some wells, log data can only be found behind casing, due to the inability to log open hole through the depths in which shales are flowing or collapsing.
Methodology:
1. Case studies of various petroliferous areas related to the topic will be studied.
2. The basics of petrophysics and seismic interpretation will be covered in the project.
3. Diagrams will be shown where required, using all the data available.

Time progress chart:
1. 1st November to 6th November: collection of various information regarding the topic.
2. 6th November to 18th November: study of the various data and information collected.
3. 18th November to 25th November: discussion with the project mentor.
4. 25th November to 2nd December: final touches and editing of the project.
Conclusion:
By using various petrophysical parameters during seismic interpretation, we can study the characteristics of the reservoir. We can establish porosity, permeability, and other such petrophysical parameters to understand the detailed characteristics of our reservoir. Petrophysical parameters are used during seismic interpretation to improve the quality of the interpretation. Good synthetic seismograms, which are very important for seismic interpretation, can be developed using petrophysical data.
References:
1. Pennington, Wayne D., Reservoir Geophysics.
2. Walls, Joel, Dvorkin, Jack, and Carr, Matt (Rock Solid Images), Well Logs and Rock Physics in Seismic Reservoir Characterization.