Reservoir Geophysics
Geophysical techniques depend heavily on data and statistics. NoDoC includes a range of models, databases, and assemblies for these methods. The subjects considered in NoDoC are explained below. It should be noted that NoDoC supports only those parts of the techniques that are related to cost estimation. The NoDoC scope in this section covers:
• Required model resolution
• Depositional-environment variables
• Availability of facies proportions
• Availability of seismic-log calibration
• Porosity-permeability relationship
• Net-pay cutoff
• Generation of horizon and fault surfaces
• 3D stratigraphic grid construction
• Spatial statistical analysis
• Facies distribution
• Distribution of seismic/petrophysical relationships
• Permeability distribution
• Net-to-gross estimation (a minimal sketch of the cutoff-based calculation follows this list)
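Several of the items above, such as the net-pay cutoff and net-to-gross estimation, reduce in practice to simple calculations on sampled logs. A minimal sketch in Python; the cutoff values and synthetic logs are illustrative assumptions, not NoDoC defaults:

```python
import numpy as np

def net_to_gross(porosity, vshale, phi_cut=0.08, vsh_cut=0.40):
    """Fraction of regularly sampled log intervals passing both cutoffs."""
    pay = (porosity >= phi_cut) & (vshale <= vsh_cut)
    return pay.mean()

# Synthetic porosity and shale-volume logs (one value per depth sample)
rng = np.random.default_rng(0)
phi = rng.uniform(0.02, 0.30, 500)
vsh = rng.uniform(0.00, 0.80, 500)
print(f"N/G = {net_to_gross(phi, vsh):.2f}")
```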
The concept of petroleum reservoir geophysics is relatively new. In the past, the role of geophysics was largely confined to exploration and, to a lesser degree, the development of discoveries. As cost-efficiency has taken over as a driving force in the economics of the oil and gas industry and as major assets near abandonment, geophysics has increasingly been recognized as a tool for improving the bottom line closer to the wellhead. The reliability of geophysical surveys, particularly seismic, has greatly reduced the risk associated with drilling wells in existing fields, and the ability to add geophysical constraints to statistical models has provided a mechanism for directly delivering geophysical results to the reservoir engineer.
DIFFERENCES BETWEEN EXPLORATION AND RESERVOIR GEOPHYSICS
There are several specific differences between exploration geophysics and reservoir geophysics, as the term is usually intended. These include the assumptions that well control is available within the area of the geophysical survey, that a well-designed geophysical survey can be conducted at a level of detail that will be useful, and that some understanding of the rock physics is available for interpretation.
WELL CONTROL
In exploration, we often must extrapolate well data from far outside the area of interest, crossing faults, sequence boundaries, and occasionally worse discontinuities. The availability of “analogs” is an important component of exploration, and the level of confidence in the resulting interpretation is necessarily limited. In reservoir geophysics, it is generally assumed that a reservoir is already under production (or at least at a late stage of development) and that wells are available for analysis.
These wells provide a variety of information. From the petrophysicist, we receive edited and interpreted well log data, describing the lithology (including the mineralogy, porosity, and perhaps even the morphology of the pore spaces), the fluid content (sometimes related to logged conditions, sometimes to virgin reservoir conditions), and detailed depth constraints on geologic horizons. From the production and reservoir engineers, we receive an estimate of the proximity to boundaries, aquifers, or other features of interest. The reservoir engineer can also provide a good estimate of the total volume of the reservoir, and the asset team relates this to the geologic interpretation, determining the need for surveys at increased resolution. From a combination of sources, we obtain additional information about the in-situ conditions of the reservoir, including the formation temperature, pressure, and the properties of the oil, gas, and brine. The geophysicist should be familiar with the usefulness and limitations of petrophysical and reservoir engineering studies, and should be able to ask intelligent questions of the experts in those fields. But the geophysicist need not become an expert in those areas in order to work with the specialists and to design a new experiment to solve reservoir problems.
ROCK PHYSICS CONTROL
One of the major questions a geophysicist is asked, or should ask independently, is this: Will the geophysical technique being proposed be able to differentiate between the competing reservoir models sufficiently well to be worth the effort and cost? The answer lies not just in the geophysical model, but in the rock physics—or the “seismic petrophysics”—of the reservoir rock and neighboring formations. The presence of wells and the possibility that some core samples are available greatly improve the capability of the reservoir geophysicist to address this question. Logs, particularly sonic logs of compressional and shear velocities combined with image logs providing fracture information, can be used (carefully) to provide basic seismic properties, which in turn are modeled for varying lithologic character, fluid content, and in-situ conditions (such as pore pressure). The core samples can be used to provide the basis for a theoretical framework, or measurements on them can be used (again, carefully) to provide the same basic seismic properties. The geophysicist must always be on the alert for accidental misuse of the input data, and concerned with scaling properties, particularly the possibility that physical effects observed at one scale (such as the squirt-flow mechanism for saturated rocks at high frequencies) might be mistakenly applied at other scales. Sometimes, a little knowledge can be a dangerous weapon; an incomplete evaluation of the seismic petrophysical aspects of the formation can lead to incorrect results or interpretations.
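The fluid-content modeling described here is commonly done with Gassmann's equation, which predicts the saturated bulk modulus from the dry-frame, mineral, and fluid moduli (the shear modulus is unaffected by the pore fluid). A minimal sketch; the moduli and densities below are illustrative sandstone values, not measurements:

```python
import numpy as np

def gassmann_ksat(k_dry, k_min, k_fl, phi):
    """Saturated bulk modulus from Gassmann's equation (moduli in GPa)."""
    num = (1.0 - k_dry / k_min) ** 2
    den = phi / k_fl + (1.0 - phi) / k_min - k_dry / k_min ** 2
    return k_dry + num / den

# Illustrative sandstone frame; compare gas-filled and brine-filled pores.
k_dry, k_min, mu, phi = 12.0, 37.0, 14.0, 0.22          # GPa, GPa, GPa, fraction
rho_grain, rho_brine, rho_gas = 2650.0, 1050.0, 250.0   # kg/m^3

for name, k_fl, rho_fl in [("gas", 0.04, rho_gas), ("brine", 2.8, rho_brine)]:
    k_sat = gassmann_ksat(k_dry, k_min, k_fl, phi)      # shear modulus unchanged
    rho = (1 - phi) * rho_grain + phi * rho_fl
    vp = np.sqrt((k_sat + 4 * mu / 3) * 1e9 / rho)      # GPa -> Pa
    print(f"{name:5s}: Vp = {vp:.0f} m/s")
```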
SURVEY DESIGN
Once a field has been discovered, developed, and under production for some time, quite a bit of information is available to the geophysicist to design a geophysical survey in such a manner as to maximize the likelihood that the data collected will optimize the interpretation. That is, if the goal of the survey is to define the structural limits of the field, a 3-D seismic survey can be designed with that in mind. If, however, the goal of the survey is to define the extent of a gas zone, the geophysicist may be able to use log data, seismic petrophysical modeling, and old (legacy) seismic data to determine whether a certain offset range is required to differentiate between the water and gas zones. If highly accurate well ties or wavelet-phase control are needed, an appropriately placed vertical seismic profile (VSP) may be designed. Or, if an acquisition footprint had been observed in a previously acquired seismic data set and that footprint obscured the attributes used to define the reservoir target, the geophysicist can design the new survey to eliminate the troublesome artifacts. In short, the fact that the target is well known gives the reservoir geophysicist a distinct advantage over the exploration geophysicist by allowing the survey to be designed in a more enlightened manner than a typical exploration survey ever can be. It is often easier to justify the expense of a properly conducted seismic survey for reservoir characterization purposes because the financial impact of the survey can be calculated with greater confidence and the financial returns realized more quickly than is typically the case for exploration seismic surveys. Procedures for planning 3-D seismic surveys have been undergoing rapid change over the past few years.
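As an example of the offset-range question raised above, the two-term Shuey approximation to the Zoeppritz equations predicts how reflection amplitude varies with incidence angle; comparing the predicted responses of a brine-sand and a gas-sand interface indicates whether the available offsets can separate the two cases. A sketch with illustrative, not measured, interface properties:

```python
import numpy as np

def shuey_two_term(vp1, vs1, rho1, vp2, vs2, rho2, theta_deg):
    """Two-term Shuey approximation: R(theta) ~ A + B sin^2(theta)."""
    vp, vs, rho = (vp1 + vp2) / 2, (vs1 + vs2) / 2, (rho1 + rho2) / 2
    dvp, dvs, drho = vp2 - vp1, vs2 - vs1, rho2 - rho1
    a = 0.5 * (dvp / vp + drho / rho)                       # intercept
    b = 0.5 * dvp / vp - 2 * (vs / vp) ** 2 * (drho / rho + 2 * dvs / vs)
    return a + b * np.sin(np.radians(theta_deg)) ** 2

theta = np.arange(0, 41, 5)                                 # incidence angles (deg)
# Shale over brine sand vs. shale over gas sand (illustrative properties)
r_brine = shuey_two_term(2800, 1400, 2.45, 3000, 1600, 2.30, theta)
r_gas   = shuey_two_term(2800, 1400, 2.45, 2700, 1650, 2.10, theta)
print(np.round(r_gas - r_brine, 3))     # separation grows with angle/offset
```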
3-D SEISMIC
Most reservoir geophysics is based on reflection seismic data, although a wide variety of other techniques are employed regularly on specific projects. Almost all seismic data collected for reservoir studies is high-fold 3-D vertical-receiver data; however, the use of converted-wave data with multicomponent geophones on land and on the sea floor, and multicomponent sources (on land), is increasing. In particular, in order to image below gas clouds that obscure P-wave imaging of reservoirs, converted waves are now being used, and the technology to obtain multicomponent data from the ocean bottom is continually improving. The importance of fractures in many reservoir development schemes has led to a number of experimental programs for multicomponent sources and receivers in an effort to identify shear-wave splitting (and other features) associated with high fracture density. Some of these techniques will find continually increasing application in the future, but at present, most surface seismic studies designed to characterize existing reservoirs are high-quality 3-D vertical-component receiver surveys.
ATTRIBUTES
In most exploration and reservoir seismic surveys, the main objectives are (in order) to correctly image the structure in time and depth, and to correctly characterize the amplitudes of the reflections in both the stacked and prestack domains. From these data, a host of additional features can be derived and used in interpretation. Collectively, these features are referred to as seismic attributes. The simplest attribute, and the one most widely used, is seismic amplitude, and it is usually reported as the maximum (positive or negative) amplitude value at each common midpoint (CMP) along a horizon picked from a 3-D volume. It is fortunate that, in many cases, the amplitude of a reflection corresponds directly to the porosity of the underlying formation, or perhaps to the density (and compressibility) of the fluid occupying pore spaces in that formation. The assumption is that amplitude is proportional to R0, the normal-incidence reflection coefficient, and the simple convolution model is often appropriate for interpretation of the data in such cases. But it isn’t always this simple, and many mistakes of interpretation have occurred by making this assumption. For one thing, the convolution model may not be appropriate for use in many instances, particularly if the offset dependence of a reflection is important in its interpretation. Likewise, the interpretation of porosity or fluid properties as the cause of a true impedance change is often overly optimistic, especially in sands containing clays or in rocks with fractures.
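The convolution model referred to here treats a seismic trace as a reflectivity series convolved with a wavelet, so that amplitude picked on a horizon tracks R0 when the model holds. A minimal sketch using a Ricker wavelet; the reflector positions and reflection coefficients are arbitrary illustrations:

```python
import numpy as np

def ricker(f, dt, n=129):
    """Zero-phase Ricker wavelet of peak frequency f (Hz), n odd samples."""
    t = (np.arange(n) - n // 2) * dt
    a = (np.pi * f * t) ** 2
    return (1 - 2 * a) * np.exp(-a)

dt = 0.002                               # 2 ms sample interval
rc = np.zeros(501)                       # reflectivity (R0) series
rc[150], rc[300] = 0.10, -0.08           # two arbitrary reflectors
trace = np.convolve(rc, ricker(30.0, dt), mode="same")
print(trace[148:153].round(4))           # peak amplitude tracks R0 at sample 150
```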
The use of seismic attributes extends well beyond simple amplitudes. Most of the “original” seismic attributes were based on the Hilbert transform and consisted of the instantaneous amplitude (or amplitude of the wave envelope), the instantaneous phase (most useful for accurate time picking), and the instantaneous frequency (probably most often associated with thin-bed reverberations, but often interpreted, perhaps incorrectly, as resulting from attenuation due to gas bubbles). Variations on these attributes evolved, and other classes of attributes came into use. For example, coherence is the attribute of waveform similarity among neighboring traces and is often used to identify fractures (Marfurt et al., 1998). Dip and azimuth describe the direction of trace offset for maximum similarity and can yield finely detailed images of bed surfaces. There are now over two hundred attributes in use in some geophysical processing or interpretation software packages; many of these attributes result from slightly differing approaches to determining a specific property, such as frequency or amplitude. Care must be taken in applying traditional attribute analysis in thin-bed areas, where the interference from the thin beds themselves can obscure the traditional attribute interpretations.
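The Hilbert-transform attributes mentioned above are straightforward to compute from the complex (analytic) trace. A sketch using scipy.signal.hilbert, with a synthetic modulated trace standing in for real data:

```python
import numpy as np
from scipy.signal import hilbert

dt = 0.002
t = np.arange(0, 1, dt)
# Synthetic stand-in for a trace: 30 Hz oscillation under a Gaussian envelope
trace = np.sin(2 * np.pi * 30 * t) * np.exp(-((t - 0.5) ** 2) / 0.01)

analytic = hilbert(trace)                       # complex (analytic) trace
env = np.abs(analytic)                          # instantaneous amplitude
phase = np.unwrap(np.angle(analytic))           # instantaneous phase (radians)
freq = np.gradient(phase, dt) / (2 * np.pi)     # instantaneous frequency (Hz)
print(f"peak envelope {env.max():.2f}, median inst. freq {np.median(freq):.1f} Hz")
```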
WELL CALIBRATION
With so many attributes available to choose from, it is vital that the reservoir geophysicist make careful use of calibration at wellbores, using the log data, core data, and borehole seismic information available to test the correlation of attributes with rock properties. Again, the reservoir geophysicist enjoys significant advantages over the exploration geophysicist, who cannot always tie the seismic data and its character (attributes) to properties of the formation as evidenced from the well data. It is important that the reservoir geophysicist make use of all the information and expertise available within the asset team to provide the tightest possible calibration; otherwise, the advantage of performing reservoir geophysical studies is lost. It is simple to correlate the attribute of interest with the well-log (or log-derived) data of interest; a strong correlation between, say, seismic amplitude and porosity is often enough to convince many workers that the correlation is meaningful and that seismic amplitude can be used as a proxy for porosity in reservoir characterization. There are many potential pitfalls in this approach, as one may imagine. Statistical tests should be performed on the well correlations, and geologic inference should be brought in to test the reasonableness of the results and, most importantly, the physical basis for the behavior of an observed attribute.
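A simple form of the statistical test recommended here is to attach a significance level to the attribute-log correlation rather than reporting the correlation coefficient alone. A sketch with hypothetical well values (eight wells is deliberately few, to make the point that small samples demand strong evidence):

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical calibration: horizon amplitude at each well vs. log porosity
amplitude = np.array([0.11, 0.14, 0.09, 0.18, 0.16, 0.12, 0.20, 0.10])
porosity  = np.array([0.16, 0.19, 0.14, 0.24, 0.21, 0.17, 0.26, 0.15])

r, p = pearsonr(amplitude, porosity)
# With only eight wells, insist on a small p-value before trusting r.
print(f"r = {r:.2f}, p = {p:.4f}")
```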
GEOSTATISTICS
In reservoir characterization, the asset team usually has a number of wells at its disposal from which to draw inferences about the reservoir in general. With the availability of these wells comes a dilemma: How do you make use of the spatial distribution of the data at hand? Simple averaging between wells can easily be seen to lead to misleading results, and a technique called kriging was developed for use when features can be observed to correlate over certain distances. The technique has been refined to include the use of data that provides additional “soft” evidence between the “hard” data locations at wells, and seismic data often provides that soft evidence. Essentially, if a statistical (and physically meaningful) correlation is found to exist between formation parameters observed at wells and some seismic attribute observed throughout the study area, geostatistical techniques are available that allow the hard data at the wells to be honored and to be interpolated (generally using kriging techniques) between the wells, while honoring the seismic interpretation to a greater or lesser degree. In the absence of seismic data, various “realizations” of the possible interwell regions can be generated using advanced geostatistical techniques, each realization being just as likely to occur as any other. But in the presence of seismic data with reliable predictive capabilities, the range of such models can be greatly reduced. The problem of reservoir characterization then can become less stochastic and more deterministic, although the correlations are never perfect, and a range of likely models should always be considered.
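Ordinary kriging, the baseline technique mentioned here, solves a small linear system whose weights honor the hard well data exactly. A minimal sketch assuming a spherical variogram with an illustrative sill and range; a real study would fit the variogram to data and use an established geostatistics library rather than a hand-rolled solver:

```python
import numpy as np

def ordinary_kriging(xy, z, xy0, sill=1.0, a=800.0):
    """Ordinary kriging of one target point with a spherical variogram."""
    def gamma(h):                                   # semivariogram model
        h = np.minimum(h / a, 1.0)
        return sill * (1.5 * h - 0.5 * h ** 3)

    n = len(z)
    d = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=2)
    A = np.ones((n + 1, n + 1))
    A[:n, :n] = gamma(d)
    A[n, n] = 0.0                                   # Lagrange-multiplier row/col
    b = np.ones(n + 1)
    b[:n] = gamma(np.linalg.norm(xy - xy0, axis=1))
    w = np.linalg.solve(A, b)
    return w[:n] @ z                                # weights sum to 1, honor wells

wells = np.array([[0.0, 0.0], [1000.0, 0.0], [500.0, 900.0]])  # well x, y (m)
phi = np.array([0.18, 0.22, 0.15])                             # porosity at wells
est = ordinary_kriging(wells, phi, np.array([400.0, 300.0]))
print(f"kriged porosity at (400, 300): {est:.3f}")
```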
ULTRA-THIN BEDS
In recent years, a couple of techniques in particular have been developed that appear to help the interpreter identify properties of extremely thin beds, well below what has traditionally been considered the quarter-wavelength resolution of seismic data. These techniques make use of the various frequency components within a band-limited seismic wavelet; one operates in the frequency domain, and the other in the time domain. The frequency-domain approach, called spectral decomposition, looks for notches in the frequency band representing a sort of ghost signal from the interference of the reflections from the top and bottom of the thin bed. The frequency at which that ghost, or spectral notch, occurs is inversely related to the (two-way) time thickness of the bed. Because the seismic wavelet contains frequencies well above the predominant frequency, spectral notches can be indicative of extremely thin beds. The thinning out of a channel or shoreline, for example, can be observed by mapping the locations of successively higher-frequency notches in the spectrum. The time-domain approach involves matching wavelet character, often using a neural-network technique; the wavelet along a given horizon can be classified into several different wavelet classes, perhaps differing from each other only in subtle ways. The resulting map of classified wavelets can often resemble a map of the geologic feature being sought. The classification tends to compare relative amplitudes (side lobes versus main lobes, for example), “shoulders” on a main peak or trough, or slight changes in period, and therefore often responds to interference from features below wavelet resolution. Both of these techniques run the risk of leading to incorrect interpretations if seismic petrophysical modeling is not performed to direct the analysis and interpretation or to confirm the results. It is becoming increasingly easy for a reservoir geophysicist to use advanced computer programs as black boxes that provide a pretty picture and thereby be lulled into a false sense of security in the interpretation. Fortunately, most software packages currently available include the modeling capabilities required to test the results, but the tests are only as complete as the reservoir geophysicist is able to make them.
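The notch relationship is easy to verify numerically: a thin bed modeled as two opposite-polarity reflection coefficients separated by its two-way time thickness produces spectral notches spaced at the inverse of that thickness. A sketch:

```python
import numpy as np

dt, nt = 0.001, 1024
t_thick = 0.020                           # two-way time thickness: 20 ms
r = np.zeros(nt)
r[100] = 1.0                              # top-of-bed reflection
r[100 + round(t_thick / dt)] = -1.0       # base of bed, opposite polarity

spec = np.abs(np.fft.rfft(r))
f = np.fft.rfftfreq(nt, dt)
is_min = (spec[1:-1] < spec[:-2]) & (spec[1:-1] < spec[2:])
print(f[1:-1][is_min][:3])                # notches near multiples of 1/0.020 = 50 Hz
```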
FOCUSED APPROACHES
Because the good reservoir geophysicist has analyzed the target of the study, has calibrated legacy seismic data to wells, and has investigated the seismic petrophysical responses of the various scenarios anticipated in the reservoir, there is an opportunity to collect that data, and only that data, which will be required to observe the features of interest. For example, one could collect, say, only far-offset seismic data if one were convinced that the far offsets contained all the information that was essential to the study. It is not clear that such highly focused approaches are being used, however, probably because the cost savings do not warrant the added risk of missing an important piece of data. There may also be a natural aversion to collecting, purposefully, data that are not as “good” or “complete” as conventionally acquired seismic data, even though this approach would be a good marriage of the scientific method (collect data that is designed to support or disprove a hypothesis) and engineering pragmatism (get the job done, and produce hydrocarbons in a timely and efficient manner).
BOREHOLE GEOPHYSICS
The reservoir geophysicist not only has the advantage of using well data for correlation; the advantage also extends to using those wells for the collection of novel geophysical data, from below the noisy surface or weathered zone and very close to the target itself. New techniques for acquisition of seismic data from within wellbores are available, and may become important tools in the arsenal of the reservoir geophysicist in the near future. The seismic sources and/or receivers can be placed in one well, in neighboring wells, or on the surface, and the object of the analysis can be either the velocity field or the detailed reflection image near the wells. In order to qualify as borehole geophysics, at least either the source or the receiver must be in a wellbore; beyond that, almost as many geometrical arrangements as can be imagined have been tested or seriously proposed.
VSPs, CHECK SHOTS, SONIC LOGGING, AND THROUGH-CASING SONIC LOGGING
The more conventional borehole geophysical techniques include VSPs, check-shot surveys, traditional sonic logging, and sonic logging through casing. All of these techniques were developed primarily to assist in the tie between surface seismic data and well observations, but they have been extended beyond that in many cases. VSPs provide the best data for detailed event identification and wavelet determination (including phase), but they can also be used to image the near-wellbore environment, and the image can be improved if a number of offsets are used for the source location. Modern sonic logging tools can provide a good measure of compressional and shear velocities, values required for the calibrated study of the effect of fluid substitution on seismic data; of course, the interpreter must be careful to know whether the data represent invaded or uninvaded conditions, and make appropriate corrections if necessary. And modern sonic logging tools can often provide reliable values for velocities through casing; often, the most reliable figures for soft shale can only be found behind casing because of the inability to log open hole at depths where shales are flowing or collapsing.
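The well tie underlying all of these techniques rests on a time-depth relationship, which can be sketched by integrating sonic slowness over depth (in practice it would be calibrated against check-shot or VSP times rather than used raw). A minimal illustration with a hypothetical two-layer slowness log:

```python
import numpy as np

# Hypothetical sonic log: slowness in us/ft, regularly sampled in depth
dz_ft = 0.5                                  # sample interval (ft)
slowness = np.full(2000, 90.0)               # slow shale section
slowness[1200:] = 60.0                       # faster sand below

owt_us = np.cumsum(slowness * dz_ft)         # integrated one-way time (us)
twt_ms = 2 * owt_us * 1e-3                   # two-way time (ms)
depth_ft = dz_ft * np.arange(1, len(slowness) + 1)
print(f"TWT at {depth_ft[-1]:.0f} ft: {twt_ms[-1]:.0f} ms")
```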
CROSSWELL, RVSP, AND SINGLE-WELL IMAGING
Recent extensions of borehole geophysical techniques involve placing a powerful seismic source in one well; the receivers may be in another well (crosswell seismic), on the surface [reverse VSP (RVSP)], or in the same well at some distance from the source (single-well imaging). Images have been created from data collected in experiments using such tool placement, and the time required for acquisition, the time required for data processing, and the cost of the entire operation have all dropped to a point where the techniques may be considered commercially, not just experimentally. A few years ago, the only crosswell seismic technique in use was tomography, which, while providing a valid representation of the velocity of the interwell region, did not provide a detailed image. Currently, tomographic techniques are often used to provide the velocity information for the production of a highly detailed reflection image between (and beneath) the two wells in crosswell reflection programs. Sources powerful enough to provide useful RVSP data have only recently become available, but a few early studies indicate that the potential for such technology is tremendous for imaging detailed structure in the vicinity of a well. Single-well imaging, although not yet widespread, may provide a useful tool for detailed close-up structural studies, such as salt proximity studies designed to assist in the planning of a development sidetrack from an exploration well, particularly in the deepwater environment.
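A toy version of crosswell traveltime tomography illustrates the inversion these techniques perform: straight rays between source and receiver depths, a layered slowness model, and a least-squares solve. Real surveys use bent rays, regularization, and far denser geometry; everything below is an illustrative assumption:

```python
import numpy as np

# Two vertical wells 100 m apart; interwell region as 10 horizontal layers.
nz, x_sep, dz = 10, 100.0, 10.0
depths = np.linspace(5, 95, 10)          # source depths = receiver depths (m)

s_true = np.full(nz, 1 / 2500.0)         # background slowness (s/m)
s_true[4:6] = 1 / 2000.0                 # slow anomaly (e.g., gas-charged zone)

G, t = [], []
for zs in depths:                        # sources in well A
    for zr in depths:                    # receivers in well B
        ray_len = np.hypot(x_sep, zr - zs)
        z_lo, z_hi = min(zs, zr), max(zs, zr)
        row = np.zeros(nz)
        if z_hi - z_lo < 1e-9:           # horizontal ray: one layer only
            row[int(z_lo // dz)] = ray_len
        else:                            # split ray length by vertical overlap
            for k in range(nz):
                ov = max(0.0, min((k + 1) * dz, z_hi) - max(k * dz, z_lo))
                row[k] = ray_len * ov / (z_hi - z_lo)
        G.append(row)
        t.append(row @ s_true)           # synthetic noise-free traveltime

s_est = np.linalg.lstsq(np.array(G), np.array(t), rcond=None)[0]
print(np.round(1 / s_est))               # recovered layer velocities (m/s)
```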
PASSIVE SEISMIC MONITORING
In recent years, the mechanical response of reservoir host rocks has been studied in some detail, prompted in part by the dramatic subsidence observed at the Ekofisk platform in the North Sea, although studies relating earthquakes to oil and gas production and injection practices had previously been published in the scientific and earthquake literature. Earthquake monitoring (called passive monitoring because the geophysicist does not activate a seismic source) has become more precise and accurate, even at low levels of seismicity, largely because of the placement of geophones downhole, away from surface noise and closer to the sources of seismic energy. As reservoir host rocks are stressed during the production (and/or injection) of fluids and the accompanying changes in fluid pressure, small (and occasionally large) earthquake-like events occur, representing shear failure along planes of weakness; these can occur at pressures well below the reservoir engineer’s “parting” pressure for tensile failure. In some detailed studies, very small events seem to indicate patterns and locations of fracture systems responsible for oil migration. Passive seismic monitoring and surface tilt observations during hydraulic fracturing have led to improved reservoir development in a number of cases. Both techniques of hydraulic-fracture monitoring have become nearly routine in the industry (that is, they are no longer experimental) and can be applied where appropriate.