ProMAX3D Tutorial
ProMAX3D Tutorial
626072 Rev. C
March 1999
Copyright 1999 Landmark Graphics Corporation All Rights Reserved Worldwide This publication has been provided pursuant to an agreement containing restrictions on its use. The publication is also protected by Federal copyright law. No part of this publication may be copied or distributed, transmitted, transcribed, stored in a retrieval system, or translated into any human or computer language, in any form or by any means, electronic, magnetic, manual, or otherwise, or disclosed to third parties without the express written permission of: Landmark Graphics Corporation 15150 Memorial Drive, Houston, TX 77079, U.S.A. Phone: 713-560-1000 FAX: 713-560-1410
Trademark Notices Landmark, OpenWorks, SeisWorks, ZAP!, PetroWorks, and StratWorks are registered trademarks of Landmark Graphics Corporation. Pointing Dispatcher, Log Edit, Fast Track, SynTool, Contouring Assistant, TDQ, RAVE, 3DVI, SurfCube, SeisCube, VoxCube, Z-MAP Plus, ProMAX, ProMAX Prospector, ProMAX VSP, MicroMAX, and Landmark Geo-dataWorks are trademarks of Landmark Graphics Corporation. Technology for Teams is a service mark of Landmark Graphics Corporation. ORACLE is a registered trademark of Oracle Corporation. IBM is a registered trademark of International Business Machines, Inc. AIMS is a trademark of GX Technology. Motif, OSF, and OSF/Motif are trademarks of Open Software Corporation. UNIX is a registered trademark of UNIX System Laboratories, Inc. SPARC, SPARCstation, Sun, SunOs and NFS are trademarks of SUN Microsystems. X Window System is a trademark of the Massachusetts Institute of Technology. SGI is a trademark of Silicon Graphics Incorporated. All other brand or product names are trademarks or registered trademarks of their respective companies or organizations.
Note The information contained in this document is subject to change without notice and should not be construed as a commitment by Landmark Graphics Corporation. Landmark Graphics Corporation assumes no responsibility for any error that may appear in this manual. Some states or jurisdictions do not allow disclaimer of expressed or implied warranties in certain transactions; therefore, this statement may not apply to you.
Contents
Agenda . . . . . . . . . . . . . . . . . . . . Agenda-1
Monday . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-1
Introductions, Course Outline, and Miscellaneous Topics . . . . Agenda-1
System Overview . . . . . . . . . . . . . . . . . . . . Agenda-1
ProMAX 3D Geometry . . . . . . . . . . . . . . . . . . . Agenda-1
Discussion of 3D Tutorial Project . . . . . . . . . . . Agenda-1
Initial Look at the Trace Data . . . . . . . . . . . . . Agenda-1
Build 3D Database from Observers Notes . . . . . . . . . Agenda-1
Geometry Core Path Overview . . . . . . . . . . . . . . Agenda-1
Tuesday . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-2
Database From Geometry Extraction . . . . . . . . . . . Agenda-2
Processing Sequence Flow . . . . . . . . . . . . . . . . Agenda-2
Preprocessing and Elevation Statics . . . . . . . . . . Agenda-2
Superswath Definition . . . . . . . . . . . . . . . . . Agenda-2
3D Stack and Volume Comparison . . . . . . . . . . . . . Agenda-2
3D Stack Volume Displays . . . . . . . . . . . . . . . . Agenda-2
Wednesday . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-3
3D Mix . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-3
3D Stack Comparisons . . . . . . . . . . . . . . . . . . Agenda-3
ProMAX Marine 3D Geometry . . . . . . . . . . . . . . . Agenda-3
Neural Net First Break Picking . . . . . . . . . . . . . Agenda-3
Source Receiver Geometry Check . . . . . . . . . . . . . Agenda-3
3D Refraction Statics . . . . . . . . . . . . . . . . . Agenda-3
Statistical Trace Editing . . . . . . . . . . . . . . . Agenda-3
Thursday . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-4
3D Residual Statics . . . . . . . . . . . . . . . . . . Agenda-4
Velocity Analysis and the Volume Viewer . . . . . . . . Agenda-4
ProMAX Land Swath Geometry . . . . . . . . . . . . . . . Agenda-4
Friday . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-5
3D Dip Moveout . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Agenda-5
Landmark
CDP Taper on Stack Data
3D Velocity Viewer/Editor
Migration
Land Geometry Using SPS Survey Data
Preface . . . . . . . . . . . . . . . . . . . . . . . . Preface-1
About The Manual . . . . . . . . . . . . . . . . . . . . Preface-1
How To Use The Manual . . . . . . . . . . . . . . . . . Preface-1
Conventions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . Preface-2
Mouse Button Help . . . . . . . . . . . . . . . . . . . Preface-2
Exercise Organization . . . . . . . . . . . . . . . . . Preface-2
System Overview . . . . . . . . . . . . . . . . . . . . 1-1
Patterns Spreadsheet
Complete the Sources Spreadsheet
CDP Binning
Interactive Spread QC using XYgraph
Defining the CDP binning grid
Interactive Grid Definition
CDP Bin Application
QC Binning with Fold Plot
Final QC Plots from the Database . . . . . . . . . . . . 2-31
Load the Geometry to the SEGY Data . . . . . . . . . . . 2-33
Graphical Geometry QC . . . . . . . . . . . . . . . . . 2-35
Graphical Geometry QC Flow . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 2-35
Load Geometry to the Trace Headers . . . . . . . . . . . 3-27
Append the Second SEGY File . . . . . . . . . . . . . . 3-29
Run the Second Extraction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 3-29
Load Geometry to the Trace Headers . . . . . . . . . . . 3-42
Exercise Summary . . . . . . . . . . . . . . . . . . . . 3-44
Full Extraction . . . . . . . . . . . . . . . . . . . . 3-45
Create a New Line and Run the First Extraction . . . . . 3-45
Edit the LIN Database . . . . . . . . . . . . . . . . . 3-47
Topics covered in this chapter . . . . . . . . . . . . . 4-1
Main Process Flow . . . . . . . . . . . . . . . . . . . 4-2
Detailed Process Flow - Trace Processing . . . . . . . . 4-3
Detailed Process Flow - DMO and Vel-Anal . . . . . . . . 4-4
Detailed Process Flow - DMO Stack and Migration . . . . 4-5
Topics covered in this chapter . . . . . . . . . . . . . 5-1
Top Mute and Decon Design Gate Picking . . . . . . . . . 5-2
Identifying Analysis Locations . . . . . . . . . . . . . 5-2
Pick a Top Mute and Miscellaneous Time Gate . . . . . . 5-5
Topics covered in this chapter . . . . . . . . . . . . . 6-1
3D RMS Velocity Field ASCII Import . . . . . . . . . . . 6-2
3D Parameter Table Interpolation . . . . . . . . . . . . 6-8
Picking a Post NMO Mute . . . . . . . . . . . . . . . . 6-10
Stack 3D . . . . . . . . . . . . . . . . . . . . . . . . 6-15
Run Stack3D on the First Superswath . . . . . . . . . . 6-16
Run Stack3D on the Other Superswath . . . . . . . . . . 6-18
Neural Network FB Training . . . . . . . . . . . . . . . 7-3
Neural Network First Break Picking . . . . . . . . . . . 7-10
Using the First Break Pick Macro for QC . . . . . . . . . . . . . . . . . . . . . . . . . . . . 7-13
Topics covered in this chapter . . . . . . . . . . . . . 8-1
Source Receiver Geometry Check . . . . . . . . . . . . . 8-2
Example Flow . . . . . . . . . . . . . . . . . . . . . . 8-2
Offset Range from First Break Pick Plot . . . . . . . . 8-3
Offset Range from Trace Display . . . . . . . . . . . . 8-3
Database Values . . . . . . . . . . . . . . . . . . . . 8-4
Analyze the Results using Simple Plots . . . . . . . . . 8-5
Analyze the Results using 3D Plots . . . . . . . . . . . 8-5
Refraction Statics . . . . . . . . . . . . . . . . . . . 9-1
QC the Picks from the First Half . . . . . . . . . . . . 11-11
Correlate the Second Half . . . . . . . . . . . . . . . 11-12
QC the Picks from the Second Half . . . . . . . . . . . 11-13
Merging the Partial Stacks . . . . . . . . . . . . . . . 11-26
Eigen Stack Model Building . . . . . . . . . . . . . . . 11-28
Build the Eigen Stack External Model . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 11-29
Topics covered in this chapter . . . . . . . . . . . . . 12-1
Velocity Analysis Introduction . . . . . . . . . . . . . 12-2
3D Supergather Generation and QC . . . . . . . . . . . . 12-3
Supergather Generation and Offset Distribution QC . . . . . . . . . . . . . . . . . . . . 12-4
DMO to Gathers 3D . . . . . . . . . . . . . . . . . . . 13-9
Parallel Processing Overview . . . . . . . . . . . . . . 13-15
System Administration for Parallel Processing . . . . . 13-17
PVM Daemon (Parallel Virtual Machine) . . . . . . . . . 13-17
User environment . . . . . . . . . . . . . . . . . . . . 13-17
Using DIPVELS for Zero Dip Velocity Estimation . . . . . 13-19
DMO Stack 3D . . . . . . . . . . . . . . . . . . . . . . 13-22
Run DMO to Stack3D with the rekill switch set to NO . . 13-27
Compare the two DMO Stack Volumes . . . . . . . . . . . 13-28
Migration . . . . . . . . . . . . . . . . . . . . . . . 16-1
Topics covered in this chapter . . . . . . . . . . . . . 17-1
3D Marine Geometry from UKOOA Data . . . . . . . . . . . 17-2
Using the Marine 3D Geometry Spreadsheet . . . . . . . . 17-2
Determine Primary Azimuth for Binning . . . . . . . . . 17-5
Cable Feather QC . . . . . . . . . . . . . . . . . . . . 17-6
CDP Binning . . . . . . . . . . . . . . . . . . . . . . 17-7
QC the Calculated Grid . . . . . . . . . . . . . . . . . 17-10
Interactive Grid QC and Alteration
Load Final Grid and Perform CDP Binning
Critical Parameters During CDP Binning
Receiver Binning
QC the CDP Binned Data using a Fold Plot
Finalize the Database
QC Plots . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 17-26
Produce QC plots from the database . . . . . . . . . . . 17-27
CDP Contribution and Null QC . . . . . . . . . . . . . . 17-28
Agenda
Monday
Introductions, Course Outline, and Miscellaneous Topics
Differences between 2D and 3D
What makes 3D different from 2D in physical geometric terms?
What is different in processing 3D data relative to 2D?
System Overview
Directory Structure
Program Execution
Ordered Parameter Files
Parameter Tables
Disk Datasets
Tape Datasets
ProMAX 3D Geometry
Discussion of 3D Tutorial Project
Initial Look at the Trace Data
Build 3D Database from Observers Notes
Input data into the spreadsheet
QC features within the spreadsheet/database
CDP Binning
Loading Geometry Directly to Field Data
Graphical Geometry QC
Tuesday
Database From Geometry Extraction
Extraction of first half
Extraction of second half
Full Extraction
Processing without a Database
Wednesday
3D Mix
Apply a 3D Running Mix to the Initial Stack
3D Stack Comparisons
Compare Inlines from Two Stack Volumes
Compare Crosslines from Two Stack Volumes
Compare Time Slices from Two Stack Volumes
ProMAX Marine 3D Geometry
Neural Net First Break Picking
Source Receiver Geometry Check
Use First Breaks to check shot and receiver coordinates
3D Refraction Statics
Compute Refraction Statics
Apply Refraction Statics
Thursday
3D Residual Statics
F-XY Decon Model Building
Cross Correlation Gate Picking
Pick the Autostatics Correlation Gate
Cross Correlation Computation
External Model Autostatics Computation
Eigen Stack Model Building
Residual Static Application and Stack
Friday
3D Dip Moveout
Offset Binning Parameter Determination
DMO to Gathers 3D
Parallel Processing Overview
DMO Stack 3D
Migration
Stolt 3D Migration
Phase Shift Migration
PSPC 3D Depth Migration
Explicit FD 3D Time Migration
Explicit FD 3D Depth Migration
Preface
About The Manual

This manual is intended to accompany the instruction given during the standard ProMAX 3D User Training course. Because of the power and flexibility of ProMAX 3D, it is unreasonable to attempt to cover all possible features and applications in this manual. Instead, we try to provide key examples and descriptions, using exercises directed toward common uses of the system. The manual is designed to be flexible for both you and the trainer. Trainers can choose which topics to present, and in what order, to best meet your needs. You will find it easy to use the manual as a reference document for identifying a topic of interest and moving directly into the associated exercise or reference.
How To Use The Manual

This manual is divided into chapters that discuss the key aspects of the ProMAX 3D system. In general, chapters conform to the following outline:

Introduction: A brief discussion of the important points of the topic and the exercise(s) contained within the topic.

Topics Covered in Chapter: A brief list of skills or processes in the order they are covered in the exercise.

Topic Description: More detail about the individual skills or processes covered in the chapter.

Exercise: Details pertaining to each skill in an exercise, along with diagrams and explanations. Examples and diagrams will assist you during the course by minimizing note-taking requirements and providing guidance through specific exercises.
This format allows you to glance at the topic description either to quickly reference an implementation or simply to refresh your memory on a previously covered topic. If you need more information, see the Exercise sections of each topic.
Conventions
Mouse Button Help

This manual does not refer to mouse buttons unless they are specific to an operation; MB1 is used for most selections. The mouse buttons are numbered from left to right:

MB1 refers to an operation using the left mouse button.
MB2 is the middle mouse button.
MB3 is the right mouse button.

Actions that can be applied to any mouse button include:

Click: Briefly depress the mouse button.
Double Click: Quickly depress the mouse button twice.
Shift-Click: Hold the Shift key while depressing the mouse button.
Drag: Hold down the mouse button while moving the mouse.
Mouse buttons will not work properly if either Caps Lock or Num Lock is on.
Exercise Organization

Each exercise consists of a series of steps that build a flow, help with parameter selection, execute the flow, and analyze the results. Many of the steps give a detailed explanation of how to correctly pick parameters or use the functionality of interactive processes. The editing flow examples list key parameters for each process of the exercise. As you progress through the exercises, familiar parameters will not always be listed in the flow example. The exercises are organized such that your dataset is used throughout the training session. Carefully follow the instructor's direction when assigning geometry and checking the results of your flow. An improperly generated dataset or database may cause a subsequent exercise to fail.
What is different in processing of 3D data relative to 2D?

3D Subsurface Binning
3D Geometry QC procedures
3D Stack comparison techniques (header tricks): inline, crossline, and time slice plots
3D Refraction Statics program
3D Residual Statics: model building, correlation gate picking
Chapter 1
System Overview
In this chapter we discuss some of the behind-the-scenes system operation and the basic ProMAX framework. Understanding the ProMAX framework and its relationship to the UNIX directory structure can be useful, especially when processing large volumes.
Directory Structure
/ProMAX (or $PROMAX_HOME)

The directory structure begins at a subdirectory set by the $PROMAX_HOME environment variable. This variable defaults to /advance and is used in all the following examples. Set the $PROMAX_HOME environment variable to /my_disk/my_world/ProMAX to have your ProMAX directory tree begin below the /my_disk/my_world subdirectory. All ProMAX development tools are included within the following subdirectories: $PROMAX_HOME/sys/lib, $PROMAX_HOME/sys/obj, $PROMAX_HOME/port/src, $PROMAX_HOME/port/bin, $PROMAX_HOME/port/include, and $PROMAX_HOME/port/man.
$PROMAX_HOME/sys

Software that is operating-system specific resides in $PROMAX_HOME/sys, which is actually a symbolic link to a subdirectory unique to a given hardware platform, such as: $PROMAX_HOME/rs6000 for IBM RS6000 workstations; $PROMAX_HOME/solaris for Sun Microsystems SPARCstations and Cray 6400 workstations running Sun Solaris; $PROMAX_HOME/sgimips for Silicon Graphics workstations using the 32-bit operating system; and $PROMAX_HOME/sgimips4 for Silicon Graphics workstations using the 64-bit operating system. This link allows a single file server to contain executable programs and libraries for all machine types owned by a client. Machine-specific executables invoked from the UNIX command line are located in $PROMAX_HOME/sys/bin. Operating-system specific executables called from ProMAX are located under $PROMAX_HOME/sys/exe. These machine-dependent directories are named after machine type, not manufacturer, to permit accommodation of different architectures from the same vendor. Accommodating future hardware architectures will simply
involve the addition of new subdirectories. Unlike menus, help, and miscellaneous files, a single set of executables is capable of running all ProMAX products, provided the proper product-specific license identification number is in place. Third-party software distributed with ProMAX is distributed in a subdirectory of $PROMAX_HOME/sys/exe named after the company, thus avoiding conflicts where two vendors use identical file names. For example, SDI's CGM Viewer software would be in $PROMAX_HOME/sys/exe/sdi and Frame Technology's FrameViewer would be in $PROMAX_HOME/sys/exe/frame.
/sys
    /exe    exec.exe, super_exec.exe, *.exe run from ProMAX
    /bin    *.exe run from the command line
/port
    /help
        /promax       *.lok (Frame help), *.help (ASCII help)
        /promax3d
        /promaxvsp
    /menu
        /promax       *.menu (Processes)
        /promax3d
        /promaxvsp
    /misc             *_stat_math, *.rgb (colormaps), ProMAX_defaults
    /lib/X11/app-defaults    Application window managers
    /bin              start-up executable
/etc
$PROMAX_HOME/port

Software that is portable across all platforms is grouped under the single subdirectory $PROMAX_HOME/port. This includes menus and Processes ($PROMAX_HOME/port/menu), help files ($PROMAX_HOME/port/help), and miscellaneous files ($PROMAX_HOME/port/misc). Under the menu and help subdirectories are additional subdirectories for each ProMAX software product. For instance, under $PROMAX_HOME/port/menu, you will find subdirectories for ProMAX 2D (promax), ProMAX 3D (promax3d), and ProMAX VSP (promaxvsp). Menus for additional products are added as new subdirectories under $PROMAX_HOME/port/menu. If your system administrator is not afraid of the LISP programming language, you can have them customize the ProMAX menus and defaults. $PROMAX_HOME/port/bin contains the file Promax, which is the ProMAX start-up script. You may want to edit this file and personalize it to your environment. $PROMAX_HOME/port/lib/X11/app-defaults contains the color attributes and window configurations for the individual applications.
$PROMAX_HOME/etc

Files unique to a particular machine are located in the $PROMAX_HOME/etc subdirectory. Examples of such files are the config_file, which contains peripheral setup information for all products running on a particular machine, and the product file, which assigns unique pathnames for various products located on the machine.
$PROMAX_HOME/scratch

The scratch area defaults to $PROMAX_HOME/scratch. This location can be overridden with the environment variable PROMAX_SCRATCH_HOME. We recommend you point this to the biggest file system to which you have write permission. The DMO, migration, and spreadsheet processes are heavy users of this file system. We also recommend that you periodically clean this file system.
$PROMAX_HOME/data (or $PROMAX_DATA_HOME)

The primary data partition defaults to $PROMAX_HOME/data, with new areas being added as subdirectories beneath it. This default location is specified in $PROMAX_HOME/etc/config_file using the entry:

primary disk storage partition: $PROMAX_HOME/data 200

This location can also be set with the environment variable $PROMAX_DATA_HOME. We also recommend that you point this to a large file system to which you can write.
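The defaulting rules described above can be sketched as a small helper. This is an illustrative function of ours, not part of ProMAX; only the environment variable names and defaults come from the text.

```python
import os

def promax_dirs(env=os.environ):
    """Resolve the ProMAX directory defaults described in this chapter
    (hypothetical helper for illustration only)."""
    home = env.get("PROMAX_HOME", "/advance")  # defaults to /advance
    return {
        "home": home,
        # scratch defaults to $PROMAX_HOME/scratch unless overridden
        "scratch": env.get("PROMAX_SCRATCH_HOME",
                           os.path.join(home, "scratch")),
        # primary data partition defaults to $PROMAX_HOME/data
        "data": env.get("PROMAX_DATA_HOME",
                        os.path.join(home, "data")),
    }

dirs = promax_dirs({"PROMAX_HOME": "/my_disk/my_world/ProMAX"})
# dirs["scratch"] -> "/my_disk/my_world/ProMAX/scratch"
```

Setting PROMAX_SCRATCH_HOME or PROMAX_DATA_HOME in the environment overrides the corresponding default, exactly as the text describes.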
Data Directories
Each region identifies a collection of files and directories, organized by Area and Line and separated into four main file types: 1) Parameter Tables, 2) Trace/Trace Headers, 3) Flows, and 4) the Ordered Parameter Files database.
PROMAX_DATA_HOME
    /Line
        DescName
        17968042TVEL
        31790267TGAT
        36247238TMUT
        12345678CIND
        12345678CMAP
        /12345678
            HDR1, HDR2, TRC1, TRC2
        /Flow1
            DescName, TypeName, job.output, packet.job
        /OPF.SIN
            OPF60_SIN.GEOMETRY.ELEV
        /OPF.SRF
            #s0_OPF60_SRF.GEOMETRY.ELEV
Program Execution
User Interface ($PROMAX_HOME/sys/bin/promax)

Interaction with ProMAX is handled through the User Interface. As you categorize your data into Areas and Lines, the User Interface automatically creates the necessary UNIX subdirectories and provides an easy means of traversing this data structure. However, the primary function of the User Interface is to create, modify, and execute processing flows. A flow is a sequence of processes that you perform on seismic data. Flows are built by selecting processes from a list, and then selecting parameters for each process. A typical flow contains an input process, one or more data manipulation processes, and a display and/or output process. All information needed to execute a flow is held within a Packet File (packet.job) within each Flow subdirectory. This Packet File provides the primary means of communication between the User Interface and the Super Executive program (see the next section, Super Executive Program). In addition, the User Interface provides utility functions for:

copying, deleting, and archiving Areas, Lines, Flows, and seismic datasets
accessing and manipulating ordered database files and parameter tables
displaying processing histories for your flows
providing information about currently running jobs
The User Interface is primarily mouse-driven and provides point-and-click access to its functions.
Super Executive Program (super_exec.exe)

Execution of a flow is handled by the Super Executive, which is launched as a separate task by the User Interface. The Super Executive is a high-level driver program that examines the processes in your flow by reading packet.job and determines which executables to use. The majority of the processes are subroutines linked together to form the Executive. Since this is the processing kernel for ProMAX, many of your processing flows, although they contain several processes, are handled by a single execution of the Executive. Several of the processes are stand-alone programs. These processes cannot operate under the
control of the Executive, and handle their own data input and output by directly accessing external datasets. In these instances, the Super Executive is responsible for invoking the stand-alone programs and, if necessary, making multiple calls to the Executive in the proper sequence. The Packet File, packet.job, defines the processes and their types for execution. The Super Executive concerns itself with only two types of processes:

Executive processes
Stand-alone processes
Executive processes are actually subroutines operating in a pipeline, meaning they accept input data and write output data at the driver level. Stand-alone processes, however, cannot be executed within a pipeline; they must obtain input and/or produce output by directly accessing external datasets. The Super Executive sequentially gathers Executive-type processes until a stand-alone process is encountered. At that point, the Packet File information for the Executive processes is passed to the Executive routine (exec.exe) for processing. Once this is completed, the Super Executive invokes the stand-alone program, and then another group of Executive processes, or another stand-alone process. This continues until all processes in the flow have been completed.
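The dispatch loop just described can be sketched as follows. This is an illustrative model, not ProMAX code: consecutive Executive-type processes are batched into a single invocation of exec.exe, and each stand-alone process is launched on its own.

```python
def schedule(flow):
    """Sketch of the Super Executive's dispatch behavior.
    `flow` is a list of (name, kind) pairs, where kind is
    "exec" (pipeline subroutine) or "standalone"."""
    runs, batch = [], []
    for name, kind in flow:
        if kind == "exec":
            batch.append(name)            # accumulate pipeline processes
        else:
            if batch:                     # flush the pending pipeline first
                runs.append(("exec.exe", batch))
                batch = []
            runs.append(("standalone", [name]))
    if batch:                             # trailing pipeline processes
        runs.append(("exec.exe", batch))
    return runs
```

A flow with a stand-alone process in the middle therefore produces two separate executions of exec.exe with the stand-alone program launched between them.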
Executive Program (exec.exe)

The Executive program is the primary processing executable for ProMAX. The majority of the processes available under ProMAX are contained in this one executable program. The Executive features a pipeline architecture that allows multiple seismic processes to operate on the data before it is displayed or written to a dataset. Special processes, known as input and output tools, handle the tasks of reading and writing the seismic data, removing this burdensome task from the individual processes. This results in processes that are easier to develop and maintain. The basic flow of data through the Executive pipeline is shown below:
Processing Pipeline

Each individual process will not operate until it has accumulated the necessary traces. Single-trace processes run on each trace as the traces come down the pipe. Multi-channel processes wait until an entire ensemble is available. For example, in the example flow, the FK
filter will not run until one ensemble of traces has passed through the DDI and AGC. If we specify that the Trace Display should display two ensembles, it will not make a display until two shots have been processed through the DDI, AGC, and FK filter. No additional traces will be processed until Trace Display is instructed to release the traces that it has displayed and is holding in memory, either by clicking on the traffic light icon or by terminating its execution (while continuing the flow). Note: All the processes shown are Executive processes and thus operate in the pipeline. An intermediate dataset and an additional input tool process would be needed if a stand-alone process were included in this flow. A pipeline process must accept seismic traces from the Executive, process them, and return the processed data to the Executive.
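The single-trace versus multi-channel behavior above can be modeled with Python generators. This is an analogy, not ProMAX code; the tools and gain calculation are stand-ins.

```python
def single_trace_tool(traces):
    """Single-trace process: handles each trace as it comes down the pipe
    (the doubling is a stand-in for a real calculation such as AGC)."""
    for t in traces:
        yield [2.0 * s for s in t]

def ensemble_tool(traces, ensemble_size):
    """Multi-channel process: buffers traces and does not operate until
    an entire ensemble has accumulated (here it averages the ensemble)."""
    buf = []
    for t in traces:
        buf.append(t)
        if len(buf) == ensemble_size:
            n = len(buf)
            yield [sum(s) / n for s in zip(*buf)]  # average across traces
            buf = []

# Four two-sample traces flow through a single-trace tool and then a
# multi-channel tool that consumes ensembles of two traces.
shots = [[1.0, 2.0], [3.0, 4.0], [5.0, 6.0], [7.0, 8.0]]
stacked = list(ensemble_tool(single_trace_tool(shots), ensemble_size=2))
```

Because generators are lazy, no trace advances further than the downstream tool allows, which mirrors how a display tool holding traces in memory blocks the rest of the pipe.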
Disk Data Input, Tape Data Input, and stand-alone tools always start new pipes within a single flow. In the example flow shown (CDP Stack, Bandpass Filter, Disk Data Output), one pipe must complete successfully before a new pipe will start processing.
Types of Executive Processes The table below describes the four types of processes defined for use in the Executive. Table 1: ProMAX Executive Process Types
simple tools: Accepts and returns a single seismic trace.
ensemble tools: Accepts and returns a gather of seismic traces.
complex tools: Accepts and returns a variable number of seismic traces, such as a stack. This type of process actually controls the flow of seismic data.
panel tools: Accepts and returns overlapping panels of traces to accommodate a group of traces too large to fit into memory. Overlapping panels are processed and then merged along their seams.
Stand-Alone Processes and Socket Tools Some seismic processing tools are not well suited to a pipeline architecture. Typically, these are tools making multiple passes through the data or requiring self-directed input. These tools can be run inline in a ProMAX job flow and appear as ordinary tools, but in reality are launched as separate processes. The current version of ProMAX does not provide the ability to output datasets from a stand-alone process. InterProcess Communication tools start a new process and then communicate with the Executive via UNIX interprocess communications. InterProcess Communication tools have the singular advantage of being able to accept and output traces in an asynchronous manner.
This section discusses the following issues relating to the Ordered Parameter Files database:
Organization
Database Structure
File Naming Conventions
The Ordered Parameter Files database serves as a central repository of information that you or the various tools can rapidly access. Collectively, the ordered database files store large classes of data, including acquisition parameters, geometry, statics and other surface consistent information, and pointers between the source, receiver and CDP domains. The design of the Orders is tailored for seismic data, and provides a compact format without duplication of information. The Ordered Parameter Files database is primarily used to obtain a list of traces to process, such as traces for a shot or CDP. This list of traces is then used to locate the index to actual trace data and headers in the MAP file of the dataset. Once determined, the index is used to extract the trace and trace header data from their files.
Organization The Ordered Parameter Files contain information applying to a line and its datasets. For this reason, there can be many datasets for a single set of Ordered Database Files. Ordered Parameter Files, unique to a line, reside in the Area/Line subdirectory. The Ordered Parameter Files database stores information in structured categories, known as Orders, representing unique sets of information. In each Order, there are N slots available for storage of information, where N is the number of elements in the order, such as the number of sources, number of surface locations, or number of CDPs. Each slot contains various attributes in various formats for one
particular element of the Order. The Orders are organized as shown in the table below. Table 2: Organization of Ordered Parameter Files

LIN (Line): Contains constant line information, such as final datum, type of units, source type, and total number of shots.
TRC (Trace): Contains information varying by trace, such as FB picks, trim statics, and source-receiver offsets.
SRF (Surface): Contains information varying by surface receiver location, such as surface location x,y coordinates, surface location elevations, surface location statics, number of traces received at each surface location, and receiver fold.
SIN (Source): Contains information varying by source point, such as source x,y coordinates, source elevations, source uphole times, nearest surface location to source, and source statics.
CDP (Common Depth Point): Contains information varying by CDP location, such as CDP x,y coordinates, CDP elevation, CDP fold, and nearest surface location.
CHN (Channel): Contains information varying by channel number, such as channel gain constants and channel statics.
OFB (Offset Bin): Contains information varying by offset bin number, such as surface consistent amplitude analysis. The OFB is created when certain processes are run, such as surface consistent amplitude analysis.
PAT (Pattern): Contains information describing the recording patterns.
XLN (Crossline)
OPF Matrices The OPF database files can be considered to be matrices. Each OPF is indexed against the OPF counter, and there are several single values per index. Note the relative size of the TRC OPF compared to the other OPF files: the TRC is by far the largest contributor to the size of the database on disk.
OPF Matrices
Database Structure The ProMAX database was restructured for the 6.0 release to handle large 3D land and marine surveys. The features of the new database structure are listed below:
Each order is contained within a subdirectory under Area and Line. For example, the TRC is in the subdirectory OPF.TRC.
There are two types of files contained in the OPF subdirectories:
Parameter: Contains attribute values. There may be any number of attribute files associated with an OPF.
Index: Holds the list of parameters and their formats. There is only one index file in each OPF subdirectory. The exception to this is the LIN OPF. The LIN information is managed by just two files, one index and one parameter, named LIN.NDX and LIN.REC.
OPF files are of two types:
Span: These files are denoted by the prefix #s; non-span files lack this prefix. The TRC, CDP, SIN, and SRF OPF parameters are span files. The first span of 10 MB for each parameter file is always written to primary storage. Span files are created in the secondary storage partitions listed in the config_file as denoted with the OPF keyword. Span files may be moved to any disk partition within the secondary storage list for read purposes. Newly created spans are written in the OPF-denoted secondary storage partitions. All subsequent spans are written to the secondary storage partitions denoted by the OPF keyword in a round-robin fashion until the secondary storage is full. Then, subsequent spans are created in primary storage. Span file size is currently fixed at 10 megabytes, or approximately 2.5 million 4-byte values per span file.
Non-span: All other OPFs are non-span.
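The span allocation described above can be illustrated with a short shell sketch. The partition names data2 through data4 are hypothetical, and this models only the documented round-robin behavior, not the actual ProMAX implementation:

```shell
# Model the documented span placement: span 0 of each parameter file goes to
# primary storage; subsequent spans rotate round-robin through the secondary
# partitions carrying the OPF keyword.
partitions="data2 data3 data4"   # hypothetical secondary OPF partitions
result=""
for span in 0 1 2 3 4 5 6; do
  if [ "$span" -eq 0 ]; then
    part="primary"
  else
    set -- $partitions                   # $1..$3 = the partition list
    idx=$(( (span - 1) % $# + 1 ))       # rotate through the list
    eval "part=\$$idx"
  fi
  echo "span $span -> $part"
  result="$result $part"
done
result=${result# }
```

Once the secondary partitions fill, further spans fall back to primary storage, a case this sketch does not model.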
Because each parameter is managed by its own file, it may be necessary to increase the maximum number of open files allowed on some systems, specifically Sun, Solaris, and SGI. From the csh, the following command increases the limit to 255 open files: limit descriptors 255. The geometry spreadsheet is a ProMAX database editor. Modifying information within a spreadsheet editor and saving the changes will automatically update the database. There is no longer an import or export from the geometry database to the ProMAX database files, as was required prior to the 6.0 release. Database append is allowed. Data can be added to the database via the OPF Extract tool or the geometry spreadsheet. This allows the database to be constructed incrementally as the data arrives. There is improved network access to the database. Database I/O across the network is optimized to an NFS default packet size of 4K. All database reads and writes are in 4K pages. Existing and restored 5.X databases are automatically converted to the 6.0 (and later) database format.
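The open-file limit mentioned above can also be raised from a POSIX shell; a minimal sketch (the csh equivalent is limit descriptors 255):

```shell
# Show the current soft limit on open file descriptors, then set it to 255
# for this shell session (the value must not exceed the hard limit).
ulimit -n        # current soft limit
ulimit -n 255    # request 255 open files
ulimit -n        # confirm the new limit
```

The change applies only to the current shell and its children; a system-wide change requires editing the hard limits, which varies by platform.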
File Naming Conventions Parameter file names consist of the information type and parameter name, preceded by a prefix denoting the Order of the parameter. For example, the x coordinate for a shot in the SIN has the following name: #s0_OPF60_SIN.GEOMETRY.X_COORD, where #s0_OPF60 indicates a first span file for the parameter, _SIN denotes the Order, GEOMETRY describes the information type of the parameter, and X_COORD is the parameter name. Index file names contain the three-letter Order name. For example, the index file for the TRC is called OPF60_TRC.
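The naming convention can be decomposed mechanically; a small shell sketch using the example name from the text (the parsing itself is illustrative, not part of ProMAX):

```shell
# Split an OPF parameter file name of the form
#   <span>_OPF60_<ORDER>.<TYPE>.<NAME>
# into its components, using the SIN x-coordinate example from the text.
name='#s0_OPF60_SIN.GEOMETRY.X_COORD'

span=${name%%_*}                               # '#s0'      (first span file)
order=${name#*_OPF60_}; order=${order%%.*}     # 'SIN'      (the Order)
infotype=${name#*.}; infotype=${infotype%%.*}  # 'GEOMETRY' (information type)
param=${name##*.}                              # 'X_COORD'  (parameter name)

echo "span=$span order=$order type=$infotype param=$param"
```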
NOTE: The index file for each Order must remain in the primary storage partition. Span parameter files may be moved and distributed anywhere within primary and secondary storage.
Within each Order, there are often multiple attributes, with each attribute being given a unique name.
Parameter Tables
Parameter Tables are files used to store lists of information in a very generalized structure. To increase access speed and reduce storage requirements, parameter tables are stored in binary format. They are stored in the Area/Line subdirectory along with seismic datasets, the Ordered Parameter Files database files (those not in separate directories), and Flow subdirectories. Parameter Tables are often referred to as part of the database. Parameter tables differ from the OPF database in that OPF attributes hold a single value per indexed element, whereas parameter tables hold multiple values per element. For example, a velocity function contains multiple velocity-time pairs at one CDP.
Creating a Parameter Table Parameter tables are typically created in three ways:
Processes store parameters to a table for later use by other processes.
Parameter tables can be imported from ASCII files that were created by other software packages or hand-edited by you.
Parameter tables can be created by hand using the Parameter Table Editor, which is opened by the Create option on the parameter table list screen.
An example is the interactive picking of time gates within the Trace Display process. After seismic data is displayed on the screen, you pull down the Picking Menu and choose the type of table to create. The end result of your work is a parameter table. If you were to pick a top mute, you would generate a parameter table ending in TMUT. If you were picking a time horizon, you would generate a table ending in THOR. These picks are stored in tabular format, where they can be edited, used
by other processes in later processing, or exported to ASCII files for use by other software packages.
WARNING: Remember, you name and store the parameter tables in their specific Area/Line subdirectory. Therefore, you can inadvertently overwrite an existing parameter table by editing a parameter table in a different processing flow.
ASCII Import to a Parameter Table File Import reads either ASCII or EBCDIC formatted files with fixed columnar data into the spreadsheet editor. When the application is initialized, two windows appear: the main ASCII/EBCDIC File Import window and the File Import Selection dialog. After a file has been selected, it is displayed, and you can select rows. Note: Filter and Apply appear grayed out and are insensitive to mouse button actions. After Format has been pressed and a columnar format selected, Filter and Apply appear normally and are available for use. The steps involved in performing a file import are as follows:
1. Select File: Select a file to import. If the text file does not contain valid line terminators, use Width to set the line width and then reread the file.
2. Select Format: Select a previous format or create a new format.
3. Review or Edit Column Definitions: Review the previously defined columns in an existing format by selecting all the columns. Review the highlighted regions in the file display for accuracy. Columns can either be edited by hand-entering Start Col. and End Col. values, or by performing click-and-drag column definition.
4. Save the Column Definition: Save any changes to the current column definition to disk for later retrieval.
5. Filter the File for Invalid Text: Search the marked columns and rows for any invalid text. Text may be excluded or replaced within this interactive operation.
6. Perform the Import: Select the Apply button. The application windows will close and the focus will return to the calling spreadsheet.
ASCII File Export from the Parameter Table Editor Export writes either ASCII or EBCDIC formatted files with fixed columnar data from a spreadsheet editor. When the application is initialized, the main ASCII File Export window will appear. After a file and format have been selected, the ASCII text is displayed and the Apply button is activated. The steps involved in performing a file export are as follows:
1. Select File: Select a file for export within the File Export Selection dialog.
2. Select Format: Select a previous format or create a new format.
3. Review or Edit Column Definitions: Review the previously defined columns in an existing format by selecting all the columns. Review the highlighted regions in the file display for accuracy. Columns can either be edited by hand-entering Start Col. and End Col. values, or by performing click-and-drag column definition.
4. Save the Column Definition: Save any changes to the current column definition to disk for later retrieval.
5. Perform the Export: Select the Apply button.
6. Cancel the Export Operation: Press the Cancel button to close the export windows and return to the calling spreadsheet.
Disk Datasets
ProMAX uses a proprietary disk dataset format that is tailored for interactive processing and random disk access. Disk dataset files can span multiple filesystems, allowing for unlimited filesize datasets. A typical set of files might look like this:
$PROMAX_HOME/data/usertutorials/landexample/12345678CIND
$PROMAX_HOME/data/usertutorials/landexample/12345678CMAP
$PROMAX_HOME/data/usertutorials/landexample/12345678/TRC1
$PROMAX_HOME/data/usertutorials/landexample/12345678/HDR1
These files are described in more detail in the table below. Table 4: Composition of a Seismic Dataset
Trace (...TRCx): File containing the actual sample values for the data traces.
Trace Header (...HDRx): File containing trace header entries corresponding to the data samples for traces in the trace file. This file may vary in length, growing as new header entries are added. Trace headers are kept in a separate file so that they can be sorted without needing to skip past the seismic data samples.
Map (...CMAP): File that keeps track of trace locations, even if data flows over many disks. Given a particular trace number, it finds the sequential trace number within the dataset. This allows rapid access to traces during processing. The map file is a separate file; because it may grow during processing, it is always held in the line directory.
Index (...CIND): File containing free-form format information relating to the entire dataset, including sample interval, number of samples per trace, processing history, and names of trace header entries. This file may grow during processing, and it is also always held in the line directory.
Secondary Storage In a default ProMAX configuration, all seismic dataset files reside on a single disk partition. The location of this disk partition is set in the $PROMAX_HOME/etc/config_file with the entry:
primary disk storage partition: $PROMAX_HOME/promax/data 200
In addition to the actual trace data files, the primary storage partition will always contain your flow subdirectories, parameter tables, ordered parameter files, and various miscellaneous files. The ...CIND and ...CMAP files, which comprise an integral part of any seismic dataset, are always written to primary storage. Since the primary storage file system is of finite size, ProMAX provides the capability to have some of the disk dataset files, such as the ...TRCx and ...HDRx files, and some of the ordered parameter files, span multiple disk partitions. Disk partitions other than the primary disk storage partition are referred to as secondary storage. All secondary storage disk partitions must be declared in the appropriate $PROMAX_HOME/etc/config_file. Sample entries are:
secondary disk storage partition: $PROMAX_HOME/data2 20 TRC OPF
secondary disk storage partition: $PROMAX_HOME/data3 20 TRC
secondary disk storage partition: $PROMAX_HOME/data4 20 OPF
secondary disk storage partition: $PROMAX_HOME/data5 20
Refer to the ProMAX System Administration guide for a complete description of the config_file entries for primary and secondary disk storage. The 20 in these entries is the default disk file size in megabytes. This default is probably too small for modern surveys, as it was based on the old UNIX 2 GB file system limitation. A better value would be 4000, or as large as your dataset, or as large a file as your system will allow.
WARNING: If the primary file system fills up, ProMAX will crash and will not be able to launch until space on primary storage has been cleaned up.
Under the default configuration, the initial TRC1 and HDR1 files are written to the primary storage partition. It is possible to override this behavior by setting the appropriate parameter in Disk Data Output. If the parameter Skip primary disk partition? is set to Yes, then no TRC or HDR files will be written to the primary disk partition. This can be useful as a means of maintaining space on the primary storage partition. (To make this the default for all users, have your ProMAX system administrator edit the diskwrite.menu file, setting the value for Alstore to t instead of nil.) Secondary storage is used in an as-listed-and-available fashion. To minimize data loss due to disk hardware failure, ProMAX tries to write a dataset to as few physical disks as possible. If the primary storage partition is skipped by setting the appropriate parameter in Disk Data Output, the CIND and CMAP files are still written to the primary storage partition, but the TRCx and HDRx files will not be found there.
Tape Datasets
Tape datasets are stored in a proprietary format, similar to the disk dataset format, but incorporating required structures for tape input and output. Tape input/output operates either in conjunction with a tape catalog system, or without reference to the tape catalog. The tape devices used for the Tape Data Input, Tape Data Insert, and Tape Data Output processes are declared in the ProMAX device configuration window. This allows access to tape drives anywhere on a network. The machines that the tape drives are attached to do not need to be licensed for ProMAX, but the fclient.exe program must be installed.
Tape Trace Datasets A ProMAX tape dataset is similar to a disk dataset in that the index file (...CIND) and map file (...CMAP) still reside on disk in the Line/survey database. Refer to the documentation in the Disk Datasets portion of this helpfile for a discussion of these files. Having the index and map files available on disk provides you with immediate access to information about the dataset, without needing to access any tapes. It also provides all the information necessary to access traces in a non-sequential manner. Although the index and map files still reside on disk, copies of them are also placed on tape(s), so that the tape(s) can serve as a self-contained unit(s). If the index and map files are removed from disk, or never existed, as in the case where a dataset is shipped to another site, the tapes can be read without them. However, access to datasets through the index and map files residing solely on tape must be purely sequential. Tape datasets are written by the Tape Data Output process, and can be read using the Tape Data Input or Tape Data Insert processes. These input processes include the capability to input tapes by reel, ensemble number, or trace number. Refer to the relevant helpfile for a complete description of the parameters used in these processes. The use or non-use of the tape catalog in conjunction with the tape I/O processes is determined by the tape catalog type entry in the appropriate $PROMAX_HOME/etc/config_file. Setting this variable to full activates catalog access, while an entry of none deactivates catalog access. An entry of external is used to indicate that an external tape catalog, such as the Cray Reel Librarian, will be used. You can override the setting provided in the config_file by setting the environment
variable BYPASS_CATALOG to t, in which case the catalog will not be used. The actual tape devices to use for tape I/O must also appear as entries in the config_file, under the tape device: stanza.
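For example, the override described above might be set for a single session like this (sh syntax; under csh you would use setenv BYPASS_CATALOG t):

```shell
# Setting BYPASS_CATALOG to "t" tells ProMAX tape I/O to skip the catalog
# for this session, overriding the config_file setting.
BYPASS_CATALOG=t
export BYPASS_CATALOG
echo "catalog bypass: $BYPASS_CATALOG"
```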
Getting Started The first step in using the ProMAX tape catalog is to create some labeled tapes. The program $PROMAX_HOME/sys/bin/tcat is used for tape labelling, catalog creation and maintenance, and for listing current catalog information. The program is run from the UNIX command line. The following steps are required to successfully access the tape catalog:
1. Label tapes
2. Read and display tape labels
3. Add labeled tapes to a totally new catalog
Before adding the tapes to a new catalog, it is a good idea to visually inspect the contents of the label information file for duplicate or missing entries. The contents typically look like:
0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
3 AAAAAD 0 1 4
4 AAAAAE 0 1 4
The fields are: volume serial number (digital form), volume serial number (character form), tape rack slot number, site number, and media type, respectively. You can manually edit these fields.
4. Write a label information file from the existing catalog.
5. Add labeled tapes (and datasets) to the existing catalog.
6. Merge an additional catalog into the existing catalog.
7. Delete a dataset from the catalog.
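The five fields of each label-information record can be spot-checked with a short awk sketch (records copied from the sample listing above):

```shell
# Name each field of a label-information record:
# volume serial (digital), volume serial (character), rack slot, site, media type.
labels=$(awk '{ printf "vol=%s (%s) slot=%s site=%s media=%s\n", $1, $2, $3, $4, $5 }' <<'EOF'
0 AAAAAA 0 1 4
1 AAAAAB 0 1 4
2 AAAAAC 0 1 4
EOF
)
echo "$labels"
```

Duplicate or missing volume serial numbers stand out immediately in this labeled form.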
Chapter 2
ProMAX 3D Geometry
Geometry Assignment is designed to create the standard Ordered Parameter File directories (OPFs) and load standard ProMAX geometry information into the trace headers. The sequence of steps depends upon the available information. This chapter serves as an introduction to different approaches to geometry assignment. The Geometry Overview section in the Reference Manual and online helpfile provides further details of the geometry assignment process. In this chapter we will present one of the three different methods for building a ProMAX database. This method is referred to as the From Survey method, where the database is built from observers log and ASCII survey file information. After the database is complete, the geometry information is copied to the trace headers of the input seismic data on the first read from the field tapes.
SEG-Y Input
Disk Data Output
Inline Geom Header Load
Valid Trace Numbers
Overwrite Trace Headers
Seismic Data (ProMAX)
Seismic Data (ProMAX)
In the following example we will read the data directly from the SEG-Y version of the field tapes and use the header words FFID and Recording Channel number to match each trace with its corresponding information in the database.
Observers Report
Source Interval = N/A (random)   Sample Interval = 4 ms   Record Length = 2.0 sec

Source Station   Field File   Receiver Station   Receiver Station   Number of
Numbers          IDs          at Chan 1          at Last Chan       Channels
1001 - 1016      1 - 16       1                  240                240
1017 - 1024      17 - 24      73                 312                240
1025 - 1031      25 - 31      145                384                240
1032 - 1043      32 - 43      217                456                240
1044 - 1053      44 - 53      289                528                240
1054 - 1061      54 - 61      361                600                240
1062 - 1070      62 - 70      433                672                240
1071 - 1075      71 - 75      505                720                216
1076             76           505                624                120
1077 - 1081      77 - 81      505                720                216
1082             82           505                624                120
1083 - 1085      83 - 85      505                720                216

Shot interval: N/A (shots are positioned randomly)
Azimuth: 5.6 degrees East of North
CDP Spacing: 55 ft inline by 55 ft crossline
SEGY Input
Type of storage to use: Disk Image
Enter DISK file path name: /misc_files/3d/manhattan3d_segy_disk
MAX traces per ensemble: 240
Remap SEGY header values?: No
Trace Display
---- Use All Default Parameters ----
3. In SEGY Input, enter your input shot dataset as described by your instructor.
4. Use all default parameters for the Trace Display.
5. Execute the flow.
Of these options, the last two are used most often, unless you have some pre-computed information.
3D Land Geometry Spreadsheet
1. Add a line to your area called database from survey/obs logs.
2. Build the following flow:
The setup menu pictured on the following page allows you to define global information applying to the configuration and operation of the Geometry Spreadsheet.
5. Select the Assign Midpoints Method: Matching pattern numbers in the SIN and PAT spreadsheet, which is the default. This option allows you to build geometry by defining patterns of receivers and then declaring which pattern is used with each source.
6. Select 110 for the nominal receiver station interval and crossline separation, and do not fill out the source station interval. These numbers are only used for QC purposes.
7. Do not fill in the nominal survey azimuth. We will interactively determine these values and enter them later.
Setup Window
8. Answer No to Base Source station coordinates upon a match between source and receiver station numbers. In our case the sources are numbered 1001-1085 and the receivers are 1-720; there is no relationship.
9. Enter shot holes for source type.
10. Set units to Feet.
11. Do not specify a coordinate origin.
12. Specify the font assignment if you wish to change the display.
13. Click on the OK button.
Landmark ProMAX 3D Seismic Processing and Analysis 2-9
Receivers Spreadsheet
1. Click on Receivers in the main Spreadsheet window to bring up the Receivers spreadsheet.
2. Select File Import from the pull-down menus on the spreadsheet to read the contents of an ASCII file into the spreadsheet. When working with ASCII file import there are three required steps:
Identify the ASCII file,
Define which numbers are in which columns, and
Define which cards or rows to exclude from the import.
3. Enter the directory path that contains the desired dataset followed by a /* as directed by your instructor. Double-click the file segp1.3d.rec or select the file and then click OK.
4. Click Format and enter a name for a format description containing ASCII import column definition information. For example: Manhattan receiver format.
5. In the Column Import Definition menu, click on the parameter attribute name, such as station, and define the column information. Note that the selection turns white.
NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they reflect the MB1 press-and-drag operation for column definition.
6. Highlight the columns that contain the numbers for the attribute you selected while holding down MB1 and moving from left to right in the file import window.
7. Repeat the previous two steps for the X and Y coordinates and the elevations.
Switch to card or row exclusion mode.
8. With the cursor positioned over the Parameter Column, notice that the MB3 help will toggle the Column Definition off.
9. Click MB3 with the cursor positioned over the word Station or one of the other columnar attributes. Note: MB3 turns column definition off when the cursor is in the attribute list column.
NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they now reflect block selection and deletion options.
10. Use MB1 and MB2 to define title rows, blank rows, and rows with information that you do not want to import, and press Ctrl-d. This writes an Ignore record for import message on all the defined rows.
11. From the main import menu, select Filter. This will check for any cards with inappropriate information, and allows you to interactively delete them.
12. From the main import menu, select Apply.
13. Select Overwrite ALL existing values with new import values and click OK. This removes your import menus, and the receiver spreadsheet should be filled out with the receiver station numbers, X and Y coordinates, and elevations.
Receivers Spreadsheet
14. Make sure you have 720 stations defined in your receiver spreadsheet, no blank lines at the end, and that the information looks correct.
15. Use the File Exit pull-down menu to save the spreadsheet information and exit.
Sources Spreadsheet
1. Click on Sources to bring up the Sources spreadsheet.
2. Select File Import from the pull-down menus on the spreadsheet to read the contents of an ASCII file into the spreadsheet.
3. Enter the directory path that contains the desired dataset followed by a /* as directed by your instructor, and select the file segp1.3d.sou. Double-click the file name or select the file and then click OK.
4. Click Format and enter a name for a format description containing ASCII import column definition information. In the Column Import Definition menu, click on the parameter attribute name, such as station, and define the column information. Note that the selection turns white.
NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they reflect the MB1 press-and-drag operation for column definition.
5. Highlight the columns that contain the numbers for the attribute you selected while holding down MB1 and moving from left to right.
6. Repeat the previous two steps for all the attributes you want to import from your ASCII file. Make sure that you import the STATION column and, in addition, read the X and Y coordinates, the elevations, uphole times, and shot depths.
Switch to card or row exclusion mode.
7. With the cursor positioned over the Parameter Column, notice that the MB3 help will toggle the Column Definition off.
8. Click MB3 with the cursor positioned over the word Station or one of the other columnar attributes.
Note: MB3 turns column definition off when the cursor is in the Parameter list column.
NOTE: Look at the Mouse Button help descriptions at the bottom of the ASCII text window. Note that they now reflect block selection and deletion options.
9. Use MB1 and MB2 to define title rows, blank rows, and rows with information that you do not want to import, and press Ctrl-d.
This writes an Ignore record for import message on all the defined rows.
10. From the main import menu, select Filter. This will check for any cards with inappropriate information, and allows you to interactively delete them.
11. From the main import menu, select Apply.
12. Select Overwrite ALL existing values with new import values and click OK. This removes your import menus, and the sources spreadsheet should be filled out with the source X and Y coordinates, elevations, shot depths, and uphole times.
13. Make sure 85 shots are defined in your sources spreadsheet, and the information looks correct.
14. Use the File Save pull-down menu to save the information in the sources spreadsheet. You are not finished with the shot information yet. We will return to the sources spreadsheet after completing the pattern specification.
Basemap QC and Setting the Prospect Level Azimuth
1. Open the Receivers spreadsheet again by clicking on Receivers in the main spreadsheet menu.
2. QC the survey information by selecting View View All Basemap from the Receivers spreadsheet.
The Cross Domain icon highlights contributors across domains, and also measures distance and azimuth.
Cross Domain QC
3. Using the Cross Domain icon (double fold icon), use the MB3 option to measure the azimuth along a cable line. You should find a value somewhere between 5.5 and 6 degrees East of North.
4. Reopen the Setup window from the main Geometry window.
Enter this angle (5.6 degrees) for Nominal Survey Azimuth.
5. Close the Setup window.
6. Overlay a color contour graph of the source and receiver elevations on top of the Basemap. To create this plot, click on Display Recs&Sources Color Contour Elevation. This overlays a colored elevation contour on top of the basemap.
7. Select the Views Remove Shot and Receiver based Field of Elevation option at the top of the color contour display to remove the shot and receiver elevation display. You can remove attributes, or place different attributes on top, using options under the Views menu.
8. Open the Sources spreadsheet.
9. Click on the Report icon on the XYGraph display. Click MB1 on any shot location on the basemap. This takes you to that shot location in the source spreadsheet. Since you generated the basemap from the Receivers spreadsheet, the shots are overlaid and dominant. Therefore, the Sources spreadsheet must be open.
Note:
If you generated the basemap from the Sources spreadsheet, then the receivers are overlaid and dominant, and the map expects to talk to the Receivers spreadsheet. That spreadsheet must be open for the report to respond. You can control which map is overlaid using the Views Transparent pulldown menus. Select the map that you want on top of the others by clicking MB1 on the map name.

10. Exit the XYGraph and the Receivers spreadsheet using the File Exit Confirm and File Exit pulldown menus.
Landmark ProMAX 3D Seismic Processing and Analysis 2-17
Patterns Spreadsheet

1. From the main Geometry menu, select Patterns to fill out the receiver pattern information. Two windows will appear.
2. In the small window, enter 240 for the maximum number of channels per shot, and select Varying number of channels/record.

3. Fill out the Pattern Spreadsheet, specifying one pattern varying from channel 1 to channel 240 by one and from receivers 1 to 240 by one. Leave the Receiver Line number blank in this case.
Multiple patterns would be required if the gap changes in size or location relative to the channel numbers. Multiple patterns would also be required where multiple cable lines are used and the receiver line numbers change for different groups of shots.

4. Click on File Exit from the Patterns Spreadsheet. If the pattern was mistyped, the error column will have stars in it and the Pattern Spreadsheet will not exit. Highlight the error column and look at the information at the bottom of the spreadsheet for a description of the problem. Fix the problem and attempt to exit again.
Complete the Sources Spreadsheet

1. Fill out the FFID, Pattern, Pat Num Chn, and Pat Shift columns. The FFIDs start at 1 and increment by 1 for a total of 85 FFIDs. All of the shots use the same pattern, which defines continuously numbered receivers; therefore, we can fill the Pattern column with 1s. Num Chn is the number of channels per pattern. Use the observer's log to get the number of channels per shot. Pat Shift is the shift, in receiver station numbers, of the first channel relative to that entered in the Patterns Spreadsheet. Use the observer's log to calculate these values. These will not be the first receiver station numbers, but will be the receiver number minus 1. Do not fill in the Shot Fold* column. These values will be calculated automatically when you assign midpoints.
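The Pat Shift arithmetic above can be sketched as follows; the station number used here is hypothetical, not taken from the actual observer's log.

```python
def pat_shift(first_live_station):
    """Pat Shift = first live receiver station number minus 1, i.e. the
    shift of channel 1 relative to receiver 1 as entered in the Patterns
    Spreadsheet. The station numbers below are hypothetical, not from
    the actual observer's log."""
    return first_live_station - 1

assert pat_shift(37) == 36  # shot whose first live channel is at station 37
assert pat_shift(1) == 0    # spread starting at receiver 1 needs no shift
```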
2. In the source spreadsheet main menu, select File Exit to save and exit the Sources spreadsheet.
CDP Binning

This exercise illustrates the CDP binning procedures.

1. In the main Geometry menu, click on Bin. A submenu appears with options for assigning the traces to midpoints, defining the bin grid, binning the data, quality controlling the binning, and finalizing the database.
2. Select Assign midpoints by: Matching pattern number in the SIN and PAT spreadsheets, and click on Ok. In this case the Assignment step performs the following calculations:
- Computes the SRF number for each trace in the TRC database.
- Computes the shot-to-receiver offset (distance).
- Computes the midpoint coordinate between the shot and receiver.
- Computes the shot-to-receiver azimuth.
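As a rough illustration of what the Assignment step computes per trace (a sketch of the arithmetic only, not the ProMAX implementation), with hypothetical shot and receiver coordinates:

```python
import math

def trace_geometry(sx, sy, rx, ry):
    """Sketch of the per-trace quantities the Assignment step computes
    (illustrative arithmetic, not the ProMAX implementation)."""
    dx, dy = rx - sx, ry - sy
    offset = math.hypot(dx, dy)                         # shot-to-receiver distance
    midpoint = ((sx + rx) / 2.0, (sy + ry) / 2.0)       # halfway point
    azimuth = math.degrees(math.atan2(dx, dy)) % 360.0  # degrees East of North
    return offset, midpoint, azimuth

# Hypothetical coordinates: receiver 300 ft East and 400 ft North of the shot.
offset, mid, az = trace_geometry(0.0, 0.0, 300.0, 400.0)
# offset = 500.0 ft, midpoint = (150.0, 200.0)
```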
3. An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click on Proceed.
4. A number of progress windows will flash on the screen as this step runs. A final Status window should notify you that you successfully completed geometry assignment. Click on OK. If this step fails, you have an error in your spreadsheets somewhere. Not much help is given, but the problems are usually related to the pattern definitions and spread layout as defined in the Sources Spreadsheet.

5. Reopen the Sources spreadsheet and check that your Num Chn column values and the Shot Fold* column values match. If these columns do not match, check the Pat Shift values and rerun Assign midpoints in the Bin window.
Interactive Spread QC using XYGraph

A useful QC to perform at this point is to redisplay a Basemap and use the Cross Domain icon to view the receivers that have been defined as live for each shot.

1. Open the Receivers spreadsheet again by clicking Receivers in the main Geometry menu.

2. Generate a Basemap by selecting View View All Basemap from the Receivers spreadsheet pulldown menus.
3. Using the Cross Domain icon (Double Fold icon), use the MB1 and MB2 options to see which receivers have been defined as live for each shot and which shots contribute to each receiver. Press and hold down MB1 on a shot location, and continue to hold the button as you move from one shot to another.
Defining the CDP binning grid

There are three ways to define the CDP bin grid parameters:
- Manually compute all of the required information,
- Interactively define a proposed CDP binning grid, or
- Automatically compute a CDP bin grid based on an azimuth and bin sizes.
Interactive Grid Definition

For this example we will look at the second option and interactively define a grid of CDP bins.

1. Select Define binning grid from the 3D Binning and QC window and click on Ok.
This will bring up a small map window. 2. Select Display Midpoint Control Points Black (depending on the color of the background).
3. Click on Grid Display. This step overlays a bin grid on your subsurface data. The default is a 10 cell by 10 cell grid, 100 by 100, oriented at an azimuth of 0°.

4. Click on Grid Parameterize, and change the cell size along and across azimuth to 55 by 55. You may also enter an azimuth of 6°, determined from the receivers. Then click on the green Traffic Light icon. This interactively alters the bin grid size. You can completely define your bin grid with the Parameterize menu.

Now we can use the icons on the side of the window to orient the grid to best fit the subsurface data. First, read the following short description of each of the icons.
Size Grid Cells: This icon adjusts the grid cell dimensions by expanding or contracting the cells along one axis or the other. MB1 and MB3 will expand or contract the cells, respectively. Press and hold either button near an edge of a cell on the outside of the grid, and then move the cursor. The cells will change size with the mouse movement. Release the button when the desired size has been reached. Click and hold MB2 to grab the nearest edge and move it; the cells will adjust size to meet the new location of the edge. Release MB2 when the desired size has been found.

Add/Delete Grid Cells: This icon adds or deletes rows or columns of cells from the grid. MB1 and MB3 will add or delete cells. Click either button near an outside edge of the grid and a group of cells will be added or deleted. Press and hold MB2 to set the edge of the grid to the location of the cursor. Release the button when the number of cells is as desired.

Spider: This icon displays selected bins in one of two spider plot formats. Move the cursor to the desired bin for the spider plot and click either MB1 or MB3. MB1 generates a spider plot consisting of line segments drawn from the bin center to the midpoint locations within the bin. MB3 generates a source-receiver spider plot displaying the delta x and y values for the source-receiver pairs. Multiple bins may be selected, with spider plots generated for each. The spider plot will be updated to reflect any changes in bin size and orientation when the grid position or rotation is modified.
Note:
The performance of these operations is dependent upon the number of spider plots currently active.
6. Use the Rotate icon to rotate your grid to match the azimuth of the survey. A value of 5.6° is appropriate.

7. Use the Move icon to move your grid to align with the lower left hand corner of the survey.
Check any subsurface areas with questionable binning by using the Zoom icon. At this point, the midpoints should be in the center of the bins.

8. Select Grid Drawing to check how well the midpoints are centered in your grid. This draws an axis through the center of the grid cells instead of drawing the grid/bin edges; the midpoint clusters appear at the grid cross points (which are the bin centers). To adjust the grid, use the Move icon.

9. Click on Grid Drawing again to return to your original grid boxes.

10. Check CDP smear by using the Spider Plot icon. This shows where your individual midpoints are in relation to the bin center. As you move the grid, the spider plot changes with the selected bin. Ideally, you want to minimize the scatter within the bin.

11. Reuse any of the icons to optimally overlay the bin grid on the subsurface data.

12. Check the size of your grid by clicking on Grid Parameterize. In order to be consistent with later instructions, please make sure you have a grid that is 42 by 79 (nx by ny) at an azimuth of 5.6 degrees.

13. Click on Grid Save when you are satisfied with your grid. You are prompted to enter a name for your grid. A suggestion would be to name this grid "CDP Grid - 55x55 ft - 5.6 degrees - origin in SW corner".
Note:
If you do not save your grid before leaving this plot, you will lose your work.

14. Exit from the XYGraph using File Exit Confirm and return to the 3D Binning and QC submenu.
CDP Bin Application

15. Select Bin midpoints and click Ok. The following window with default parameters will appear:
16. Click Load, and select the bin grid that you just saved. The parameters describing your grid will be loaded into each box.

17. Select Inlines parallel to grid Y axis, which is parallel to the defined azimuth. In our case, this is parallel to the cables.

18. Change the offset binning parameters to 0-5000 with an increment of 220. The selection of 220 ft per offset bin is relatively arbitrary here. Normally the offset binning will be some multiple of the group interval.

19. Save the grid definition by clicking on Save.

20. Click on the Apply button to apply the binning definition, assigning each trace to a single CDP as defined by the bin definition.
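The offset binning in step 18 amounts to simple bucket arithmetic. The sketch below assumes 1-based bin numbering and inclusive range limits, which may differ from ProMAX's internal convention:

```python
def offset_bin(offset, first=0.0, last=5000.0, inc=220.0):
    """Assign an offset bin number for the 0-5000 ft / 220 ft menu
    values used above. The 1-based numbering and inclusive limits are
    assumptions for illustration, not ProMAX's documented convention."""
    if not (first <= offset <= last):
        return None  # outside the binning range
    return int((offset - first) // inc) + 1

assert offset_bin(0.0) == 1     # 0-220 ft falls in bin 1
assert offset_bin(500.0) == 3   # 440-660 ft falls in bin 3
```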
21. Several status windows will appear on your screen. Click Ok in the final window.

22. Click on Cancel to exit the binning window.

QC Binning with Fold Plot

23. Select QC Bin data.
24. Click the QC Bin Space button and select your bin grid from the list.

25. Click Ok to bring up the Coordinate Space Fold display. This option displays a map of subsurface coverage versus x,y. The maximum fold value should be 16.

26. Select Finalize Database, and click Ok.
27. After the finalization is complete, select Cancel to close the 3D Binning and QC window.
28. Select File Exit from the main spreadsheet menu to exit the Geometry Spreadsheet. You now have a fully populated database ready to be loaded to the trace headers.
DBTools; View Predefined Receiver fold map: used to check for variations in receiver multiplicity

DBTools; View Predefined Source fold map: used to check for variations in the number of channels per source

DBTools; View Predefined SIN-SRF-Offset: used to check the live receivers for each shot

DBTools; View Predefined ILN-XLN-CDP: used to map 3D CDP numbers to inline and crossline coordinates

DBTools; View Predefined Offset-CDP-SIN: used to check offset distribution in CDPs for velocity analysis placement and DMO binning
2D plots of SIN vs. UPHOLE, DEPTH, NCHANS and SRF vs. FOLD used to check various attributes for sources and receivers
2D plots of ILN and XLN vs. FOLD used to find minimum and maximum live inline and crossline numbers after binning
In general, for 3D processing, we recommend first building the geometry database and then assigning its contents to the traces on the first read from the field tapes. In this way, you only have to read the data twice: once to count the traces and once to read the headers.
1. Copy your flow 01 - View SEGY gathers, and edit it to look like the following:
SEGY Input
----Use the same parameters as before----
Graphical Geometry QC
After loading the geometry from the database to the trace headers, you can use the Graphical Geometry QC process to look at the shot records with linear moveout applied at the assigned offsets. The traces will line up vertically as a function of their assigned surface receiver position. On this display you will want to check all of the shots, making sure first arrivals appear at approximately the same time. You can also determine if traces were assigned to the correct receivers by following traces vertically. If reversed polarity receivers exist, then all traces recorded at that receiver will be reversed and should align themselves on the same trace.
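The linear moveout applied by this display is just a time shift proportional to offset. A minimal sketch, with a hypothetical first-break velocity:

```python
def lmo_shift_ms(offset_ft, velocity_ft_per_s):
    """Linear moveout applied by the QC display: each trace is shifted
    by offset/velocity so first breaks flatten. The velocity here is a
    hypothetical first-break velocity, not a value from this dataset."""
    return 1000.0 * offset_ft / velocity_ft_per_s

assert lmo_shift_ms(5000.0, 10000.0) == 500.0  # 500 ms shift at 5000 ft offset
```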
OPTIONS

* Pre-Initialization (Partial Extraction)
* Full Extraction
* From Field Notes and Survey

QUESTIONS

* Does shot and receiver X, Y, and station information exist in the headers, and do you want to use it?
* Do you want to minimize the number of times that you have to read the data?

Decision Table

  Headers usable?   Minimize reads?   Option
  Yes               -                 Full Extraction
  No                Yes               Partial Extraction (Pre-Initialization)
  No                No                From Field Notes and Survey
Transferring the Database to Trace Headers When the database is completed, the information contained in it is transferred to trace headers. The following question determines how to match a trace in the data file to a trace in the database:
  Was a Full or Partial Extraction used to create the database, and was a new output file written?

  Answer   Option
  Yes      Inline Geom Header Load by valid trace number (TRACE_NO)
  No       Inline Geom Header Load by Chan and other trace header words
Inline Geom Header Load is the main program used to assign geometry values to individual trace headers from the OPF database files. One of the main issues related to this geometry assignment procedure is to define how a trace in a data file will be identified in the Trace Ordered Parameter file. One of the options is to use a specific trace header word called the valid trace number. In order to utilize the valid trace number we will have to spend some time discussing its origin and how it can be used. Another program that may be used in the geometry assignment procedure is called Extract Database Files. We will see that this program is one of the ways that the valid trace number can be generated by running it in either the Partial or Full extraction modes. Geometry Header Preparation is another program that may be selected in the geometry assignment procedures. This program can be used for a variety of different purposes. We will look specifically at how it can be used when dealing with the problem of duplicate Field File Identification Numbers.
Steps Performed by Inline Geom Header Load Inline Geom Header Load is the program that populates the trace headers of an input data file with the geometry information stored in the database. The outcome of running this program is a database and a data file that match.
This means that every trace in the output data file exists in the database, and there is a one-to-one correspondence between all values in the trace header and those in the database. After a successful run, each trace will also be assigned the valid trace number if it was not pre-assigned using Extract Database Files.
There are two major options in this program pertaining to how to identify a trace in the input data file with a trace in the database. These options are: 1. to read the valid trace number from the input trace header, or 2. to read the recording channel number (automatic) and 1 or 2 trace header words that can uniquely identify this trace as having originated from a unique shot (SIN) that exists in the shot database. Once a trace in a data file has been identified in the Trace OPF, the information in all of the OPFs for that trace is copied to the trace header.
Valid Trace Numbers Before we proceed, let's make sure that we understand the idea of the valid trace number. Understanding this will help us decide on the best course of action for our data. The valid trace number is simply a ProMAX trace header word. Every trace in the database is numbered from 1 to N, where N is the total number of individual traces in the database. This is a unique number for each trace in the line or 3D project. A valid trace number combined with matching geometry is a flag that allows fast random access sorting of disk datasets. Every trace in the TRC database is assigned to a single SIN (shot), SRF (receiver) and CDP. Every trace has an individual shot-to-receiver offset distance, an individual midpoint X and Y location, and many other values that are single numbers that may or may not be different for every trace.
Inline Geom Header Load matches the current trace being processed to the database and then copies all of the trace dependent values as well as the other order values to the trace header. The last thing that happens is that the traces are stamped as matching the database.
Valid Trace Number Origin Where does the valid trace number trace header word come from? Luckily, the answer to this is very simple. The Extract Database Files program writes this trace header word after it reads and counts a trace that it is entering into the TRC database. In this case the valid trace number is pre-assigned. If it is not pre-assigned, the Inline Geom Header Load process will create it after it determines which trace in the database corresponds to a trace in a data file.
The valid trace number is a unique number for every trace and is stored in the trace header as TRACE_NO. This trace header word continues to exist ONLY if you write a new trace file after the extraction procedure. A common question that arises concerns the decision to pre-assign the valid trace number using Extract Database Files or to rely on the alternate header identification on the first read of the input data. You may consider using Extract Database Files if there is sufficient information in the trace headers that can be transferred to the database which will save time and increase accuracy of the geometry definition process. The extraction may be run in either the partial extraction or full extraction modes depending on what information is available in the trace headers of the input data.
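The reason the valid trace number enables fast random access can be sketched with a toy in-memory table; the OPFs are binary database files, not Python lists, so this is purely illustrative:

```python
# Toy illustration of valid-trace-number lookup. The OPFs are binary
# database files, not Python lists; this only shows why TRACE_NO gives
# direct O(1) random access instead of a search.
trc_database = [
    {"SIN": 1, "CHAN": 1},   # TRACE_NO 1
    {"SIN": 1, "CHAN": 2},   # TRACE_NO 2
    {"SIN": 2, "CHAN": 1},   # TRACE_NO 3
]

def lookup_by_trace_no(trace_no):
    """TRACE_NO is 1-based, so the row is found by direct subscript."""
    return trc_database[trace_no - 1]

assert lookup_by_trace_no(3) == {"SIN": 2, "CHAN": 1}
```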
Steps Performed By Extraction The steps performed by the extraction options are:

Pre-Geometry Initialization (or Partial Extraction) is sometimes used when no receiver information exists in the incoming headers. Partial Extraction counts each of the following:
- the number of traces encountered
- the number of shots encountered
- the number of traces per shot

and then writes the trace count number and SIN to the trace header.
Full Extraction is used when you want to extract the shot and receiver location and coordinate information from the incoming headers. Full Extraction counts each of the following:
- the number of traces encountered
- the number of shots encountered
- the number of traces per shot
- the number of receivers encountered
- the number of traces per receiver

and then writes the trace count number and SIN to the trace header.
IF you have run the extraction in either mode, AND written a new trace data file, AND have not altered the number of traces in the database, you now have valid trace numbers in the headers of the output data set which you can use to map a trace in a data file to a trace in the database. This mapping will be performed by Inline Geom Header Load after the database is completed.
Between Extraction and Geom Load After running Extract Database Files in either mode, there are many steps that need to be completed prior to running Inline Geom Header Load. The extraction only partially populates the database. More work will generally need to be done in the Spreadsheets to input the remaining information. After the Spreadsheets are complete, the next step would be to complete the CDP binning procedures and then finalize the database. With the database complete, you can continue with the next step of loading the geometry information from the databases to the trace headers. You may elect to address a trace by its valid trace number
assigned during the extraction or you may read a combination of trace headers to identify the trace.
Geometry Load Procedures For the first option, Inline Geom Header Load operates as follows:

1) Identifies the TRACE_NO of the incoming trace and finds that trace in the TRC database.
2) Copies the appropriate TRC order values to the trace header.
3) Finds the shot, receiver, CDP, inline, crossline, and offset bin for that trace. The appropriate values from those orders are then copied to the trace headers as well.

In the second option, Inline Geom Header Load does not know exactly which TRACE_NO it is looking for. It does know which channel and shot to look for, based on the header word(s) that you selected. Given that this mapping is unique, the program knows which SIN and CHAN to look for in the TRC database. Once the entry is found, the TRACE_NO is copied to the headers and the steps outlined in the first option are performed. Again, the key to the second option is that you need to identify which shot a trace came from by a unique combination of header words for that shot.
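The two matching options can be sketched side by side; the data model below is a hypothetical simplification of the TRC order, not the actual OPF format:

```python
# Sketch of the two trace-matching options. The table below is a
# hypothetical simplification of the TRC order, not the actual OPF format.
trc = [
    {"TRACE_NO": 1, "FFID": 101, "CHAN": 1},
    {"TRACE_NO": 2, "FFID": 101, "CHAN": 2},
    {"TRACE_NO": 3, "FFID": 102, "CHAN": 1},
]

def match_by_trace_no(header):
    # Option 1: TRACE_NO is a direct 1-based index into the TRC order.
    return trc[header["TRACE_NO"] - 1]

def match_by_headers(header):
    # Option 2: search for the unique (FFID, CHAN) pair, then recover TRACE_NO.
    for row in trc:
        if row["FFID"] == header["FFID"] and row["CHAN"] == header["CHAN"]:
            return row
    return None  # no such shot/channel in the database

assert match_by_headers({"FFID": 102, "CHAN": 1})["TRACE_NO"] == 3
```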
Pre-Initialization Path

[Flow diagram: O.B. Notes / UKOOA ASCII / Field Data SEG-? Input feed Extract Database Files with Pre Geom Init = yes, which builds the TRC and SIN OPFs only]
This option may be appropriate for relatively small datasets which have only FFID and CHAN in the input trace headers. This option should be used when reading the field data and writing the data to disk for the first time. In so doing, information such as FFID, the number of shots, and the number of channels is written to the database and is then available when the geometry is completed. Selecting this option also stamps the output dataset with valid trace numbers, which allows you to process with trace headers only and overwrite the dataset with updated geometry from the database files. This is an important concept for the Inline Geom Header Load process. In the following example, you will assume that only the FFID and recording channel number exist in the incoming trace headers. This information will be extracted using the perform pre-geometry database initialization option in Extract Database Files.
Pre Geometry Initialization Flow

1. Make a new line called from pre-initialization.

2. Build the following flow:
SEGY Input
Type of storage ------------------------------- Disk Image
Enter DISK file path name --------------------- /misc_files/3d/manhattan3d_segy_disk
MAXIMUM traces per ensemble ------------------- 240
Remap SEGY header values ---------------------- NO
Receivers: identify by STATIONS

5. In Extract Database Files, select Yes for the option Pre-geometry extraction. This initializes the SIN and TRC domains of the Ordered Parameter Files, stamps the dataset with valid trace numbers, and allows the use of overwrite mode when performing the Inline Geom Header Load step later.

6. In Disk Data Output, enter the name for a new output file, such as raw shot data.

7. Execute the flow.

After the flow completes:

8. Exit the flow building level and select Database from the global command line.

9. Check the OPFs, verifying the number of records in the dataset, the number of channels/record, and the FFID range. The only OPF files that should exist are LIN, SIN, and TRC. If SRF exists, this means that you identified traces for receivers by coordinates. You will also find that the SRF OPF has 1 value in it.
Complete the Spreadsheet

In this sequence, the next steps would be to complete the Sources, Receivers and Patterns Spreadsheets and perform the CDP binning similarly to the sequence used in the previous exercise.
Load Geometry to Trace Headers

1. If the geometry in the database looks good, build the following flow:
3. In Inline Geom Header Load, match the traces by their valid trace numbers. Since the traces were read and counted with Extract Database Files, you have a valid trace number to identify each trace. You have binned all traces; therefore, do not drop any traces. Unless you have a problem, there is no need for verbose diagnostics.

4. In Disk Data Output, output to the same dataset as specified in Disk Data Input. We will use the overwrite option in conjunction with trace-header-only processing in Disk Data Input.

5. Execute this flow.

In the Extract Database Files path, the Inline Geom Header Load process operates on a sequential trace basis and includes a check to verify that the current FFID and channel information described in the OPFs matches the FFID and channel information found on each trace of each ensemble. The Inline Geom Header Load process will fail if these numbers do not correspond. You must then correct the situation by changing the geometry found in the OPFs, or possibly by changing the input dataset attributes.
Chapter 3
Geometry Extraction
For reprocessing, extraction can read information for building the database from the input trace headers. If only part of the information is available, the database completion is performed subsequently within the spreadsheets. If all necessary information is available, the database can be completed in the extraction step without having to work with the spreadsheet or binning tools.
[Flow diagram: Geometry Extraction — O.B. Notes / UKOOA ASCII / Field Data SEG-? Input feed the extraction]
The full extraction process makes one very critical assumption in that there must be some unique trace header value for all traces of the same shot and receiver. That is, there must be unique source and receiver position numbers, FFIDs, Coordinates, or Date/Time stamps in addition to the recording channel numbers.
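This uniqueness assumption is easy to state as a check. The helper below is hypothetical (it is not a ProMAX utility); it simply confirms that the chosen header words identify every trace uniquely:

```python
from collections import Counter

def duplicate_keys(headers, keys=("FFID", "CHAN")):
    """Hypothetical QC helper (not a ProMAX utility): return the header
    key tuples that occur more than once. An empty result means the
    chosen header words satisfy the uniqueness assumption."""
    counts = Counter(tuple(h[k] for k in keys) for h in headers)
    return [key for key, n in counts.items() if n > 1]

headers = [
    {"FFID": 1, "CHAN": 1},
    {"FFID": 1, "CHAN": 2},
    {"FFID": 1, "CHAN": 2},   # duplicate: violates the assumption
]
assert duplicate_keys(headers) == [(1, 2)]
```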
[Figure: Population After 1st Half Extraction — subsurface grid with inline 1 and xline 1 marked]
After loading the second half of the data, the entire subsurface grid will be populated.
[Figure: fully populated subsurface grid with inline 1 and xline 1 marked]
Create a New Line and Run the First Extraction

1. Since you are going to create a new database, the first thing you need to do is create a new line in your area. This line will be another copy of the Manhattan3d project built using Full Extraction. Enter a line name similar to Manhattan 3d - extraction.

2. Build the following flow in your new line:
SEGY Input
Type of storage to use: ------------------------ Disk Image
Enter DISK file path name: --------------------- /misc_files/3d/manhattan_first_half
MAX traces per ensemble ------------------------ 240
Remap SEGY header values ----------------------- Yes
Input/override trace header entries ------------ sou_sloc,,4I,,181/srf_sloc,,4I,,185/
3. In the SEGY Input, read the file as described by your instructor. This will be the 1st half of the shots. There are 240 traces per ensemble, and you must remember to remap the SOU_SLOC and SRF_SLOC values from the extended SEGY headers to the ProMAX trace headers. In this menu, the default values for remapping SOU_SLOC and SRF_SLOC will work fine.

4. Remove the specification for the CDP_SLOC, but make sure you leave the last /.

5. Select the Extract Database Files parameters. This is a Land 3D project where you will identify all traces coming from a common source by their FFID number and all traces recorded at the same receiver by the receiver station number. You will overwrite any previous database information, if any, and do full extraction instead of pre-geometry extraction. Do not extract the CDP or OFB binning; this will be calculated and applied later.

6. In Disk Data Output, enter shots - raw data (1st) for a new output file name.

7. Execute the flow.

Extract Database Files does all of the work. The program reads the trace header information and establishes all of the necessary OPF domains and their attributes. The dataset is then stamped with valid trace numbers, permitting further processing with a consistent pairing between the OPFs and the trace headers in the dataset. (This is an important concept for the Inline Geom Header Load process, which will be run once the geometry is completed.)
The setup menu allows you to define global information applying to the configuration and operation of the Geometry Spreadsheet.

3. Select the Assign Midpoints Method of Existing index number mappings in the TRC (which is the default). In the extract process, a shot and receiver location value and x,y coordinate were extracted from each trace header. Every trace knows which shot and receiver it belongs to, so the binning can be done using existing values in the TRC ordered database file.
4. Enter 110 for the nominal receiver station interval and crossline separation, and do not fill out the source station interval. These numbers are only used for QC purposes.

5. Set the nominal survey azimuth to 5.6 degrees.

6. Answer No to Base Source station coordinates upon a match between source and receiver station numbers. In our case the sources are numbered 1001-1085 and the receivers are 1-720. There is no relationship.

7. Select Shot holes for source type.

8. Set units to Feet.
9. Do not specify reference coordinates.

10. Specify the font assignment of your choice.

11. Click OK.
QC the Input with a Basemap

12. Open the Receivers Spreadsheet and generate a Basemap using the View View All Basemap pulldown menu.
Note: Only the shots and receivers on the East side of the project exist on this map.
13. Use the Cross Domain (double fold) icon to see which receivers are defined as live for each shot.

14. Close the Basemap and the Receivers Spreadsheet windows.
Trace Assignment

This exercise illustrates CDP binning procedures. For this example we will automatically compute a CDP grid based on some initial known values and then apply the grid using the batch CDP Binning* process.

1. In the main Geometry window, click Bin. A submenu appears with options for assigning the traces to midpoints, defining the bin grid, binning the data, quality controlling the binning, and finalizing the database.
2. Select Assign midpoints by using Existing index number mappings in the TRC, and click Ok. In this case the Assignment step performs the following calculations:
- Computes the shot-to-receiver offset (distance).
- Computes the midpoint coordinate between the shot and receiver.
- Computes the shot-to-receiver azimuth.
Note: Because we ran the full extraction, every trace already knows which shot and receiver it contributes to.
3. An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click Proceed.
4. A number of progress windows will flash on the screen as this step runs. A final Status window should notify you that you have successfully completed geometry assignment. Click Ok.
Automatic Bin Calculation and QC

5. From the 3D Binning and QC window select Bin Midpoints.
This window controls two binning operations:
- CDP subsurface binning
- Offset binning

Let's worry about one binning operation at a time. First, let's do the CDP binning.
CDP Bin Origin and Direction

ProMAX geometry assignment allows you to define your binning parameters so that any corner of the project can be the origin of the inline, crossline, and CDP numbering. You may also choose the inline and crossline directions. There are three rules by which you must abide:
- The Y axis is always parallel to the specified azimuth.
- The X axis is always 90 degrees clockwise from the Y axis.
- The grid cell X and Y dimensions must be input as positive numbers.
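The rules above fix the grid axes completely once the azimuth is chosen. A sketch of the axis convention (not ProMAX code):

```python
import math

def grid_axes(azimuth_deg):
    """The grid axis rules as unit vectors in (x=East, y=North)
    coordinates: Y axis parallel to the azimuth, X axis 90 degrees
    clockwise from Y. A sketch of the convention only."""
    a = math.radians(azimuth_deg)
    y_axis = (math.sin(a), math.cos(a))   # along the specified azimuth
    x_axis = (math.cos(a), -math.sin(a))  # 90 degrees clockwise from Y
    return x_axis, y_axis

x_axis, y_axis = grid_axes(0.0)
# azimuth 0: Y points North (0, 1) and X points East (1, 0)
```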
[Figure: two grid orientation diagrams — one with azimuth 5.6° and Inline 1 / Xline 1 / CDP 1 at one corner, and one with azimuth 275.6° ("This Case") placing Inline 1 / Xline 1 / CDP 1 at the opposite corner]
6. In the 3D Land Midpoint Binning window, set Azimuth=275.6, set Grid Sizes=55 in each direction, and supply a Bin Space Name such as Calculated CDP bins 55x55 - origin in SE corner.
7. Click Calc Dim. The Calc Dim operation computes the origin of the grid and the maximum X and Y dimensions to yield the following:
[Figure: Calculated grid dimensions]
8. Save the grid definition by clicking Save. 9. Click Cancel to exit the window.
QC, Edit and Save the CDP Binning Grid 1. Select Define binning grid from the main binning window and click Ok.
2. Select Display Midpoint Control Points Black (depending on the color of the background).
Midpoint Scattergram for CDP Binning 3. Select Grid Open and the grid name that you saved from the Calc Dim operation.
[Figure: Scattergram with Grid Overlay — X marks the grid origin]
The computed grid exactly overlays the existing subsurface coverage. Notice that this grid is 26 lines with 78 bins per line. We need to prepare for the extension of the project when the data becomes available. According to the supplier of the data, we will eventually have 42 lines with 79 CDPs per line. The first line is on the East and does not pose a problem, since we can simply add more lines to the grid. The problem is that we do not know whether we need to add the extra bin on the north or south to size the grid at 79 bins per
line. We can make an educated guess by looking at the basemap of the entire proposed project and expanding the grid as required. 4. In order to figure this out we will have to refer back to a master Basemap that was provided by the supplier of the data.
Total Project Basemap From this map we will have to decide whether the shots on the North-East or the South-West are further away from the cable lines. It appears that the distance Y is greater than the distance X, which implies that we need to add at least one additional crossline to our grid on the South.
5. The easiest way to add the crossline at the South is to use the Add or Delete cells icon.
Additional Grid Cell on the South 6. Add a cell at the South by clicking MB1 (add a cell) near the south edge of the existing grid. 7. You may also elect to do some selective zooming in combination with the Grid Drawing functionality to carefully position the grid so the midpoints are centered in the cells.
8. Save your edited grid using the Grid Save to pull down menu.
9. Enter a new grid name and click Ok. 10. Exit the XYgraph display using the File Exit Confirm pull down menu.
Reload the edited CDP Grid and Complete CDP Binning 1. Return to the 3D Land Midpoint Binning Window and select Load to bring back your edited grid.
Note: The Offset Bin increment and the Inlines parallel to grid axis have been reset.
Finalize the Offset Binning and CDP Bin Direction Parameters 1. Set the Min offset to bin=0, Offset binning increment to 110 ft and select Inline to be parallel to the X axis (perpendicular to azimuth).
2. Click Apply to perform the Binning. 3. When complete, dismiss the notification window and click Cancel.
Generate a Fold QC Plot and Finalize the Database 1. Return to the 3D Binning and QC window and select QC Bin data, select your grid for the QC Bin space and click Ok.
2. This should generate a QC fold plot that shows live CDPs on the northernmost crossline and zero fold CDPs in the south.
3. Exit the XYgraph display using the File Exit Confirm pulldown menu. 4. Return to the 3D Binning and QC window and select Finalize the database and click Ok.
5. When complete, dismiss the notification window and click Cancel. 6. Select File Exit from the main geometry spreadsheet window to exit.
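The fold shown in a QC fold plot is simply the number of traces whose midpoints fall in each bin. A minimal sketch, using an axis-aligned grid for simplicity (the real grid is rotated to the survey azimuth; `fold_map` is our illustration, not a ProMAX routine):

```python
from collections import Counter

def fold_map(midpoints, origin, dx, dy):
    """Count traces per bin for an axis-aligned grid.
    Fold of a cell is just the number of midpoints inside it."""
    counts = Counter()
    for x, y in midpoints:
        cell = (int((x - origin[0]) // dx), int((y - origin[1]) // dy))
        counts[cell] += 1
    return counts

# four midpoints on a 55-ft grid: two share a bin, so that bin has fold 2
mids = [(10, 10), (20, 15), (70, 10), (10, 60)]
fold = fold_map(mids, (0, 0), 55, 55)
```
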
5. Execute this flow. 6. After the flow has completed, go to the datasets list and press MB2 on the file name. The dataset should now reflect that both the geometry and trace numbers match the database.
SEGY Input
Type of storage to use ------------------------------ Disk Image
Enter DISK file path name --------------------------- /misc_files/3d/manhattan_second_half
MAX traces per ensemble ----------------------------- 240
Remap SEGY header values ---------------------------- Yes
Input/override trace header entries ----------------- sou_sloc,,4I,,181/srf_sloc,,4I,,185/
This will be the 2nd half of the shots. There are 240 traces per ensemble, and you must remember to remap the SOU_SLOC and SRF_SLOC values from the extended SEGY headers to the ProMAX trace headers. In this menu the default values for remapping SOU_SLOC and SRF_SLOC will work fine. 3. Remove the specification for the CDP_SLOC, but make sure you leave the last /. 4. Select the Extract Database Files parameters. This is a Land 3D project where you will identify all traces coming from a common source by their FFID number and all traces recorded at the same receiver by the receiver station number. In this execution we will Append the new information to an existing database and do a full extraction instead of a pre-geometry extraction. 5. In Disk Data Output, enter shots - raw data (2nd) for the new output file name. 6. Execute the flow.
QC the Input with a Basemap 2. Open the Receivers Spreadsheet and generate a Basemap using the View View All Basemap pull down menu.
Note: All of the shots and receivers for the entire project now exist on this map.
3. Use the double fold or cross domain icon to see which receivers are defined as live for each shot. 4. Close the Basemap and the Receiver Spreadsheet windows using the File Exit Confirm and File Abort pull downs respectively.
A dialog box appears with options for assigning the traces to midpoints, defining the bin grid, binning the data, quality controlling the binning, and finalizing the database.
2. Select Assign midpoints by: Existing index number mappings in the TRC, and click Ok. In this case the Assignment step performs the following calculations:
   - Computes the Shot to Receiver Offset (Distance).
   - Computes the Midpoint coordinate between the shot and receiver.
   - Computes the Shot to Receiver Azimuth.
Note: Because we ran the full extraction, every trace already knows which shot and receiver it contributes to. The assignment step also re-assigns all traces; it does not know about the APPEND action that we performed.
An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click Proceed.
A number of progress windows will flash on the screen as this step runs. A final status window should notify that you successfully completed geometry assignment. Click Ok. If this step fails, you have an error in your spreadsheets somewhere. Not much help is given to you, but the problems are usually related to the spread and/or pattern definitions.
Expand the CDP Binning Grid In this sequence we need to add more lines to the existing CDP grid. We need to make sure that we do not alter the X-Y coordinates of the previously existing bins by rerunning the Calc Dim. If we did this we would probably change the bin centers which would invalidate the geometry loaded to the first half trace headers. Part of the strategy here is to avoid having to reload the geometry to the first half. By simply adding lines to the bin grid we are not changing the trace numbers, SIN, SRF, CDP, ILN, XLN, OFB or any other attribute of the traces that have already been processed.
1. Select Define binning grid from the main binning window and click Ok.
This will bring up a small map window. 2. Select Display Midpoint Control Points Black (depending on the color of the background).
Midpoint Scattergram for CDP Binning 3. Select Grid Open and select the grid name that you saved as the final CDP Binning Grid after manual editing.
Half Grid Overlay The grid covers the Eastern part of the survey but must be extended to cover the entire survey. 4. Click the Modify the number of grid cells icon and move your mouse cursor to the western edge of the project. 5. Click MB2 and watch the grid extend to the cursor. 6. Add or delete cells as required.
The final grid should be at an azimuth of 275.6 degrees and have 79 cells in the X direction and 42 in the Y direction. The cells should be 55 ft in each direction. 7. Save the grid from the XYgraph window using the Grid Save to pulldown menu.
8. Enter a new grid name and click Ok. 9. Exit from the XYgraph by selecting File Exit Confirm.
10. Select Bin midpoints and click Ok from the 3D Binning and QC window.
11. Load the grid for the entire project, and reset the offset binning and inlines parallel to the grid X axis parameters. 12. Click Save to save the edited grid information and then click Cancel. 13. Close the 3D Binning and QC window by clicking Cancel. 14. Select File Exit from the main spreadsheet menu to exit the Geometry Spreadsheet.
Complete CDP Binning using the Batch CDP Binning Tool This exercise completes the CDP binning and database finalization steps.
CDP Binning*
Binned Space Name ------------------------ your grid
2. Select the grid you created for the entire project. 3. Execute the flow. This process will perform the CDP binning and Finalization steps in a batch job instead of interactively using the Geometry spreadsheet. 4. When the binning flow completes, generate a QC plot from DBTools by selecting View Predefined CDP fold map.
5. Click on the 0 fold bar in the histogram to display the CDPs with zero fold. 6. From the DBTools display select View Close and then Database Exit.
5. Execute this flow. 6. After the flow has completed, go to the datasets list and press MB2 on the file name. The dataset should now reflect that both the geometry and trace numbers match the database.
Exercise Summary
The following points are important to note:
- The first extraction was run in Overwrite mode.
- The CDP grid was defined to cover the existing project area.
- Since we ran the extraction and output a dataset, we could load the geometry by existing valid trace numbers.
- The second extraction was run in Append mode.
- We expanded the CDP grid after the second extraction, making sure not to alter the grid that was used for the first extract except to add bins for the new inlines.
- The second geometry assignment had no knowledge of the append and therefore reassigned all traces in the database.
- After running the second assignment step we have to completely rebuild the Trace database. The dataset for the first half, however, still matches the database because we did not change the trace numbers, SIN, SRF, CDP, ILN, XLN, or OFB that the traces from the first half contribute to. All we did was add new traces.
- After the second execution of the Inline Geom Header Load we now have two separate datasets, or Superswaths, that we can use to continue processing.
Full Extraction
In the previous exercise, our trace data headers did not contain binning information, so we had to use the Geometry Spreadsheet to create it. However, if you receive data with trace headers that include:
- Source X, Y coordinates and station number,
- Receiver X, Y coordinates and station number,
- CDP X, Y and CDP number,
- Inline and Xline numbers, and
- Offset bin number,
you can use the Extract Database Files tool to automatically build the CDP, ILN, XLN and OFB orders. We refer to datasets with this information as having fully populated trace headers. Once the Extract flow is complete, you will still need to open the LIN-ordered database file editor, unlock the protected fields, and complete the Binning Parameters section. It may be helpful to generate an XYGraph display of CDP: X_Coord, Y_Coord, and Fold from XDB Database Display and use the Grid option to get the azimuth and coordinate information. Since the geometry information is already in the trace headers, you will not need to run Inline Geometry Header Load if you write a new file after the Extraction. This output file is automatically stamped as matching the database. This approach is best suited to datasets that have been previously processed in ProMAX, but it can also be used on SEG-Y formatted data if the trace headers are remapped properly during input. In this exercise, you will read an existing ProMAX format file and extract the geometry from the headers to build a database.
Create a New Line and Run the First Extraction 1. Since you are going to create a new database, the first thing you need to do is to create a new line in your area. This line will be another
copy of the Manhattan3d Project built using Full CDP Extraction. Enter a line name similar to Full CDP Extraction. 2. Build the following flow in your new line:
Edit the LIN Database After the extraction has completed, open DBTools by clicking on Database. 1. From the DBTools main window, select View Lin. 2. From the LIN editor main window, select Database Lock protected fields.
3. Notice that the CDP, SIN, SRF, and TRC counts are accurate.
4. Scroll to the bottom of the LIN Database Editor window and notice that all of the grid parameterization is NULLed. You will need to enter this information manually.
The Grid information is essential to process these data through processes such as Stack3D, 3DDmo and 3D Migration, as well as velocity field processing.
Minimal Database The LIN OPF entries required in a minimal database are:
Entry                      Attribute Name
3-D flag                   I3Dz
Marine flag                IMARINEz
Minimum CDP number         MINCDPz
Maximum CDP number         MAXCDPz
CDP number increment       INCCDPz
Number of inlines (3D)     NILINESz
Entry                                              Attribute Name
Number of crosslines (3D)                          NXLINESz
Minimum inline number (3D)                         MINILINz
Maximum inline number (3D)                         MAXILINz
Distance between CDPs in inline direction (3D)     DCDPILNz
Minimum crossline number (3D)                      MINXLINz
Maximum crossline number (3D)                      MAXXLINz
Distance between CDPs in crossline direction (3D)  DCDPXLNz
X origin of 3D grid                                X3DORIGz
Y origin of 3D grid                                Y3DORIGz
X coord of far end of first inline                 XILNENDz
Y coord of far end of first inline                 YILNENDz
ProMAX will attempt to resolve (x,y) coordinates in the following sequence: first, it will look for valid (x,y) coordinates in the trace headers; if that fails, it will attempt to resolve (x,y) using the LIN-ordered parameter file; finally, it will revert to looking up (x,y) in the database. Most ProMAX tools only need a minimal database in addition to the (x,y) coordinates in the trace headers. If none of the LIN parameters have been defined, only tables with primary keys other than CDP can be created manually or picked interactively, and they will be interpolated linearly based on the primary key. If the 3D flag is set in the LIN OPF, tables with primary keys other than CDP can be picked interactively and will resolve (x,y) coordinates from the data trace headers, yielding true x,y interpolation. Once the LIN parameters have been defined in the limited database, CDP-ordered tables will be interpolated with true x,y positions.
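The lookup order just described can be sketched as a fallback chain. The dict-based stores below are stand-ins for illustration, not real ProMAX structures:

```python
def resolve_xy(header, lin_params, database, cdp):
    """Order of precedence: (1) valid coordinates in the trace header,
    (2) the LIN-ordered parameter file, (3) the database proper.
    All three containers here are plain dicts standing in for ProMAX stores."""
    if header.get("x") is not None and header.get("y") is not None:
        return header["x"], header["y"]          # trace header wins
    if cdp in lin_params:
        return lin_params[cdp]                   # LIN parameter file next
    return database[cdp]                         # database as last resort

# header lacks coordinates, so the LIN parameters resolve the CDP
xy = resolve_xy({"x": None, "y": None}, {101: (500.0, 750.0)}, {}, 101)
```
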
Full Database
The following tools require a full and complete database:
Geometry
  Database/Header Compare
  Merge Database Files
  Database/Header Transfer
  Database Parameter Merge
  CDP Taper
  Assign CDP Flex Binning
  Expand Flex Binning
  Inline Geom Header Load
  Graphical Geometry QC
  Source Receiver Geom Check
Editing/Muting/Noise
  NN Trace Editor Training
  NN Reversed Trace Training
  NN Trace Editor
  NN Reversed Trace Editor
  Trace Statistics (unless run on Headers Only)
  Ensemble Statistics
Amplitude/AVO
  Surface Consistent Amplitudes
Statics
  Apply Elevation Statics
  Datum Statics Calculation
  Datum Statics Apply
  Apply User Statics
Refraction/Residual Statics
  Apply Refraction Statics
  Apply Residual Statics
  Apply Trim Statics
  First Break Picking (unless run on Headers Only)
  NN First Break Picker
  3D Ref Statics Model
  Refraction Statics Calculation
  3D Reflection Correlation Autostatics
  3D Ref Statics Inversion
  3D Ref Statics Computation
  External Model Correlation
  Solve Integrated Statics
  EMC Autostat: Gauss-Seidel
  EMC Autostat: Xcor Sum
  2D/3D Max. Power Autostatics
Deconvolution
Display
  Database To Zycor ASCII
  Make Database Basemap
  Contour Database Parameter
  View Database Basemap
Miscellaneous
Chapter 4
Geometry
  full extraction
  complete database
  load geometry
Trace Processing
  preprocessing
  brute stack
  refraction statics
  residual statics
  velocity analysis
Imaging
  dip moveout
  final stack
  migration
[Flow diagrams: dataset and flow sequence for statics, DMO, and migration]
- Build External Model flow: input is stack data with refraction statics applied (4 ms, 16 bit).
- Residual statics flow: input is pre-stack data with 8 ms NMO and refraction statics applied (8 ms, 8 bit).
- Residual statics stack flow: input is pre-stack data with refraction statics applied (4 ms, 16 bit).
- Prepare data for 3D DMO To Gathers: input is pre-stack data with refraction statics applied (4 ms, 16 bit) plus the final velocities before DMO; output is pre-stack data with final statics, NMO and MUTE (4 ms, 16 bit).
- Velocity Analysis flow after DMO: input is pre-stack data with final statics, NMO, MUTE, DMO, inverse NMO, BP and AGC, on velocity lines only (8 ms, 8 bit).
- DMO to Stack 3D flow: Prepare data for Stack DMO takes pre-stack data with refraction statics applied (4 ms, 16 bit) plus the final dip-independent RMS velocity field and produces pre-stack data with final statics, dip-independent NMO and MUTE (4 ms, 16 bit).
- Migration flow: Velocity Model Conditioning for Migration turns the final dip-independent RMS velocity field into migration velocities, which are applied to the stack with final statics, velocities, and DMO applied (4 ms, 16 bit).
Chapter 5
Identifying Analysis Locations You will need to determine which shots to use when you pick your tables and check the need for space variance. You can generate a map of the shot locations and pick a few shots based on their SOURCE numbers. This introduces a major difference between ProMAX 2D and 3D. In ProMAX 3D all parameter tables are interpolated based on their X and Y locations. In ProMAX 2D all interpolation is done linearly by primary sort key. Another issue here is that you may end up having to read many tapes to capture the shots of interest. In this flow we will output a dataset with just the selected shots. This dataset will come in handy several times during the course of the processing exercise. We will use the dataset to pick the parameter tables, train the neural network, and serve as input to the Datum Statics Apply flows. Having a few shot records immediately available on disk will be a valuable resource and make parameter testing run much faster. 1. From the line level, click Database to bring up DBTools and generate the following display from the SIN order: Select View 2D Matrix... from the DBTools main window pulldown menu.
When the Create 2D Crossplot Matrix attribute selection window appears, click the tab for the SIN order. Next, select the X_COORD, Y_COORD, SOURCE, SOURCE attributes and click OK.
[Figure: Shots for Parameter Table Picking — a scattergram of shot locations with SOURCE numbers 1001, 1014, 1067, 1071, and 1081 highlighted]
2. Enable Sample Tracking, point to a few shots, and note their SOURCE numbers.
You may elect to view 5 shots: one from each corner and one from the center of the project. For example, use SOURCE numbers 1001, 1014, 1067, 1071 and 1081.
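As noted above, 3D parameter tables are interpolated by X-Y position rather than linearly by sort key. A toy inverse-distance-weighting sketch illustrates the idea; ProMAX's actual interpolator is not documented here, and `idw` is our own illustrative function:

```python
def idw(points, x, y, power=2.0):
    """Inverse-distance-weighted estimate of a table value at (x, y).
    points: list of (px, py, value) control points, e.g. velocity picks
    at the five shot locations chosen above."""
    num = den = 0.0
    for px, py, v in points:
        d2 = (px - x) ** 2 + (py - y) ** 2
        if d2 == 0.0:
            return v                       # exactly on a control point
        w = d2 ** (-power / 2.0)           # weight falls off with distance
        num += w * v
        den += w
    return num / den

pts = [(0, 0, 1000.0), (100, 0, 2000.0)]
v = idw(pts, 50, 0)   # midway between two equally weighted functions
```
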
Pick a Top Mute and Miscellaneous Time Gate 1. Build the following flow:
2. Sort the input based on Source Numbers with a secondary sort of channel. Select a few shots around the project to QC the space variance of the parameter picking tables. 3. Output a dataset to disk containing your 5 shot records. 4. Reread the 5 test shots file, sorting by OFFSET.
Note: In 3D, offset and Aoffset are equal since there are no negative offsets.
5. All parameters in the AGC can be defaulted. 6. Select to plot 5 shots and use the Grayscale color scheme in the Trace Display. 7. Execute the flow. 8. Pick a first break suppression mute using the Picking Pick Top Mute... pull down to remove the first arrivals.
Note: Use one of the shots from the corners for picking since the middle shot only contains half of the offsets. Project to the other shots and repick if required.
9. Pick a miscellaneous time gate using the Picking Pick Miscellaneous Time Gates... pulldown to use as a time window for the deconvolution design gate.
Example Mute and Design Gate 10. Exit and stop the flow by selecting File Exit/Stop. 11. A window will pop up asking if you want to save the edits. Click Yes to save the mute and time gate that you just created.
Build a Flow to look at a power spectrum before and after decon 1. Build the following flow:
Interactive Spectral Analysis - Simple Mode You can control the contents of the display by using the View Visibility pull down menu. You can then select the individual tiles of interest. 2. Exit from the display using the File Exit/Stop Flow pull down menu. 3. Edit the parameters for Interactive Spectral Analysis and select Single Subset instead of Simple for the Data selection method. 4. Execute the flow again.
Interactive Spectral Analysis - Single Subset Mode In this mode you can select a Single Subset of the available data for the purposes of computing the average power and phase spectra. You can change your mind about the single subset as many times as you want.
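The average power spectrum the tool computes over a selected subset can be approximated with a direct DFT. `power_spectrum` and `average_power` are our illustrative names, not ProMAX functions:

```python
import cmath
import math

def power_spectrum(trace):
    """Power per frequency bin via a direct DFT (slow but transparent)."""
    n = len(trace)
    spec = []
    for k in range(n // 2 + 1):
        s = sum(trace[t] * cmath.exp(-2j * math.pi * k * t / n)
                for t in range(n))
        spec.append(abs(s) ** 2 / n)
    return spec

def average_power(traces):
    """Average the power spectra of several traces, as the tool does
    for all traces inside the selected rectangle."""
    spectra = [power_spectrum(tr) for tr in traces]
    nbins = len(spectra[0])
    return [sum(s[k] for s in spectra) / len(spectra) for k in range(nbins)]

# a pure one-cycle sine over 8 samples concentrates power in bin 1
sine = [math.sin(2 * math.pi * t / 8) for t in range(8)]
avg = average_power([sine, sine])
```
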
5. Click on the Select Rectangular Region icon and then draw a box around an area of interest. The data window and spectral windows will change configuration to match your data selection.
6. Exit from the display using the File Exit/Stop Flow pull down menu. 7. Edit the parameters for Interactive Spectral Analysis and select Multiple Subsets instead of Single Subset for the Data selection method. Also select Yes to Freeze the selected subsets. 8. Execute the flow again.
You should get the following display: Interactive Spectral Analysis - Multiple Subset Mode
9. Select the Select Rectangular Region icon and draw a box around an area of interest and then select Options Spectral Analysis from the pulldown menu.
10. If you select a new area and reselect Options Spectral Analysis, a new window will appear. In this way you can compare the spectral results for different areas.
IF
Trace Selection MODE-------------------------------------Include SELECT Primary trace header word -------------- REPEAT SPECIFY trace list ----------------------------------------------------1
ELSEIF
Trace Selection MODE-------------------------------------Include SELECT Primary trace header word -------------- REPEAT SPECIFY trace list ----------------------------------------------------2
Trace Muting
Select mute parameter file --------------- first break mute
Spiking/Predictive Decon
Use all defaults except... Select decon gate parameter file -------------- decon gate
13. Click on the Data Next Data pulldown menu to display the data after decon. 14. Click the Next Screen icon and select the Analysis Options Spectral Analysis pull down menu again to show the spectral estimate for the data after decon. You can experiment with selecting subsets of the shot record before and after decon. Notice how it remembers the selection window as you change from one shot to the next.
Elevation Statics
Datum static corrections are generally required for land data to compensate for adverse travel-time effects of topography and variations in weathering thickness and velocity. The process of calculating and applying datum statics within ProMAX includes the following steps:
- Compute static time shifts to take the seismic data from their original recorded times to a time reference as if the data were recorded on a final datum (usually flat) using a replacement velocity (usually constant).
- Compute a floating datum (N_DATUM), a smoothed surface used as the processing datum or NMO datum.
- Partition the total statics into two parts, the Pre (before) NMO term and Post (after) NMO terms relative to N_DATUM.
- Apply the Pre (before) NMO portion of the statics and write the remainder to the trace header.
The first three steps occur in the calculation phase and the last step in the apply phase. The calculation phase uses your parameters in combination with the information in the database, and the results are saved in the database. The apply phase reads the information from the database and transfers it to the trace headers. ProMAX offers several options for both phases; which option you should use depends on how you are processing your data.
Apply Elevation Statics The first option is to simply add Apply Elevation Statics to your flow. Apply Elevation Statics, despite its name, both calculates and applies the elevation statics. Because it both reads from and writes to the database, which is shared amongst all the datasets within the Area/Line, you could have a problem if you attempt to run more than one instance at the same time. Therefore, if you are processing a large project in parts, as we are for this training class, you will need to wait for Apply Elevation Statics to complete before you run it again for the other datasets in your project. When you run Apply Elevation Statics again for the additional dataset parts, you will automatically recalculate the datum statics in the
database for the entire project, even though you are only updating the headers for the input dataset. In a large project, the time spent doing the redundant datum statics calculation can be substantial especially if combined with having to wait to get access to the database.
Datum Statics Calculation and Datum Statics Apply To help alleviate these problems, Apply Elevation Statics was split into two separate modules, Datum Statics Calculation and Datum Statics Apply. In a typical workflow for large volume land processing, you would run Datum Statics Calculation once to update the entire project database and then run Datum Statics Apply for each dataset comprising the project. Since Datum Statics Apply only reads the precalculated and saved information in the database and transfers it to the trace headers, you avoid repeating the calculation phase in Apply Elevation Statics so processing time is saved and the possibility of having several flows trying to write to the database at the same time is eliminated. In addition, Datum Statics Calculation offers the ability to run multiple times and save the output from each run under a unique Run ID. This feature is handy when you wish to compare the results using different parameters as we will do in the next exercise. Before we begin the exercise, let us look at ProMAX datum statics terminology and the calculation algorithms in more detail.
Datum Statics Terminology With ProMAX datum statics, you have the option to shift prestack data to a floating datum or a final datum. You supply a final datum elevation and a replacement velocity. The elev_stat_math file then establishes values in the database for F_DATUM, N_DATUM, S_STATIC, R_STATIC, and C_STATIC. Details of this process can best be understood by examining the contents of the elev_stat_math file. This file typically resides in $PROMAX_HOME/port/misc/elev_stat_math. Elevation statics then creates three new header entries: NMO_STAT, FNL_STAT, and NA_STAT. The integer-multiple-of-a-sample-period portion of NMO_STAT shifts traces to the floating datum in the apply phase. The fractional-sample-period portion is written to the NA_STAT header entry and applied later.
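The whole-sample/fractional-sample split of NMO_STAT can be sketched as follows. Whether ProMAX rounds or truncates when taking the integer multiple is not stated here, so this sketch truncates toward zero as a labeled assumption:

```python
def split_static(total_ms, sample_ms):
    """Split a static shift into a whole-sample part (applied in the
    apply phase) and a fractional-sample remainder (the NA_STAT idea).
    Truncation toward zero is our assumption, not a documented rule."""
    n = int(total_ms / sample_ms)     # integer number of sample periods
    whole = n * sample_ms             # portion applied as a bulk shift
    return whole, total_ms - whole    # remainder carried for later

# a 13 ms static at a 4 ms sample rate: 12 ms applied, 1 ms deferred
whole, frac = split_static(13.0, 4.0)
```
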
If you select to process to a final datum, C_STATIC is set to zero. Since NMO_STAT = S_STATIC + R_STATIC + C_STATIC and C_STATIC = -1.0*FNL_STAT, NMO_STAT is the static that shifts traces to the final processing datum, and FNL_STAT is zero because your data are at the final datum.
[Figure: datum statics geometry — S.P., CDP, and Receiver positions, with S_STATIC, R_STATIC, and C_STATIC shown relative to F_DATUM]
Database Attributes:
N_DATUM = floating datum
F_DATUM = final datum
S_STATIC = (F_DATUM - ELEV + DEPTH) / DATUMVEL
R_STATIC = [(F_DATUM - ELEV + DEPTH) / DATUMVEL] - UPHOLE
C_STATIC = 2 * [(N_DATUM - F_DATUM) / DATUMVEL]
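The database-attribute formulas above translate directly into code. This is a literal transcription for illustration; the example values (elevations in feet, replacement velocity in ft/s) are ours:

```python
def datum_statics(elev, depth, uphole, f_datum, n_datum, datum_vel):
    """Direct translation of the database-attribute formulas above.
    Times come out in units of distance/velocity (seconds if feet
    and ft/s); ProMAX stores statics in milliseconds."""
    s_static = (f_datum - elev + depth) / datum_vel
    r_static = (f_datum - elev + depth) / datum_vel - uphole
    c_static = 2.0 * (n_datum - f_datum) / datum_vel
    return s_static, r_static, c_static

# shot at 1000 ft elevation with a 100-ft hole and 0.01 s uphole time;
# final datum 900 ft, floating datum 950 ft, replacement velocity 10000 ft/s
s, r, c = datum_statics(1000.0, 100.0, 0.01, 900.0, 950.0, 10000.0)
```
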
Comparison of Smoothed Surfaces based on CDP Smoothing 3D processing frequently requires you to compare the results of various smoothing parameters on the floating datum surface. In the next exercise we will calculate elevation statics using various settings for the smoothing parameters and compare the surfaces using the Wire Frame display in 3D XDB.
Build and Execute a Flow to Compute the N-Datum 1. Build the following flow:
appears, click 3D on the menubar and then your area, line, CDP, X_COORD, Y_COORD, and either ELEV or N_DATUM.
51 point smoother
4. Rerun the same flow, but change the smoothing parameter to 15, increment the Run ID to 02, and regenerate the wireframe plot of the new N_DATUM:
5. Compare these displays and decide on a value that does sufficient smoothing without greatly changing the local elevation.
NOTE: This project is very small with little elevation change. In normal production the default smoother may be adequate. It is still worth looking at the displays of ELEV vs. N_DATUM after calculating the elevation statics for QC. This is especially true in areas of highly variant surface elevations.
One major criterion that you might use to help diagnose a good value is to look at the value of N_DATUM in the area of a proposed supergather for velocity analysis. You would prefer that all CDPs in a supergather have the same (or very similar) N_DATUM value.
Superswath Processing
The idea of splitting a data set into multiple parts and processing the parts simultaneously has proven to be a valuable technique for processing efficiently. A superswath is simply a subset of the entire dataset after a particular processing step. Instead of processing the entire data volume as one dataset, the volume is split up into several smaller, more manageable datasets.
[Figure: six swaths (Swath 1 through Swath 6) bundled into easy-to-manage partitions]
ProMAX Process Flow Process each partition separately up to Stack or DMO Stack
The superswath strategy allows you to exploit the available hardware and parallel processing techniques.
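The bundling idea can be sketched as a simple partition of the traces by a header key. `superswaths` is our illustrative helper, not a ProMAX tool:

```python
def superswaths(traces, key, n_swaths):
    """Split a trace list into roughly equal swaths by a header key
    (e.g. inline number) so each part can be processed independently."""
    ordered = sorted(traces, key=key)
    size = -(-len(ordered) // n_swaths)   # ceiling division
    return [ordered[i:i + size] for i in range(0, len(ordered), size)]

# ten "traces" split into three swaths by their own value
swaths = superswaths(list(range(10)), key=lambda t: t, n_swaths=3)
```
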
Preprocessing Flows
The following exercise produces preprocessed, prestack data with elevation statics applied. In this set of exercises we will actually run two flows simultaneously. The first flow will process the first half of the shots. The second flow will be identical except that it will process the second half. Since the elevation statics were already calculated in the database, we only have to apply them to dataset trace headers using Datum Statics Apply.
Build a Flow to Perform the Preprocessing on the First Half 1. Build the flow outlined on the next page: 2. In Disk Data Input, input the first half shot file. 3. Select the mute you picked in Trace Mute. 4. Set True Amplitude Recovery parameters. Apply spherical divergence using the time-velocity pairs given below and a 6 dB/sec gain correction. Select No for velocity input using a parameter table and enter the following time-velocity pairs: 0-9700,500-11200,1200-12500,2000-14000 5. Set Spiking/Predictive Decon parameters. You can use all of the default parameters except that you need to input a previously picked miscellaneous time gate for the decon design gate. 6. Bandpass Filter (Optional). You can apply a bandpass filter in the decon process if desired. 7. In Datum Statics Apply, select the source, receiver, and CDP statics parameters corresponding to the Run ID with the best smoothed surface.
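The time-velocity pairs in step 4 are linearly interpolated to a velocity function of time. The sketch below also shows one common form of the spherical-divergence-plus-dB/sec gain; ProMAX's exact scaling may differ, and both function names are ours:

```python
def interp_vel(tv_pairs, t):
    """Linear interpolation of (time_ms, velocity) picks at time t (ms)."""
    pts = sorted(tv_pairs)
    if t <= pts[0][0]:
        return pts[0][1]
    for (t0, v0), (t1, v1) in zip(pts, pts[1:]):
        if t <= t1:
            return v0 + (v1 - v0) * (t - t0) / (t1 - t0)
    return pts[-1][1]                     # hold the last pick

def tar_gain(t_ms, tv_pairs, db_per_sec=6.0):
    """One common spherical-divergence correction, t * v(t)^2,
    combined with a dB/sec gain ramp (our assumed form)."""
    t = t_ms / 1000.0
    g = t * interp_vel(tv_pairs, t_ms) ** 2
    return g * 10.0 ** (db_per_sec * t / 20.0)

pairs = [(0, 9700), (500, 11200), (1200, 12500), (2000, 14000)]
v = interp_vel(pairs, 250)   # halfway between the first two picks
```
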
Trace Muting
Select mute parameter file --------------- first break mute
Spiking/Predictive Decon
Use all defaults except... Select decon gate parameter file -------------- decon gate
Label the Dataset and Output to Disk 8. Add Trace Display Label. Use text similar to decon and elev statics.
9. Set the Disk Data Output parameters. Output a new file, describing the volume as shot organized with decon and elevation statics applied for the first half of the data volume, such as shots - preprocessed (1st). 10. Execute the flow.
Build a Flow to Perform the Preprocessing on the Second Half
1. Copy the flow that you just built to a new flow:
Datum Statics Apply
True Amplitude Recovery
Trace Mute
Spiking/Predictive Decon
Trace Display Label
Disk Data Output
Output Dataset ------------------ shots - preprocessed (2nd)
2. Change the input to your file for the second half.
3. Add a new output file for the preprocessed second half data.
4. Execute the flow.
NOTE: Using this technique you can execute these two flows simultaneously on the same or different machines.
Chapter 6
3. Enter the description name for your imported velocity. Use a name similar to imported from ascii file. This opens a parameter table editing window in the form of a spreadsheet.
4. Select File Import.
This opens two more windows, an empty viewing window and a file selection window.
5. Input the absolute path name to the directory where the velocity file is stored and append a /* to the end of the path (/misc_files/3d/*). Click Filter. This lists the files and directories stored in the specified directory.
6. Select the file as indicated by your instructor and click OK. This opens the file and shows the contents in the Import viewing window.
7. Click Format.
8. Enter a new format definition name or select a previously defined format (you probably do not have one yet).
9. Click OK. Another window will open listing CDP, X Coor, Y Coor, Inline, Xline, TIME and VEL_RMS.
10. Click CDP and then drag the mouse over the appropriate columns on the import file window to define the columns containing the CDP value.
13. Select Overwrite ALL existing values with new import values and click OK.
NOTE: The input file contains velocity functions using the 3D CDP number as reference. ProMAX 3D requires that all parameter tables be referenced to X and Y coordinates.
14. Use the Edit Resolve pull down menu to compute the X and Y coordinates for each 3D CDP number.
15. Select Coordinates from the CDP ensemble numbers.
16. Use the Edit Resolve pull down menu to compute the Inline and Crossline values for each 3D CDP number.
All columns should now be complete. This input table shows velocity functions on Xlines 20, 40, and 60 on Inlines 5, 15, 25, and 35 for a total of 12 velocity functions set on a regular grid.
17. Click on File Exit to save the parameter table and exit the editor.
18. Check the table by going back to the list of tables from the User Interface and selecting to edit the table.
19. Click Edit and then select the table name.
Notice that the table does not contain the Inline and Crossline values that we resolved for it. This is normal behavior. The Inline and Crossline numbers are not stored with the table.
20. Select File Abort to exit the parameter table editor.
Interpolating velocity tables is a two-step operation. A value at each of three velocity nodes is found at the desired time, and then the velocity is interpolated between the nodes using the Delaunay triangle approach.
[Figure: Delaunay triangle with velocity nodes a, b, and c surrounding the interpolation point p]
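As an illustrative sketch of the second step (not ProMAX code), the velocity at a point p inside the triangle formed by nodes a, b, and c can be computed with barycentric weights; va, vb, and vc are assumed to be the node velocities already interpolated to the desired time:

```python
def delaunay_interp(a, b, c, p, va, vb, vc):
    """Barycentric interpolation of velocities va, vb, vc defined at
    triangle vertices a, b, c to the interior point p (all (x, y))."""
    (xa, ya), (xb, yb), (xc, yc) = a, b, c
    det = (yb - yc) * (xa - xc) + (xc - xb) * (ya - yc)
    wa = ((yb - yc) * (p[0] - xc) + (xc - xb) * (p[1] - yc)) / det
    wb = ((yc - ya) * (p[0] - xc) + (xa - xc) * (p[1] - yc)) / det
    wc = 1.0 - wa - wb          # weights sum to one
    return wa * va + wb * vb + wc * vc
```

A point midway between two nodes receives the average of their two velocities, and the weights vary linearly across the triangle, so the interpolated field is continuous across triangle edges.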
Inline Sort
PRIMARY sort key ----------------------------- CDP bin number
SECONDARY sort key ----- Signed source-receiver offset
Maximum traces per output ensemble ---------------------- 30
Ensemble Stack/Combine
Normal Moveout Correction
Bandpass Filter
Automatic Gain Control
Trace Display
Disk Data Input
Disk Data Insert
Inline Sort
Ensemble Stack/Combine
Type of operation --------------------------------- Combine Only
Input ensembles per output ensemble -----------------------7
Maximum traces per output ensemble -------------------200
Warnings if max traces/ens exceeded?------------------Yes
Primary Trace Order Header Word ------ (CDP) CDP bin number
Average the primary key values?-------------------------- Yes
Average the X and Y coord. of the primary key--------Yes
SECONDARY Trace Order Header Word ------ (OFFSET) Signed source-receiver offset
Output trace secondary key order -------------- Ascending
Bandpass Filter
All default values are acceptable
Trace Display
Primary trace LABELING ------------------------------------ CDP
Secondary trace LABELING --------------------------- OFFSET
1. In the Disk Data Input, input the first half shot organized file that you created with TAR and Decon applied. Select to only read 7 CDPs starting at 3D CDP number 1615.
2. In the Disk Data Insert, input the second half shot organized file that you created with TAR and Decon applied. Select to only read 7 CDPs starting at 3D CDP number 1615. You can make a display from the Database to help select the CDP numbers. Make a 3D: XYGRAPH: CDP: ILN,XLN,FOLD display and then convert the inline, crossline coordinates to CDP numbers. In this case the equation would be (IL-1)*79+XLN.
3. Use the Inline Sort process to rebuild continuous CDP ensembles since the selected CDPs are in the overlap zone between the two files.
4. Use Ensemble Stack/Combine to build one ensemble of all the input traces. Select to join the 7 CDP ensembles into one ensemble and order the traces by OFFSET. Specify a high number of traces per output ensemble (200). Remember to average the CDP numbers and X, Y coordinates.
5. Apply NMO using your best velocities available. Remember to set the stretch mute to 0.0, disabling it.
6. Apply a Bandpass Filter and AGC for data enhancement. The default parameters will be adequate.
7. Display the single ensemble.
8. Pick a top mute to remove any unwanted data, interpolating the times as a function of AOFFSET.
9. Using the Picking Pick Top Mute... pull down menu, input a new mute table name such as Post-NMO mute (brute) and click OK.
10. Select to interpolate the time picks as a function of AOFFSET. 11. Pick the mute.
Be careful not to get too close to zero offset at time zero. Normally, you will need to keep the near traces intact for the stack of the shallow section.
12. Select File Exit/Stop Flow to save the table, exit, and stop the flow.
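The inline/crossline-to-CDP relation used above, (IL-1)*79+XLN, and its inverse can be written as a pair of hypothetical helper functions (the constant 79 is this project's crossline count; these helpers are for illustration only and are not part of ProMAX):

```python
# Hypothetical helpers illustrating this project's CDP numbering:
# cdp = (inline - 1) * 79 + crossline, with inline and crossline 1-based.

XLINES_PER_INLINE = 79  # number of crosslines per inline in this survey

def cdp_number(inline, xline):
    """3D CDP bin number from (inline, crossline)."""
    return (inline - 1) * XLINES_PER_INLINE + xline

def inline_xline(cdp):
    """Inverse mapping: recover (inline, crossline) from a CDP number."""
    inline = (cdp - 1) // XLINES_PER_INLINE + 1
    xline = (cdp - 1) % XLINES_PER_INLINE + 1
    return inline, xline
```

By this formula, the starting CDP 1615 used above corresponds to inline 21, crossline 35, and it also explains the Stack 3D parameters later in the chapter: a file spanning CDPs 4 through 2036 spans inlines 1 through 26.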
Stack 3D
Sorting data for a large 3D volume can be time consuming and expensive. ProMAX 3D offers the capability to generate partial CDP stacks from input files of any primary ensemble and merge these partial stacks together into one final CDP stack data volume. Of course, if all traces are input to a single Stack 3D execution, no merging is necessary. In the following exercises we will create two separate partial stacks and merge them together.
[Figure: superswath partial stack volumes; inline boundaries at 16, 26, and 42]
Trace Muting
SELECT mute parameter file ---- Post-NMO mute (brute)
Stack 3D
Enter name of host ----------------------------
Number of worker threads----------------------------------------1
Restart with an existing stack? ------------------------------No
Minimum inline number -------------------------------------------1
Maximum inline number -----------------------------------------26
Minimum crossline number---------------------------------------1
Maximum crossline number ------------------------------------79
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Apply final datum statics after stack? -------------------yes
Size of input trace memory buffer (MB)---------------------- 4
Size of stack trace memory buffer (MB) ----------------------4
with decon and elevation statics applied.
3. Before selecting the file, click MB2 on the filename and look at the minimum and maximum CDP that this file contributes to. We will use this to set the parameters for Stack 3D.
This file contributes to CDPs 4-2036, which translates to Inlines 1-26. Select Get All.
4. In the Normal Moveout Correction, select the velocity table that we built previously and set the stretch mute to 0.0 percent, thus disabling it.
5. Apply the post-NMO mute picked previously in the Trace Muting flow. In Stack 3D, enter the minimum and maximum lines contributing to the input dataset. In this case the file contributes to lines 1 through 26.
6. Since there will be several jobs running simultaneously, we will need to reduce the memory usage for each job. Set the memory to 4 MB for both the input and output buffers.
7. In Trace Display Label, label this as the initial stack on the first half of the project.
8. In Disk Data Output, output a stack with decon and elevation statics from Stack 3D.
Run Stack 3D on the Other Superswath
1. Copy the flow for the first half stack to a new flow for the second half:
Stack Merge 3D
Enter name of host ----------------------------
Restart with an existing stack? ------------------------------No
Minimum inline number -------------------------------------------1
Maximum inline number -----------------------------------------42
Minimum crossline number---------------------------------------1
Maximum crossline number ------------------------------------79
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Size of input trace memory buffer (MB)-----------------------4
Size of stack trace memory buffer (MB) ----------------------4
Alternative Stack Merge Method ------ DO NOT BUILD THIS FLOW
The following flow is shown for comparison only:
Stack Merge 3D
Restart with an existing stack? -----------------------------Yes
Select existing stack filename -----------Stack - initial (1st)
Subtract input from stack? -------------------------------------No
Minimum inline number -------------------------------------------1
Maximum inline number -----------------------------------------42
Minimum crossline number---------------------------------------1
Maximum crossline number ------------------------------------79
The Subtraction option
This option in the Stack Merge is intended as a means by which you can remove the effects of one (or more) bad traces from a stack volume. This is actually more appropriate for merging DMO to
Stack 3D volumes, where a few bad traces have influenced many output traces. If bad traces are detected, the input process, Stack 3D or DMO to Stack 3D, can be rerun with these traces only. The resulting partial stack of the bad traces can then be subtracted from the total instead of rerunning the entire stack and merge flows to eliminate the bad input traces.
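The merge and its subtraction option can be sketched as follows, under the assumption that each partial stack carries per-bin running amplitude sums and fold, with the final trace normalized by fold raised to the exponent (0.5 in the flows above). This is a scalar-per-bin simplification for intuition, not ProMAX internals; real traces carry an array of samples with time-varying fold, which is what the normalization-scalars-per-trace parameter accounts for.

```python
# Hedged sketch (not ProMAX internals) of partial stack merge/subtract.
# Each partial stack maps bin -> (amplitude_sum, fold); one scalar per
# bin here for brevity.

EXPONENT = 0.5  # exponent of normalization factor, as parameterized above

def combine(a, b, sign=+1):
    """Merge (sign=+1) or subtract (sign=-1) partial stack b into a."""
    out = dict(a)
    for bin_id, (s, fold) in b.items():
        s0, f0 = out.get(bin_id, (0.0, 0))
        out[bin_id] = (s0 + sign * s, f0 + sign * fold)
    return out

def normalize(stack):
    """Final stacked value per bin: sum / fold ** EXPONENT."""
    return {b: s / (f ** EXPONENT) for b, (s, f) in stack.items() if f > 0}
```

Because sums and folds add linearly, subtracting a rerun partial stack of only the bad traces yields the same result as restacking everything without them.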
Inline Displays
The following exercise produces inline displays of the 3D stack volume.
1. Build the following flow:
Bandpass Filter
The default parameters will be adequate
Trace Display
Number of ENSEMBLES / screen ----------------------------10
Primary trace LABELING --------(ILINE_NO) 3D inline no
Secondary trace LABELING ---(XLINE_NO) 3D crossline
2. In Disk Data Input, input the initial stack (after merge) data volume. Sort the input with a primary sort of inline and secondary of crossline. Set the sort order to 1, 5-40(5), 42: */
3. Apply a Bandpass Filter and AGC to make the stack look better.
4. Set the Trace Display parameters. Select to plot 10 ensembles and set the primary and secondary annotations as ILINE_NO and XLINE_NO. This will plot too many traces per screen on the first plot, but Trace Display allows you to zoom and scroll.
5. Execute the flow.
Crossline Displays
The following exercise produces crossline displays of the 3D stack volume.
1. Build the following flow: This flow is very similar to the Display Inlines flow that you just built. You may want to copy that flow to save some work.
Bandpass Filter
The default parameters will be adequate
Trace Display
Number of ENSEMBLES / screen ----------------------------10
Primary trace LABELING -------(XLINE_NO) 3D crossline
Secondary trace LABELING -----(ILINE_NO) 3D inline no
2. In Disk Data Input, input the initial stack (after merge) data volume.
Sort the input with a primary sort of crossline and secondary of inline. Select to plot crosslines and inlines as follows: 1, 10-70(10), 79: */
3. Apply a Bandpass Filter and AGC to make the stack look better.
4. Set the Trace Display parameters. Select to plot 9 ensembles and set the primary and secondary annotations as XLINE_NO and ILINE_NO. This will plot too many traces per screen on the first plot, but Trace Display allows you to zoom and scroll.
5. Execute the flow.
Time Slice Displays
The following exercise produces time slice displays of the 3D stack volume.
1. Build the following flow:
Trace Display
number of ENSEMBLES/screen ----------------------------- 16
Trace gap between ensembles----------------------------------3
Trace display MODE----------------------------------- Grayscale
Primary trace LABELING ---------- slc_time (user defined)
Secondary trace LABELING -----(ILINE_NO) 3D inline no
2. Set Time Slice Input parameters.
Select to plot time slices every 100 ms between 200 and 1700 msec. This will produce 16 time slices.
[Figure: time slice display orientation, with Inline Number (42 to 1) along the horizontal axis and Crossline Number (79 to 1) along the vertical axis]
Time Slice Display Orientation
To match the display and a map view of this project, set the Horizontal axis to be Inlines Decreasing to the right and set the Vertical axis to Decrease, thus plotting Xline 79 on the top and inline 1 on the right.
3. Select the Trace Display parameters. Select 16 for ensembles to plot and set the Primary Annotation to slc_time (a user defined header) and the secondary annotation to ILINE_NO. You may also find that using a Grayscale mode makes the display more visually appealing. Adding a larger gap between ensembles will also help visually separate the time slices.
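The slice count can be checked directly: 100 ms increments from 200 to 1700 ms inclusive yield 16 slices.

```python
# Time slices every 100 ms from 200 to 1700 ms inclusive.
slice_times = list(range(200, 1701, 100))
print(len(slice_times))  # 16
```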
ProMAX 3D Viewer
The ProMAX 3D Viewer is a 3D data volume visualization tool. It allows you to view a post-stack 3D data volume rendered in 3D space, where inlines, crosslines, and time slices can be viewed simultaneously. Box, or cube, displays can also be generated. The ProMAX 3D Viewer can display seismic trace data and/or 3D velocity tables. Horizons, faults, and other interpretive entities cannot be viewed using the viewer, but they can be viewed given a full OpenVision license and access to the OpenWorks Oracle database. In this section you will learn how to start the viewer and add different components to the display.
1. Build and execute the following flow:
ProMAX 3D Viewer*
There are no parameters that need to be set for this standalone process. The program may prompt you as to which monitor to display the viewer on, and a second prompt will ask where to display the data selection dialog boxes. By default the program will start on the higher level graphics screen, and you may only get one screen selection
opportunity. When the initialization is complete you should get a display similar to the following example:
2. Use the Data ProMAX pulldown menu to access seismic data and/or velocity cube data. Let's start with the Seismic Animator. A dialog box will appear asking you to choose a dataset to display and then several parameters about how to slice the cube. First, let's select the merged initial stack file. The remaining parameters suggest displaying all traces. You may want to decimate the stack volume to display a subset of the lines but all traces on each line. For this first example just let the remaining parameters default and click on the APPLY button. One line should appear on the display. Repeat the process, but select to do a TRACE oriented display and then a time slice oriented display.
Basic available functionality includes the ability to rotate the image by holding down MB2 and moving the mouse, changing the vertical scaling, and display zoom capabilities.
Some screen customization can also be performed by the various dialog boxes under the Options pull down menu.
3. Click on one of the entities on the display. You can move from one Line, Crossline, or Timeslice to another by holding down MB1 and dragging the cursor up and down in the screen. The green directional buttons at the bottom of the screen can also be used to move through the available data.
4. Add a Velocity model to the display by choosing the Data ProMAX Velocity Animator pull down option. Select the velocity file that you imported from the ASCII file earlier. Choose the BOX option and use the Rainbow color scheme with a 50% transparency setting.
5. You can move the BOX edges by clicking on the side of the box and moving the cursor up and down the screen while holding down MB1.
6. Ask your instructor for more information as you play with the other options and buttons.
3D Mix
The following exercise applies a running mix to the initial stack in order to produce an additional stack volume that looks different from the input. These two stack volumes will then be used to demonstrate the stack volume comparison procedure.
Apply a 3D Running Mix to the Initial Stack
1. Build the following flow: This flow is very similar to the Merge 3D stacks flow. You may want to copy that flow to save some work.
3D Mix
Select dataset ---------------------- stack - initial (merged)
IN-LINE then X-LINE sort order--------------------------- *:*/
Trace mixing algorithm --------------------- Weighted Mix
Exclude hard zeros?------------------------------------- Yes
Trace weights for mixing ---------------------- 1.0,1.0,1.0
Number of traces to mix over------------------------------- 3
Type of trace edge taper ------------------ Fold edge back
Application mode for mixed traces--------------- Normal
Steer trace mix along a velocity dip?------------------ No
Number of applications ------------------------------------- 1
Re-apply mutes after mixing --------------------------- Yes
3D Mix is an input process that automatically sorts with a primary key of ILINE_NO and secondary key of XLINE_NO. Do not use Disk Data Input.
2. Select to process all inlines and all xlines (*:*/). Change the number of trace weights to 3 and use the weight values (1.0,1.0,1.0).
3. In Trace Display Label, label as initial (3dmix).
4. In Disk Data Output, output a stack with 3D Mix file.
5. Execute the flow.
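The weighted running mix parameterized above can be sketched as follows (illustrative Python, not the ProMAX algorithm; the fold-edge-back taper is approximated here by reflecting trace indices at the volume edges):

```python
# Illustrative sketch of a weighted running mix across adjacent traces
# (3 traces, weights 1.0,1.0,1.0 as in the flow above). Not ProMAX code;
# "fold edge back" is approximated by reflecting indices at the ends.

def running_mix(traces, weights=(1.0, 1.0, 1.0)):
    """traces: list of per-trace sample lists; returns mixed traces."""
    half = len(weights) // 2
    n = len(traces)
    out = []
    for i in range(n):
        mixed = [0.0] * len(traces[i])
        wsum = 0.0
        for k, w in enumerate(weights):
            j = i + k - half
            j = abs(j) if j < 0 else (2 * n - 2 - j if j >= n else j)  # fold back
            wsum += w
            for s, amp in enumerate(traces[j]):
                mixed[s] += w * amp
        out.append([m / wsum for m in mixed])
    return out
```

With equal weights, each output trace is simply the average of itself and its two neighbors, which is why the mixed volume looks smoother (and slightly lower resolution) than the input stack.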
Display the Mixed Stack
You have already built a flow to display some inlines, crosslines, and/or time slices from this volume.
1. Reselect your Display Inlines flow. Change the input filename and execute the flow.
2. Display some crosslines and time slices.
[Figure: DDI and Disk Data Insert (Merge) read the inlines of interest (1, 10, 30, 42) from each volume; Inline Sort then splits the merged ILINE_NO ensembles by input dataset sequence number (DS_SEQNO) and sorts by crossline]
3D Stack Comparisons
These exercises produce the following three flows:
One compares inline stacks from different 3D volumes.
One compares crossline stacks from different 3D volumes.
One compares time slices from different 3D volumes.
Compare Inlines from Two Stack Volumes
1. Build the flow as outlined on the next page.
2. In the Disk Data Input, input the initial stack after merge. Sort using 3D inline as primary and 3D crossline as secondary. Select inlines 1, 5-40(5) and 42 and all 3D crosslines (*).
3. In Disk Data Insert, read the initial stack with 3D mix and merge it, observing dataset boundaries but not forcing the datasets to merge. Select Merged for Insert mode. Use the same primary and secondary sort order as Disk Data Input and select the same 3D inlines.
4. Select the Inline Sort parameters. Select 3D inline number (ILINE_NO) for the primary sort key and Input dataset sequence number (DS_SEQNO) from the alternate list for the secondary sort key. Next, select 3D crossline number (XLINE_NO) for the tertiary sort key. Change the maximum number of traces in the ensemble to the maximum number of traces you have per inline (79). This sorts the data so that the same inlines from each volume are adjacent to each other for display. Finally, select Secondary for the Sort key to control end-of-ensemble so each inline appears as a separate ensemble.
5. Select Filter and AGC parameters. Use the default parameters.
6. In Trace Display, display all selected lines for comparison.
Select 20 ensembles per screen. Select ILINE and XLINE for the primary and secondary trace labeling header entries. After executing the flow, use the scrolling Zoom option to compare the lines.
7. You may elect to display 1 ensemble per screen and then save the screens for comparison.
8. This flow is an extension of the Display Inlines flow, built earlier. You may want to copy that flow to save yourself some work.
Inline Sort
PRIMARY sort key ---------- (ILINE_NO) 3D inline number
SECONDARY sort key ----- (DS_SEQNO) Input dataset sequence number
TERTIARY sort key----(XLINE_NO) 3D crossline number
Maximum traces per output ensemble -------------------- 79
Number of traces in buffer ------------------------------------160
Buffer type ----------------------------------------------------Memory
Sort key which controls End-of-Ensemble-----Secondary
Disk Data Input
Disk Data Insert
Inline Sort
Bandpass Filter
The default parameters will be adequate
Trace Display
Number of ENSEMBLES / screen ----------------------------20
Primary trace LABELING ---------(ILINE_NO) 3D inline no
Secondary trace LABELING ---(XLINE_NO) 3D crossline
>Trace Display<
Number of ENSEMBLES / screen ------------------------------1
Compare Crosslines from Two Stack Volumes
1. Build the flow as outlined on the next page.
2. In Disk Data Input, input the Stack with elevation statics. Sort using 3D xline as primary and 3D inline as secondary. Select crosslines 1, 10-70(10) and 79 and all 3D inlines (*).
3. In Disk Data Insert, read the Mix Stack. Select After for Insert mode. Use the same primary and secondary sort order as Disk Data Input and select the same 3D xlines.
4. Select Inline Sort parameters. Select 3D crossline number (XLINE_NO) for the primary sort key and Input dataset sequence number (DS_SEQNO) from the alternate list for the secondary sort key. Next, select 3D inline number (ILINE_NO) for the tertiary sort key. Change the maximum number of traces in the ensemble to the maximum number of traces you have per crossline (42). This sorts the data so
that the same crosslines from each volume are adjacent to each other for display. Finally, select Secondary for the Sort key to control end-of-ensemble so each crossline appears as a separate ensemble.
5. Select Filter and AGC parameters. Use the default parameters.
6. In Trace Display, display all selected crosslines for comparison. Select 18 ensembles per screen. Select XLINE and ILINE for the primary and secondary trace labeling header entries. After executing the flow, use the scrolling Zoom option to compare the lines.
7. You may elect to display 1 ensemble per screen and then save the screens for comparison.
This flow is an extension of the Display Crosslines flow, built earlier. You may want to copy that flow to save yourself some work.
Inline Sort
Bandpass Filter
Automatic Gain Control
Trace Display
>Trace Display<
Bandpass Filter
The default parameters will be adequate
Trace Display
Number of ENSEMBLES / screen ----------------------------18
Primary trace LABELING ------ (XLINE_NO) 3D crossline
Secondary trace LABELING ---------(ILINE_NO) 3D inline
>Trace Display<
Number of ENSEMBLES / screen ------------------------------1
Compare Time Slices from Two Stack Volumes
1. Edit the Display Time Slices flow. It is not as convenient to compare the time slices as with the inlines and crosslines. The most effective way is to execute the same flow twice
with two different input files and then size the windows as appropriate to compare the two volumes.
Trace Display
number of ENSEMBLES/screen ----------------------------- 16
Trace gap between ensembles----------------------------------3
Trace display MODE---------------------------------- Grayscale
Primary trace LABELING ------------------------------- slc_time
Secondary trace LABELING -----(ILINE_NO) 3D inline no
Chapter 7
This flow is virtually identical to the flow that we used to pick the first break mute and decon design gate. You may want to copy the pick parameter tables flow to save some work.
Trace Display
Number of ENSEMBLES/screen -------------------------------1
Trace Display MODE ---------------------------------------WT/VA
7. Zoom inside Trace Display from 0 to 600 ms on the traces for the first cable to get a more detailed look at the first breaks.
8. Use the First Break Picker Set Neural Network Parameters... pulldown menu. Select a phase for the picker and a signal-to-noise window length. (A peak and a 100 ms time window should work fine.)
9. Click OK.
10. From the FirstBreakPicker pulldown menu, select Create Training Data Set.... You will be prompted to enter a description name for a miscellaneous time gate to guide the NN picker. For example, you
can use a name similar to NN GATE to distinguish this time gate from any other miscellaneous time gates.
11. Click OK.
12. Choose to interpolate the picked times as a function of AOFFSET (the default). A window appears with two tables present. The highlighted, active table is called FB Training Data and the other refers to the above miscellaneous time gate. The active table contains example first arrival picks for the Network Training.
[Figure: NN FB Picking Open Tables]
13. Pick some example first breaks. These picks are used by the NN FB Pick Trainer as the example picks for learning. Manually pick some first breaks on this cable of this shot using MB1. (These picks are only retained in memory.) You do not need to pick the entire shot.
14. Use MB3 to snap the picks to the nearest peak, trough, or zero crossing. Snap to the same phase as was selected in the NN parameter window earlier (PEAK).
15. Select the time gate from the parameter table selection window.
This is not from the picking pull down menu. The gate is already open in the table selection window. This will help guide the neural network picker when it attempts to locate the first break picks. Make the top of the time gate parallel to the trend of the first breaks, and put it about two wavelengths prior to the actual first arrivals. Click MB3, New Layer, and add the bottom part of the gate. Again, position this gate to lie about two wavelengths after the first arrivals.
Note: The neural net algorithm is very sensitive to the slope of the top gate, so make sure your slope is parallel to the slope of your first breaks.
16. Use the FirstBreakPicker Neural Network Training... pull down menu and enter a name to assign to the weight table. For example, use a name like wt1.
17. Click OK.
The window will disappear for a few seconds. When the screen refreshes, the training will have completed.
18. Use the FirstBreakPicker Neural Net Recall Continuous Recall... pull down menu. This option will repick this shot record using the information in the selected weight file. You will be prompted to input a descriptive name for the generated picks, for example: These picks were made from interactive training in T.D. using wt1. You will also be asked for 3 numbers:
[Figure: NN Recall for Testing]
The maximum trace-to-trace static; a value of 20 is reasonable.
The offset to start picking at; a value of 1400 ft is good.
The number of traces to use for a line fit. Since our picks are not very good, a fairly large number, about 11, may help the picker.
19. Click OK. The program is now picking all of the traces on this shot based on what it learned during the training.
Do not worry if all the picks are not good; they won't be. This is a very difficult record to pick.
[Figure: NN Testing Picks]
You should also notice a new parameter table in the list of open tables with the same name as you assigned in the previous window.
20. Click the Traffic Light icon to go to the next shot ensemble.
21. From the First Break Picker pulldown menu, select FirstBreakPicker Neural Net Recall One time recall... The current shot ensemble is now picked. At this point you may elect to retrain the network using additional information. To do so, make sure that you select to edit the FB Training Data parameter table, make more training picks, and then retrain the network.
22. From the pulldown menus in Trace Display, select File Exit / Stop Flow to exit from the Trace Display and stop the flow.
23. Select Yes to save your picks. Most importantly, we need to save the weight tables. The Neural Network picks are useless, since we only picked a few shots and the training picks are only in memory. The time gate can be saved for future reference if required.
4. Some preprocessing may be necessary, such as trace edits, filtering, or scaling. Apply the same preprocessing that you applied to the input of the NN First Break Pick Training in Trace Display.
5. Select the NN First Break Picker parameters. Enter the FBWEIGHT file you generated in the previous flow and your first break time gate.
6. Set the value for number of traces in median line fit. This parameter establishes a local slope of the first break picks by specifying the number of previous first break picks to fit a line. (In the interactive picking in Trace Display we used 11 traces.)
7. Enter a starting offset for the picker. Enter an offset with good signal-to-noise which doesn't exhibit shingling of refractors. For this data, an offset value of about 1400 ft is adequate. (We used 1400 ft in the interactive picking in Trace Display.)
8. Execute the flow.
9. Once the picker has finished, globally QC your first break picks by using a database 3D point cloud display. First go to the line level and start DBTools by clicking Database. Then, pull down Database XDB Database Display. Click 3D and generate a PointCloud display from the TRC order using SIN, OFFSET, PICK001.
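The median line fit of step 6 can be sketched as follows (illustrative only; the actual picker logic is internal to ProMAX). The local slope of the first breaks is taken from the previous picks, and using a median rather than a least-squares fit keeps a few bad picks from skewing the prediction:

```python
# Hedged sketch of a robust "median line fit" for first-break slopes.
# Illustrative only -- not the ProMAX picker implementation.
from statistics import median

def predict_next_pick(offsets, picks):
    """Build a predictor for the next first-break time from the previous
    N picks (offsets and picks are equal-length lists ordered by offset).
    The slope is the median of all pairwise slopes; the intercept is the
    median residual at that slope."""
    slopes = [
        (picks[j] - picks[i]) / (offsets[j] - offsets[i])
        for i in range(len(picks))
        for j in range(i + 1, len(picks))
    ]
    m = median(slopes)                         # robust local slope
    b = median(p - m * x for x, p in zip(offsets, picks))
    return lambda offset: m * offset + b
```

The predicted time at the next offset can then be compared against the maximum trace-to-trace static (20 in the earlier exercise) to accept or reject a candidate pick.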
The display should illustrate whether your picks are consistent. The cursor tracker can be used to find the SIN numbers which have anomalous picks.
QC Plots of the FB Picks
10. You may also find an XYgraph to be useful. Click 3D and generate an XYgraph display of TRC: OFFSET, PICK0001, SIN
Using the First Break Pick Macro for QC
This exercise allows you to plot the first breaks on the trace data for QC and editing.
1. Modify your previous flow to build the following flow:
>Disk Data Input<
>Disk Data Insert<
>True Amplitude Recovery<
>Automatic Gain Control<
>Trace Display<
>NN First Break Picking<
First Break Pick Macro*
Select dataset ---------------------------- shots - with geom (1)
Trace read option ------------------------------------------------ Sort
primary trace header entry -----------------------------------SIN
secondary trace header entry --------------------------- NONE
sort order for dataset --------------------------------------------- */
Database parameter ------------ TRC NN_PICK PICK0001
Bulk shift static --------------------------------------------------- 25.
Specify LMO velocity --------------------------------- 1001:9000
Specify END time ----------------------------------------------- 200.
Number of display panels ---------------------------------------- 5
Trace scaling option ----------------------------------- Individual
------- remaining parameters can default ------
2. Toggle everything inactive and add the First Break Pick Macro.
3. Execute the flow.
4. Use the Picking Edit Header Values (first breaks)... pull down menu to select the FB_PICK (First Break pick time) entry from the trace header to overlay on the shot records. Edit the picks as required. (Don't forget to snap.)
ASCII Import of First Arrival Times
1. Open an XDB Database display window. Select ASCII Get to work with an existing ASCII file.
3. After the plot is complete, click Cancel to dismiss the import window. 4. Save the attribute to the database using the Database Save option.
QC the Imported Picks
This exercise quality controls the imported values.
1. Generate a 3D: Pointcloud plot of the imported picks using SIN as the X axis, OFFSET as the Y axis and plot NNPKDANG on the Z axis.
2. You may also find an XYGraph to be of some use. Generate a 3D: XYGraph: TRC, OFFSET, NNPKDANG, SIN
Chapter 8
Offset Range from First Break Pick Plot 1. From XDB generate a 3D: XYgraph: TRC : OFFSET, NNPKDANG, SIN and measure the offset value for the consistent refractor. Values of 600 to 3000 ft. seem reasonable.
Offset Range from Trace Display 1. Re-run the flow that we used to pick the parameter tables. In this flow we selected 5 shots and displayed them as a function of offset. From this display you can select an offset range over a consistent refractor.
Database Values This process writes several attributes to the SIN and SRF databases.
XPREDICT - predicted X coordinate
YPREDICT - predicted Y coordinate
DPREDICT - distance from the survey coordinate to the predicted coordinate
VPREDICT - velocity estimate over the offset range selected
APREDICT - azimuth from survey to predicted coordinate
TPREDICT - average delay time
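The distance and azimuth attributes above are simple functions of the surveyed and predicted coordinates. The following is an illustrative sketch of how such values relate (the function name and the example coordinates are hypothetical; the actual ProMAX computation may differ in convention):

```python
import math

def prediction_qc(x, y, x_pred, y_pred):
    """Distance and azimuth from a surveyed location to its predicted
    location -- illustrative analogues of DPREDICT / APREDICT."""
    dx, dy = x_pred - x, y_pred - y
    d = math.hypot(dx, dy)                          # DPREDICT analogue
    az = math.degrees(math.atan2(dx, dy)) % 360.0   # azimuth from north
    return d, az

# Hypothetical station: surveyed at (1000, 2000), predicted at (1030, 2040)
d, az = prediction_qc(1000.0, 2000.0, 1030.0, 2040.0)
# d = 50.0 ft, az ≈ 36.87 degrees
```

A large d for one station relative to its neighbors is exactly the anomaly the DPREDICT plots in the next exercise are designed to expose.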
Analyze the Results using Simple Plots 1. From the database, generate a simple plot of SIN:XPREDICT and XCOORD. 2. Repeat the plot for YPREDICT and YCOORD. 3. Repeat the previous plots from the SRF database. 4. You may also look at a graph of DPREDICT for the shots and receivers.
Analyze the Results using 3D Plots 1. From the database, plot a 3D: XYGraph: SIN: XPREDICT: YPREDICT: DPREDICT. This plot shows the predicted x,y coordinates for each shot location, color coded by the distance each was moved relative to the original x,y. You can open the color bar to see which have moved the farthest. 2. Using the Display Source Control Points Black pull down menu, overplot the source control points. 3. Using the Views pull down menu and the distance measure option from the Double-fold icon, you can identify which shots have been moved. Make the DPREDICT plot dominant by using the Views Transparent Sin based Posting of DPREDICT pull down menu, and then measure the distance from the Xpredict-Ypredict point to the original shots using MB3 from the double fold icon. Find the shot that is the correct distance from the new location. Make the Shot Post plot current, and identify the station location. 4. This same sequence can be used for the receivers. From these plots you can isolate candidate shots and receivers needing attention. You might want to review the first break picks on these shots and possibly re-examine them using Graphical Geometry QC.
If an x,y is actually incorrect in the database, you can correct the appropriate shot or receiver and then re-bin and reload the trace headers.
Chapter 9
Refraction Statics
This section covers the steps for calculating and applying 3D refraction statics. First break picks are required as input to this process. ProMAX offers two approaches to calculating refraction statics. The first approach, consisting of the 3D Refraction Statics Model, Inversion, and Computation processes, works well on smaller surveys with good first breaks, while the second, Refraction Statics Calculation, works well on large surveys, tolerates poorer first break picks, and does not require each shot to have first break picks. We first use the three-module 3D Refraction Statics Model, Inversion, and Computation approach to generate a solution, and then run the single Refraction Statics Calculation process to generate a second solution. We use the refractor offset file picked and saved using 3D Refraction Statics Model in the first exercise as input to Refraction Statics Calculation in the second exercise. Then, we save the original elevation statics using XDB Database Display, and apply our refraction statics using Datum Statics Apply. Finally, we will generate a stack with refraction statics applied and compare it to our elevation statics stack.
First break picks associated with that shot are displayed in the other window. You have the ability to capture the picks from several shots in the first break pick window. You can then draw straight line segments through the picks to identify the different refractor layers. This identifies the offset range associated with each refractor and provides initial estimates of refractor velocity and intercept time. You can repeat the procedure in other areas of the survey, creating an initial model. The output model values are stored in the database.

3D Refraction Statics Inversion The results of the 3D Refraction Statics Model routine can be used as an initial estimate for the 3D Refraction Statics Inversion process. Three different inversion methods are available: conjugate gradient, algebraic reconstruction technique (ART), and back projection. Conjugate gradient and ART are iterative methods that produce delay times and refractor velocities. Conjugate gradient holds all the picks in memory at once; therefore, it can run out of memory on large 3D surveys. However, it is faster and produces better results for noisy data than ART. ART holds only a portion of the first break picks in memory; therefore, it can be used for a 3D survey of any size. Back projection is a single-step inversion process that is good for estimating long wavelength variations in delay times. It generates a good updated initial model for the other two inversion methods.

3D Refraction Statics Computation The 3D Refraction Statics Computation process converts the delay times and refractor velocities generated by the inversion process into source and receiver static corrections. The static is computed as a vertical traveltime to a fixed elevation datum.
Back Projection --- single step; good as input to the other methods.
Algebraic Reconstruction Technique --- iterative; can be used for large datasets (uses little memory); slow; resolves short and long wavelength variations in delay times well for clean data.
Conjugate Gradient --- iterative; fast; resolves short and long wavelength variations in delay times well for clean and non-clean data.
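All three methods fit variants of the delay-time model, in which a first-break time is approximately the source delay plus the receiver delay plus the offset divided by the refractor velocity. The small least-squares sketch below illustrates that decomposition on a made-up toy survey (two sources, three receivers, synthetic times); it is not ProMAX code and ignores the iterative/memory trade-offs discussed above:

```python
import numpy as np

# Delay-time model:  t_ij ≈ d_s[i] + d_r[j] + offset_ij / v_refr
# Hypothetical toy survey with known answers:
d_s = np.array([0.010, 0.014])                 # true source delays (s)
d_r = np.array([0.008, 0.012, 0.009])          # true receiver delays (s)
v_refr = 9000.0                                # true refractor velocity (ft/s)
offsets = np.array([[600., 1400., 2200.],
                    [900., 2600., 1200.]])     # ft, within the refractor window

ns, nr = len(d_s), len(d_r)
A, b = [], []
for i in range(ns):
    for j in range(nr):
        b.append(d_s[i] + d_r[j] + offsets[i, j] / v_refr)  # synthetic pick
        row = np.zeros(ns + nr + 1)
        row[i], row[ns + j], row[-1] = 1.0, 1.0, offsets[i, j]
        A.append(row)

# Solve for delays and refractor slowness in a least-squares sense
m, *_ = np.linalg.lstsq(np.array(A), np.array(b), rcond=None)
v_est = 1.0 / m[-1]     # recovered refractor velocity
```

Note that the individual delays carry an inherent constant source/receiver trade-off; only their relative values and the refractor velocity are constrained by the picks, which is one reason the real inversions benefit from a good initial model.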
3D Refraction Statics Model 1. Create the initial model by moving the cursor in the plan view window. Click MB2 on a few shots in the 3D survey to select and capture first break picks that will be interpreted for velocity and offset ranges.
Note:
This project is very small. The recommended approach here is to select about five shots: one in each corner and one in the center.
2. In the first break pick window, look at the Options pulldown. There are three options that can be used to draw a straight line through a set of selected picks. The least squares option (L2) is a good
choice. Select Options L2. 3. Select the offset range of interest for one refractor by clicking MB1 at the near offset and MB2 at the far offset. A line will be drawn between them which is the least squares fit of all the points in the selected offset range.
Offset Range Picking
4. Save the offset range and velocity information by clicking MB3. Notice the large asterisk appearing in the left-hand screen at the center of gravity of all the previously selected shots. 5. Exit and save the information to the database using the File Save to Database and Exit pull down menu from either window. This builds a parameter table of offset range and initial refractor velocity vs. shot location.
QC the Delay Times This exercise helps you review the quality of the delay times.
1. Select 3D Refraction Statics Model parameters to QC the delay times and execute the flow:
3. Exit without saving the information to the database by using the File Exit without saving pull down menu from either window.
3D Refraction Statics Inversion Parameterization 1. Toggle off 3D Refraction Statics Model and toggle on 3D Refraction Statics Inversion:
for the refractor velocity. The program uses these estimated velocities as the initial refractor velocity in the inversion computation. 5. Select the default for refractor cell width and height, 1000 ft. by 1000 ft. The refractor cell width and height refer to the dimensions of a rectangular grid of cells. The process solves for the refractor velocity in this grid. Since the refractor velocity varies slowly compared to the weathering velocity, a cell size several times greater than the station spacing is recommended. 6. Set the maximum number of iterations to 75. This parameter provides an upper limit on iterations for the conjugate gradient and ART techniques. If the solutions have not converged (changing by less than one tenth of a ms) within this number of iterations, the program will stop. 7. You can run this using the interactive QC capabilities, but in general, running automatically should be adequate. Your instructor may demonstrate the interactive mode, but the final run will be done using batch mode. 8. Execute the flow. The source and receiver OPFs will contain the output of 3D Refraction Statics Inversion. You can manipulate this information before computing refraction statics, using the available database math tools. For example, you can smooth the refractor velocities. For 3D smoothing of the surfaces you would need to run the
3D Refraction Statics Computation Parameterization 1. Toggle 3D Refraction Statics Inversion off and the 3D Refraction Statics Computation on:
>3D Ref Statics Model*<
>3D Ref Statics Inversion*<
3D Ref Statics Computation*
Number of layers ------------------------------------------------------ 1
Compute V0 from UPHOLE data ----------------------------- Yes
Layer 1 source delays ------- SIN GEOMETRY DLY10000
Layer 1 receiver delays ----- SRF GEOMETRY DLY10000
Refr 1 velocity at sources ---- SIN GEOMETRY VEL10000
Refr 1 velocity at rcvrs ------- SRF GEOMETRY VEL10000
Final datum elevation ---------------------------------------- 1400.
Replacement velocity ----------------------------------------- 9000.
2. We are working with a 1-layer model and will compute the shallow (or V0) velocity from the uphole times. 3. Enter the inversion solution output delay times and velocities for the single refractor layer. 4. Make the elevation datum and replacement velocity the same as used in Apply Elevation Statics. Elevation = 1400 ft. --------- Velocity = 9000 ft./sec 5. Execute the flow. 6. View the static corrections in the database using XDB Database Display. These static corrections are labeled S_RFSTAT and R_RFSTAT in
the source and receiver ordered database files. You can plot the S_STATIC and the S_RFSTAT simultaneously to compare the two sets of statics. Similarly, you can plot R_STATIC and R_RFSTAT at the same time. Simple plots of S_STATIC vs. S_RFSTAT and R_STATIC vs. R_RFSTAT are also useful.
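The conversion from a station delay time to a static value can be sketched with the standard one-layer delay-time relations: the delay time gives a refractor depth, and the static replaces the slow weathering column with the replacement velocity down to the final datum. This is a textbook-style illustration with the tutorial's datum (1400 ft) and replacement velocity (9000 ft/s); the sign convention and exact formula used internally by ProMAX may differ:

```python
import math

def refraction_static(delay, v0, v_refr, elev, datum, v_repl):
    """Illustrative one-layer refraction static (a sketch, not the
    ProMAX formula; sign conventions vary between packages).
    delay : station delay time (s);  v0 : weathering velocity
    v_refr : refractor velocity;    elev : surface elevation
    datum : final datum elevation;  v_repl : replacement velocity"""
    # Refractor depth implied by the delay time
    z = delay * v0 * v_refr / math.sqrt(v_refr**2 - v0**2)
    # Strip the weathering layer, then shift to the datum at v_repl
    return (datum - (elev - z)) / v_repl - z / v0

# Hypothetical station: 20 ms delay, 2000 ft/s weathering layer
s = refraction_static(delay=0.020, v0=2000.0, v_refr=9000.0,
                      elev=1350.0, datum=1400.0, v_repl=9000.0)
# s ≈ -0.0104 s (about -10 ms)
```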
The principal disadvantages are that, since it is disk based, it is also slower, and that it does not offer any graphical displays to help you pick parameters or control quality. However, other ProMAX tools will help us perform these functions. The source and receiver static solutions are applied to the data in a later step, Datum Statics Apply. As a part of this exercise you will see that there are four ways to enter the refractor offset ranges. These are:
User type in - manually enter SIN and refractor offset values.
Mute file - input a Top Mute file picked in Trace Display.
Offset file - use the same offset table that was picked in the Refraction Statics Model flow (first exercise in the chapter).
Compute offsets - allow the software to compute offsets using a line segment fit algorithm.
In this exercise you will use first break pick times to calculate a near-surface model and traveltime corrections. This process calculates shot
and receiver refraction statics to shift to the final datum and updates the database. Results of this exercise are applied in the next exercise. 1. Edit the current refraction statics flow as follows:
2. Select Refraction Statics Calculation parameters. Select the first break time to use for the statics decomposition. These time picks will be in the TRC OPF and will normally be of the type NNPICK. Select the picks that we read in from the ASCII file, the NNPKDANG picks. Enter the number of layers to model; in this case use one layer. The identification number will be 1 for the first run through the process. The shooting geometry is 3D. There are six steps to Refraction Statics Calculation* described in the menu. They may all be turned on for refraction statics computation, or you may select to run one option at a time and view the output in the database. 3. INPUT V0 and REFRACTOR OFFSET. In this exercise we'll compute V0 from uphole times and choose the refractor OFFSET file that we picked when we performed the Modeling step. Three database entries are created in the SIN OPF:
SIN REFR_OFF OFFPSS11 --- Near offset of refractor.
SIN REFR_OFF OFFPSE11 --- Far offset of refractor.
SIN VELOCITY V0INIT11 ---- Weathering velocity.
4. COMPUTE REFRACTOR VELOCITIES. With this subheading turned on, a refractor velocity is calculated based on the first break times and the offset range from the previous step. Although you can smooth the velocity model in the menu, you may wish to look at your model in the database before smoothing. You could then either smooth in the database (useful to see immediate results of smoothing), or define a smoother in the menu. There is also an option to edit the first break picks automatically by setting a deviation from the median velocity described by the offsets. If any picks deviate more than the selected amount they will be killed and set to NULL in a new first break picks database file, TRC F_B_PICK FBPEDITX, where X is the run identification number. Only the good picks will be included in this file. Remember to
examine this edited file. Three database entries are created.
CDP VELOCITY VCINIT11 --- CDP velocity for 1st refractor.
SIN VELOCITY VSINIT11 ---- Source velocity for 1st refractor.
TRC F_B_PICK FBPEDIT1 ---- Edited first break pick file.
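The automatic pick-editing option described above can be illustrated with a short sketch: compute an apparent velocity for each pick, and NULL any pick whose apparent velocity strays too far from the median. The function, threshold, and sample picks below are hypothetical; this is the idea, not the ProMAX algorithm:

```python
import numpy as np

def edit_picks(offsets, times, max_dev=0.15):
    """Kill first-break picks whose apparent velocity deviates more
    than max_dev (fractional) from the median apparent velocity --
    a sketch of the automatic pick-editing idea."""
    v_app = np.asarray(offsets) / np.asarray(times)
    v_med = np.median(v_app)
    good = np.abs(v_app - v_med) <= max_dev * v_med
    return np.where(good, times, np.nan)   # NULL the bad picks

# Hypothetical picks: the third time is a clear mispick
offsets = np.array([600., 1200., 1800., 2400.])
times   = np.array([0.070, 0.135, 0.400, 0.270])
edited = edit_picks(offsets, times)        # third entry becomes NaN
```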
These database attributes may be edited. 5. COMPUTE DELAY TIMES. Once the CDP velocity is available, delay times for shots and receivers may be computed. This is done by iteration, starting with source delay time estimates, followed by receiver delay time estimates, and (optionally) finalized by CDP velocity updating. Values are not computed for any SIN, SRF, or CDP that does not meet the minimum fold (menu parameter) criterion. Once the decomposition is complete for each refractor, these missing values are interpolated based on X and Y. Three database entries are created.
SIN DELAYTIM SDELAY11 ---- Source delay times.
SRF DELAYTIM RDELAY11 --- Receiver delay times.
CDP VELOCITY VCFIN011 ---- Final CDP velocities.
6. COMPUTE REFRACTOR DEPTH MODEL. The depth model stage inputs delay times and refractor velocities in CDP, interpolates refractor velocity into SIN and SRF, and computes a depth model for sources and another for receivers. Optionally, the first refractor depth in SRF may be projected into CDP, smoothed, projected back into SRF, V0 recomputed in SRF based on the smoothed depths, the new V0 projected from SRF to SIN, and finally
SIN and SRF depth models computed. Six database entries are created.
SIN REFDEPTH SDEP_011 --- Source refractor depth.
SIN VELOCITY VSFIN011 --- Final source velocity for 1st refractor.
SIN VELOCITY V0FIN011 --- Final weathering velocity.
SRF REFDEPTH RDEP_011 -- Receiver refractor depth.
SRF VELOCITY VRFIN011 --- Final receiver velocity for 1st refractor.
SRF VELOCITY V0FIN011 --- Final weathering velocity.
7. COMPUTE SOURCE AND RECEIVER STATICS. The statics computation stage inputs refractor velocities and refractor depths, computes source and receiver statics to the FINAL datum of 1400 feet, and outputs static values. We have the choice of inputting a constant velocity or the bottom refractor velocity. For this exercise choose a user specified value of 9000 ft/sec. Two database entries are created.
SRF GEOMETRY RSTAT00X --- Receiver statics.
SIN GEOMETRY SSTAT00X ---- Source statics.
Copy S_STATIC to ORGSSTAT This exercise copies the S_STATIC. 1. From the Database window, select to produce a simple plot from the Database Get option. 2. Produce a plot of S_STATIC from the SIN database. 3. Select to generate a new plot by copying the S_STATIC plot. This can be done using the New Copy pull down menu.
4. Enter a name such as ORGSSTAT (Original Shot Static) and change the description if desired. 5. Click OK.
Copy R_STATIC to ORGRSTAT This exercise copies the R_STATIC. 1. From the Database window, select to produce a simple plot from the Database Get option. Produce a plot of R_STATIC from the SRF database. 2. Select to generate a new plot by New Copy copying the R_STATIC plot. Enter a name such as ORGRSTAT (Original Receiver Static) and change the description. 3. Save the new attribute using the Database Save option.
Datum Statics Apply The Datum Statics Apply program uses refraction statics computed by the 3D Ref Statics Computation* or Refraction Statics Calculation* processes, along with C_STATIC from the database, to compute NMO_STAT and FNL_STAT. Whenever statics are applied, these header entries are updated; the portion of NMO_STAT that is an integer multiple of the sample period is applied to the trace, and the remainder is written to NA_STAT to be applied later. It is important to note that the Datum Statics Apply process first checks to see if other statics have been applied to the traces by an earlier processing step. If statics have been applied, Datum Statics Apply first removes these statics, returning the traces to their original recorded time reference. Also, if previous statics contained any hand statics or shot delay corrections, these statics are also removed and should be reapplied. The Datum Statics Apply process will be incorporated into the stack flow in the next exercise in order to reduce the time spent reading the trace datasets.
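The whole-sample / sub-sample bookkeeping described above can be sketched as follows. The rounding direction is an assumption for illustration; ProMAX's actual convention may differ:

```python
def split_static(total_ms, sample_ms=8.0):
    """Split a static into the whole-sample part applied to the trace
    and the sub-sample remainder carried forward (an NA_STAT analogue).
    A sketch of the bookkeeping, not the ProMAX implementation."""
    n = int(total_ms / sample_ms)      # whole samples, truncated toward 0
    applied = n * sample_ms
    remainder = total_ms - applied     # left for later sub-sample application
    return applied, remainder

# Hypothetical example: NMO_STAT = -21.5 ms at an 8 ms sample period
applied, na_stat = split_static(-21.5)
# applied = -16.0 ms (two samples), na_stat = -5.5 ms
```

Deferring the sub-sample remainder avoids repeatedly interpolating the trace, which would smear its high frequencies.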
As previously discussed, we apply the datum statics with the refraction statics solution in the stack flow to avoid re-reading the shot data multiple times. Reproducing the traces and splitting the flow allows us to condition the data for residual statics while we avoid re-reading the shot data yet another time. The shot data to be used in Residual Statics is resampled to an 8 msec sample period and saved in 8 bit format to reduce its size and improve performance. Next, we stack the other copy of the traces and output the refraction statics stack. We will then repeat these steps for the second half of the dataset and in the next exercise merge the stacks and compare the merged refraction statics stack with the merged elevation statics stack.
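The 8-bit storage mentioned above trades amplitude precision for a 4x reduction in sample storage (4-byte floats to 1-byte integers), on top of whatever the resampling saves. A common way to do this is to quantize each trace against its own peak amplitude; the sketch below shows that idea with hypothetical values and is not the actual ProMAX disk format:

```python
import numpy as np

def to_int8(trace):
    """Quantize a float trace to 8-bit samples with one scalar per
    trace -- a sketch of the idea behind 8-bit storage."""
    peak = float(np.max(np.abs(trace)))
    scale = 127.0 / peak if peak > 0.0 else 1.0
    q = np.round(trace * scale).astype(np.int8)
    return q, 1.0 / scale          # keep the inverse scalar to reconstruct

trace = np.array([0.02, -0.5, 1.3, -1.1])   # hypothetical samples
q, inv = to_int8(trace)
approx = q * inv                             # reconstructed amplitudes
```

For correlation input the small quantization error is harmless, since only the pick of the correlation peak matters, not absolute amplitudes.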
1. Copy the initial stack flow for the first half of the data and edit it to create the following flow:
Disk Data Input
Datum Statics Apply
Trace Display Label
Disk Data Output
Normal Moveout Correction
Trace Muting
Reproduce Traces
SPLIT
Bandpass Filter
Automatic Gain Control
Resample/Desample
Disk Data Output
END_SPLIT
SPLIT
Stack 3D
Trace Display Label
Disk Data Output
END_SPLIT
Trace Muting
SELECT mute parameter file ----- post nmo mute (brute)
Reproduce Traces
Trace grouping to reproduce ----------------------- Ensembles
Total number of datasets ----------------------------------------- 2
SPLIT
Trace selection MODE ------------------------------------ Include
Primary trace header - REPEAT (REPEATED ensemble)
Secondary trace header ----------------------------------- NONE
SPECIFY trace list -------------------------------------------------- 1/
Bandpass Filter
The default parameters will be adequate
Resample/Desample
Output sample rate ------------------------------------------------- 8.
All other input variables ------------------------------ DEFAULT
END_SPLIT
6. In Normal Moveout Correction, select the velocity table that you imported and set the stretch mute percentage to 0.0, thus disabling it.
7. In Trace Muting, apply the post-NMO trace mute that you picked previously. 8. Make 2 copies of each ensemble using Reproduce Traces. 9. SPLIT the flow, passing the first REPEATED copy. 10. Bandpass filter and AGC the data using the default parameters. 11. Resample the data to an 8 ms sample period. 12. Output the conditioned, prestack dataset in 8 bit format. Use a dataset filename like shots-input to correlation. 13. End the split.
SPLIT
Trace selection MODE ------------------------------------ Include
Primary trace header - REPEAT (REPEATED ensemble)
Secondary trace header ----------------------------------- NONE
SPECIFY trace list -------------------------------------------------- 2/
Stack 3D
Enter name of host ----------------------------
Operating system of host ----------------- (as per instructor)
Restart with an existing stack? ------------------------------ No
Minimum in-line number ------------------------------------------ 1
Maximum in-line number ---------------------------------------- 26
Minimum x-line number -------------------------------------------- 1
Maximum x-line number ----------------------------------------- 79
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Apply final datum statics after stack? ------------------- Yes
Size of input trace memory buffer (MB) ---------------------- 4
Size of stack trace memory buffer (MB) --------------------- 4
END_SPLIT
14. Split the flow again and pass the second REPEATED copy through the stack path. 15. In Stack 3D, set the minimum and maximum lines to be contributed to by this input dataset. (This should already be set from the original flow to lines 1 - 26.) 16. In Trace Display Label, identify this as the refraction statics stack for the first half of the project.
17. In Disk Data Output, output a stack with refraction statics for the first half of the project. 18. End the SPLIT. 19. Execute the flow.
Run Stack3D on the Second Superswath 1. Copy the flow for the first half stack to a new flow for the second half:
Datum Statics Apply
Normal Moveout Correction
Trace Muting
Reproduce Traces
SPLIT
Bandpass Filter
Automatic Gain Control
Resample/Desample
Disk Data Output
Output Dataset Filename --temp-input to correlation (2)
End Split
2. Change the Disk Data Input file name to the shot organized file for the second half of the project.
3. Change the first Disk Data Output to write a file of shot organized data after application of the refraction statics for the second half of the project. 4. Change the file name in the second Disk Data Output to reflect that the data is the input to the correlation process for the second half of the project. 5. Edit the Stack 3D parameters to reflect the inlines that are contributed to by this input dataset. In this case the second file contributes to lines 16 through 42. 6. Change the Disk Data Output file name to a refraction statics stack file for the second half of the project. 7. Execute the flow.
Stack Merge 3D
Enter name of host ----------------------------
Operating system of host ----------------- (as per instructor)
Restart with an existing stack? ------------------------------ No
Minimum in-line number ------------------------------------------ 1
Maximum in-line number ---------------------------------------- 42
Maximum x-line number ----------------------------------------- 79
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Size of input trace memory buffer (MB) ---------------------- 4
Size of stack trace memory buffer (MB) --------------------- 4
Chapter 10
Trace Statistics
Utilizing the time information from previously picked first breaks, Trace Statistics calculates up to eight different statistics for each input trace. These statistics include:
TRC_AMPL: average trace energy
FB_AMPL: average first break energy
PRE_FB_A: average pre-first-break energy
PRE_FB_F: average pre-first-break frequency
T_SPIKES: spikiness: the ratio of the maximum magnitude sample to the trace signal amplitude
FRQ_PEAK: dominant frequency based on a count of zero crossings within a signal window
FRQ_WIDE: frequency deviation based on statistical scatter of frequency estimates
AMPDECAY: estimated trace energy decay rate in dB
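Two of these statistics are easy to illustrate. The sketch below computes a spikiness ratio and a zero-crossing dominant frequency for a synthetic 30 Hz trace; the definitions are simplified stand-ins, and the exact ProMAX formulas (windows, first-break gating) may differ:

```python
import numpy as np

def trace_stats(trace, dt=0.004):
    """Simplified T_SPIKES and FRQ_PEAK analogues for one trace."""
    rms = np.sqrt(np.mean(trace**2))
    spikiness = np.max(np.abs(trace)) / rms          # T_SPIKES analogue
    # Dominant frequency from zero crossings: each crossing ~ half period
    crossings = np.sum(np.diff(np.sign(trace)) != 0)
    freq_peak = crossings / (2.0 * len(trace) * dt)  # FRQ_PEAK analogue
    return spikiness, freq_peak

t = np.arange(0.0, 1.0, 0.004)
trace = np.sin(2.0 * np.pi * 30.0 * t)   # clean 30 Hz test signal
spik, f = trace_stats(trace)             # spik near sqrt(2), f near 30 Hz
```

A clean sine gives a spikiness near sqrt(2); a trace with one large spike pushes the ratio much higher, which is what makes the statistic useful for edits.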
In this exercise you will try to identify bad traces using Trace Statistics. Based on the values computed for each trace, you will edit the data volume to remove abnormal traces.
Database/Header Transfer
Direction of transfer ------ Load TO trace headers FROM
Number of parameters --------------------------------------------- 1
First database parameter --- TRC NN_PICK NNPKDANG
First header entry -------- (FB_PICK) First break pick time
Trace Statistics
Types of trace statistics to compute ----------- TRC_AMPL FB_AMPL PRE_FB_A PRE_FB_F T_SPIKES FRQ_PEAK FRQ_WIDE AMPDECAY
Use first breaks or time gate ------------------ FIRST BREAK
Form of statistic output ------ DATABASE and HEADERS
Ensemble Statistics
Seismic properties of the traces may be more pronounced when viewed from an ensemble based perspective. In this exercise, you will calculate ensemble statistics in a few different domains and find the domain which displays potential problems best. 1. Edit the existing flow to calculate Ensemble Statistics to possibly identify anomalous data in the SIN and/or SRF domains.
>Disk Data Input<
>Disk Data Insert<
>Database/Header Transfer<
>Trace Statistics<
>Disk Data Output<
Ensemble Statistics*
Ensemble Statistics*
Ensemble Statistics*
Ensemble Statistics*
2. Input a trace statistic, such as TRC_AMPL or FRQ_PEAK, to help identify and describe the noisy traces. 3. Highlight the SRF, SIN, and CDP domains to calculate average/mean statistics. 4. Repeat for other statistics. Only one statistic can be averaged per program execution. 5. Execute the flow. 6. Exit the flow and select Database from the global parameter menu. Select Display Get, and view the ensemble statistic information in the SIN, SRF, and CDP OPFs.
Analyze the Results 1. Exit the flow and select Database from the global parameter menu. Using DBTools, go to the TRC order and double click on some of the statistical attributes. Note the range of values for each statistic for which you might elect to kill the traces. 2. Build a flow which will use trace statistical entries to kill the bad traces as outlined on the next page. 3. In Database/Header Transfer, select to move data attributes from the TRC OPF to the trace headers. For the trace header entry, select user defined and use the same name as the database attribute. You can transfer more than one set of values in the same step. 4. Select Trace Kill/Reverse parameters to kill the noisy data. For example, if you use FRQ_PEAK, select FRQ_PEAK as the primary trace header and in the trace list include 65-80. Select one primary trace header for each Trace Kill/Reverse.
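The kill-by-header-range logic in step 4 amounts to a simple mask over the statistic values. The sketch below mirrors the FRQ_PEAK 65-80 example with made-up header values; it is illustrative only, not ProMAX behavior:

```python
import numpy as np

# Hypothetical per-trace FRQ_PEAK values pulled from trace headers
frq_peak = np.array([35.0, 72.0, 40.0, 68.0, 33.0])

# Mirror of "include 65-80" in Trace Kill/Reverse: traces whose
# FRQ_PEAK falls in the kill range are zeroed (killed)
kill = (frq_peak >= 65.0) & (frq_peak <= 80.0)

traces = np.ones((5, 100))     # dummy trace data, 5 traces of 100 samples
traces[kill] = 0.0             # killed traces become dead (all zeros)
```

Stacking one Trace Kill/Reverse per statistic, as the exercise does, is equivalent to OR-ing several such masks together.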
Reproduce Traces
Trace groupings to reproduce -------------------- Ensembles
Total number of datasets ----------------------------------------- 2
IF
Trace selection MODE ------------------------------------ Include
Primary trace header --------- REPEATED ensemble copy
Secondary trace header ------------------------------------- NONE
SPECIFY trace list ---------------------------------------------------- 1
Trace Kill/Reverse
Trace Kill/Reverse
Trace Display Label
ELSEIF
Trace Display Label
ENDIF
Trace Display
Disk Data Input
Disk Data Insert
Reproduce Traces
IF
Trace Kill/Reverse
Trace editing MODE --------------------------------------------- Kill
Get edits from DATABASE ------------------------------------- No
PRIMARY edit list header word --------------------- t_spikes
SECONDARY edit list header word ------------------- NONE
SPECIFY traces to be edited -------------------------- 100-300
Trace Kill/Reverse
Trace editing MODE --------------------------------------------- Kill
Get edits from DATABASE ------------------------------------- No
PRIMARY edit list header word --------------------- trc_ampl
SECONDARY edit list header word ------------------- NONE
SPECIFY traces to edit --------- 200000000-1500000000
ELSEIF
Trace selection MODE ------------------------------------ Include
Primary trace header --------- REPEATED ensemble copy
Secondary trace header ------------------------------------- NONE
SPECIFY trace list ---------------------------------------------------- 2
View the edited data with Trace Display. Check that the noisy data has been edited and you have not edited too many near offset traces.
Example of shot record before and after edits The header plot of TRC_TYPE is a quick visual aid to tell you which traces have been edited. All traces with TRC_TYPE = 2 were killed by the Trace Kill process.
NOTE: In this example, the values for the editing were exaggerated for demonstration purposes.
Trace Display
Trace scaling option ------------------------------ Entire Screen
2. Read in the combined shots file using SORT mode and turning IDA on. 3. Display 1 shot per screen, computing one global scalar for all traces. This may make it easier to visualize anomalous traces.
Start DBTools and generate two displays 1. Exit from the flow and start DBTools by clicking on the Database option from the global menu bar. 2. Generate a shot location map using the View Predefined Source Fold Map option. Change to a monochrome color scheme and select a color from the color palette.
TRC_AMPL, PRE_FB_A, T_SPIKES, AMPDECAY, and FRQ_WIDE as the axes, color code, and histogram, respectively.
4. Re-orient the display using the manual axis rotation icon so that the T_SPIKES axis is coming directly out of the screen.
In this configuration you can see a relationship between the traces that have both high amplitudes before the first arrivals and also a high overall RMS amplitude. These traces can be isolated using the polygonal selection icon. 5. Select a polygon that encompasses the anomalously high amplitude traces on each axis simultaneously. Remember to double click to close the polygon. 6. Project these points to the SIN domain and see that the shots that contain these traces are highlighted on the shot location map.
7. From the shot location map, PD (Pointing Dispatcher) only the shots of interest to the Trace Display using the bow and arrow icon. Page through the shots and look for the anomalous traces by combining the header plot with a look at the trace amplitudes.
8. Open a trace kill table in Trace Display and start adding traces to be killed to the kill list. 9. The DBTools displays can be re-oriented to continue the analysis by cross plotting any combination of attributes.
Chapter 11
3D Residual Statics
There are two external model correlation autostatics processes: Cross Correlation Sum and Gauss-Seidel. Both of these processes compute surface consistent statics based on trace correlations with externally built stack model traces. These trace correlations are generated by External Model Correlation. The correlation pick times are written to the TRC database. The individual cross correlation traces are written to an output data file. As an alternative to the Cross Correlation Sum and Gauss-Seidel surface consistent statics decomposers, the correlation pick times can be applied directly to the data as trim statics. A typical 3D reflection residual statics job flow consists of five steps:
model building
correlation gate picking
correlation computation
statics computation
statics application
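The correlation step at the heart of this flow can be sketched simply: slide each trace against the external model trace and record the lag of the best match. The function below is a toy trim-static picker on synthetic data; the real processes sum these correlations surface-consistently rather than applying raw per-trace lags, and the names and shift range here are hypothetical:

```python
import numpy as np

def trim_static(trace, model, max_shift=12):
    """Pick the lag (in samples) that best aligns a trace with an
    external model trace -- the correlation idea behind the residual
    statics flow, not the ProMAX implementation."""
    best_lag, best_val = 0, -np.inf
    for lag in range(-max_shift, max_shift + 1):
        val = float(np.dot(np.roll(trace, lag), model))
        if val > best_val:
            best_lag, best_val = lag, val
    return best_lag

model = np.sin(np.linspace(0.0, 6.0 * np.pi, 200))   # synthetic pilot
trace = np.roll(model, -5)          # trace arrives 5 samples early
lag = trim_static(trace, model)     # → 5: shift trace later to align
```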
F-XY DECON
The default parameters will be adequate
Bandpass Filter
The default parameters will be adequate
Resample/Desample
Output sample rate ------------------------------------------------- 8.
All other input variables ------------------------------ DEFAULT
3. In F-XY Decon, use the default values for all parameters. 4. Add a Bandpass Filter and AGC to the flow. In this case, we want to add some cosmetic poststack processing so the frequency content and amplitude of the model traces match those of the input prestack data. The default parameters will be adequate. 5. Add the Resample process to the flow. It is fairly common to band limit the data for correlation purposes in the residual statics sequence. Since the high frequencies have been limited, you can usually resample the data to an 8 msec sample rate without losing any information. 6. Add a label to the trace headers identifying this data as being the output from FXY decon. 7. In Disk Data Output, enter an output dataset name describing this data as the output from FXY decon, built specifically as an external model. Since the data is scaled, we should be able to store the amplitudes as 8 bit numbers to conserve disk space. 8. Display a few lines to QC the model dataset using your Display Inlines flow. Your Inline comparison flow will not work since there is a difference in sample periods.
Randomly picked Autostatics Horizon points
(figure: picked horizon locations plotted by inline number and xline number)

Given that you have picked horizon times at these locations, only the area that falls within the outer polygon defined by these points will be correlated.
Area that would be Correlated
(figure: correlated area plotted by inline number and xline number)

Traces falling within the limits of the outer polygon, defined by the picked locations, are correlated. Therefore, it is critical for all CDPs to fall within the polygon. This is accomplished by picking the first and last CDPs on all picked lines and by picking the first and last lines.
Absolute minimum requirements
(figure: minimum picked locations plotted by inline number and xline number)

Given that autostatics horizon control points were picked at the beginning and end of a series of lines, the entire CDP range is surrounded by the outer polygon. In this case, all CDPs will be correlated. The absolute minimum number of points that must be picked is 4 (four): the first and last CDPs on the first and last lines. Additional lines can be picked as long as the first and last points on any line are picked. Additional points can be added on the picked lines as required to define the regional geology.
This may be viewed in 3D also. Given a case where only 4 points are picked, only the CDPs that fall inside the diamond will be correlated.
(figure: diamond-shaped correlated area plotted by inline number and xline number)
Pick the Autostatics Correlation Gate

1. Build the following flow. This flow is similar to the Display Inlines flow; you may elect to copy that flow to save some work.
Bandpass Filter
The default parameters will be adequate
Trace Display
Number of ENSEMBLES / screen ----------------------------10
Primary trace LABELING ---------------------------------- NONE
Secondary trace LABELING ------------ ILINE_NO (3d inline)
MODE of Secondary trace annotation ------------ Different

2. In Disk Data Input, input your most recent stack dataset. Sort the input to read all crosslines on lines 1,10-30(10) and 42. You will pick gate center information on the first, last, and every 10th line in between.
3. Apply a filter, and scale for cosmetics.
4. In Trace Display, plot 10 ensembles and annotate Inline as Primary and Xline as Secondary.
5. Open a parameter table to pick an autostatics horizon. You will select a gate width (300 ms is suggested) and a smash (11 is the default; it is irrelevant since we are using an external model).
You might want to think about a naming convention for these tables, since you may elect to test a few different ones. One convention you might use is a gate number followed by a description: 01 - 300 wide centered at 650.

6. Pick a few points on each line, making sure you get the first and last CDP on each line.
7. You may want to center the gate on the event at about 650 ms on all displayed lines. Remember, these gates are discrete CDP/time values. You do not want a pick time at every CDP, nor do you want to snap the picks.
8. Exit Trace Display and save the table using the File Exit/Stop Flow pull down menu.
3. In External Model Correlation, input your model stack and autostatics horizon file. You can use the default values for most of the remaining parameters. We will, however, manually specify a NEW value for the output pick times and use a 4 digit code similar to FX11, which means FX model, Gate 1, first half.
4. In Disk Data Output, enter an output dataset file. Disk Data Output is needed to create a dataset that contains the actual cross correlation traces for EMC Xcor Sum Autostatics. There will be one correlation trace for every trace in your prestack dataset falling within the bounds of the picked CDP locations.
QC the Picks from the First Half

1. Make a simple plot using XDB of the TRMFX11 attribute from the TRC database.
Picks after the first correlation

Notice that picks only exist for the first half of the traces.
Correlate the Second Half

1. Copy the first half correlation flow to build the following flow:
QC the Picks from the Second Half

1. Make a simple plot using XDB of the TRMFX12 attribute from the TRC database.
Picks after the second correlation

Notice that picks only exist for the second half of the traces.
Merge the Two Attributes into a Single Continuous Set

1. Build the following flow:
3. The second merge combines the QLT values together.
4. Execute the flow.
QC the Merged Values

Again, this is NOT something you would do with millions of traces.

5. Open the Database and produce a simple plot of the TRMFXY01 values by double clicking the attribute. You should see a continuous plot with values for all traces.
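Conceptually, the merge of the two half-survey attributes into one continuous set is a union of two pick-time sets keyed by trace number, where any given trace has a pick in only one half. A toy sketch (the dictionary layout and values are ours, purely illustrative):

```python
def merge_attributes(first_half, second_half):
    """Merge two trace-attribute dictionaries (trace_no -> pick time, ms).
    Each trace should carry a pick in exactly one input; if both halves
    had a pick for the same trace, the second half would win here."""
    merged = dict(first_half)
    merged.update(second_half)
    return merged

fx11 = {1: 652.0, 2: 648.0}  # picks from the first-half correlation
fx12 = {3: 655.0, 4: 650.0}  # picks from the second-half correlation
fxy01 = merge_attributes(fx11, fx12)
print(sorted(fxy01))  # [1, 2, 3, 4] -- continuous coverage
```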
Merged Picks
Expand the Flow to Run the Xcor Sum Decomposition

The Xcor Sum decomposition requires the trace files that contain the correlation traces. In our case, we have two separate files that need to be combined into a single file before decomposition.
You can use the default values for the rest of the parameters, but look up any parameters you do not understand in the help file.

3. We will output a NEW attribute to the database and give it a name that reflects these statics as having originated from the FXY decon model and correlation gate number 1 (FX01).
4. Execute the flow.
View the Results Using XDB

5. View the SRF and SIN database entries created by Gauss-Seidel, called SGEMFX01.
6. View the SRF and SIN database entries created by Xcor-Sum, called SPEMFX01.

You can do both simple graphs and/or you may elect to generate some color contour plots using the Field option in the 3D display functions.
Disk Data Input
Apply Residual Statics
Reproduce Traces
SPLIT
Normal Moveout Correction
Trace Muting
Stack 3D
Trace Display Label
Disk Data Output
END_SPLIT
SPLIT
Bandpass Filter
Automatic Gain Control
Resample / Desample
Disk Data Output
END_SPLIT
This is a complex flow. The basic steps can be summarized as follows:

- Apply the residual statics.
- Apply Normal Moveout and the post NMO mute, and produce a stack of the data with the residual statics applied.
- Select only the lines for velocity analysis.
- Apply pre-velocity analysis processing and output a temporary dataset built specifically for velocity analysis.
- Read the first half shot file with preprocessing applied.
Reproduce Traces
Trace grouping to reproduce ----------------------- Ensembles
Total number of datasets -------------------------------------------2

1. Input the shot organized dataset for the first half of the project that has the refraction statics applied.
2. Apply the residual statics that were generated by the Gauss-Seidel External Model Decomposition process. These are the SGEMFX01 values for the shots and receivers.
3. Use Reproduce Traces to make two copies of each ensemble. One copy will be stacked. The other copy will be processed in preparation for velocity analysis.
SPLIT
Trace selection MODE ------------------------------------ Include
Primary trace header ---------- REPEAT (Repeated ensemble)
Secondary trace header ----------------------------------- NONE
SPECIFY trace list --------------------------------------------------1/
Trace Muting
SELECT mute parameter file ----- post nmo mute (brute)
Stack 3D
Minimum in-line number ------------------------------------------1
Maximum in-line number ----------------------------------------26
Minimum x-line number --------------------------------------------1
Maximum x-line number -----------------------------------------79
END_SPLIT
4. Split the flow and pass the first REPEATed copy through the stack path.
5. In Stack 3D, set the minimum and maximum lines to be contributed to by this input dataset. (This should already be set from the original flow to lines 1 - 26.)
6. In Trace Display Label, identify this as the residual statics stack on the first half of the project.
7. In Disk Data Output, output a stack with residual statics for the first half of the project.
Landmark ProMAX 3D Seismic Processing and Analysis 11-21
Split the Flow and Output the Data for Velocity Analysis

Our first task here is to decide on the velocity locations. At this point we are not exactly sure what we will be using for supergathers, so we will gather a couple of lines on either side of the velocity lines.
Proposed CDP Center Positions For Analysis
(figure: survey map, xline number 1-79 vs. inline number 1-42, with analysis centers marked at inlines 5, 15, 25, and 35)

We will choose to output the velocity lines and 2 lines on either side as well. This will yield lines 3-7, 13-17, 23-27 and 33-37.
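The line list quoted above follows directly from taking each velocity line plus two lines on either side; a quick illustrative check (the helper function is ours):

```python
def gather_lines(centers, halfwidth=2):
    """Inline numbers to output: each velocity line plus
    `halfwidth` lines on either side."""
    lines = []
    for c in centers:
        lines.extend(range(c - halfwidth, c + halfwidth + 1))
    return lines

print(gather_lines([5, 15, 25, 35]))
# [3, 4, 5, 6, 7, 13, 14, 15, 16, 17, 23, 24, 25, 26, 27, 33, 34, 35, 36, 37]
```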
SPLIT
Trace selection MODE ------------------------------------ Include
Primary trace header ---------- REPEAT (REPEATED ensemble)
Secondary trace header -------- ILINE_NO (3d inline number)
SPECIFY trace list ---------------2:3-7,13-17,23-27,33-37/
Bandpass Filter
The default parameters will be adequate
Resample/Desample
Output sample rate -------------------------------------------------8.
All other input variables ------------------------------ DEFAULT
END_SPLIT
1. SPLIT the flow, passing the second REPEAT copy. In this SPLIT we will also select to pass only those traces that contribute to our proposed velocity lines. At this point we are not sure exactly which lines we want, so we will gather 5 lines around each of our velocity lines (5, 15, 25 and 35).
2. Apply a bandpass filter and AGC to the data.
3. Resample the data to an 8 ms sample period.
4. Output a pre-stack dataset, with Bandpass Filter and AGC applied, that is ready for velocity analysis in 8 bit format.
5. End the split.
6. Execute the flow.
Run Stack 3D on the Other Superswath

1. Copy the flow for the first half stack to a new flow for the second half:
Apply Residual Statics
Reproduce Traces
SPLIT
Normal Moveout Correction
Trace Muting
Stack 3D
Minimum in-line number ----------------------------------------16
Maximum in-line number ----------------------------------------42
Minimum x-line number --------------------------------------------1
Maximum x-line number -----------------------------------------79
END_SPLIT
SPLIT
Bandpass Filter
Automatic Gain Control
Resample / Desample
Disk Data Output
Output Dataset Filename ------temp - input to velanal(2)
END_SPLIT
2. Change the Disk Data Input file name to the shot-organized file for the second half of the project.
3. Edit the Stack 3D parameters to reflect the inlines that are contributed to by this input dataset. In this case, the second file contributes to lines 16 through 42.
4. Change the Disk Data Output file name to be a file for input to velocity analysis for the second half of the project.
5. Execute the flow.
Stack Merge 3D
Enter name of host ---------------------------------------------------
Operating system of host ----------------- (as per instructor)
Restart with an existing stack? ------------------------------ No
Minimum in-line number ------------------------------------------1
Maximum in-line number ----------------------------------------42
Minimum x-line number --------------------------------------------1
Maximum x-line number -----------------------------------------79
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Size of input trace memory buffer (MB) ---------------------- 4
Size of stack trace memory buffer (MB) ---------------------- 4
Build the Eigen Stack External Model

In this exercise you will build an Eigen stack model to use as input to External Model Correlation.
Inline Sort
Select new PRIMARY sort key ----------------------------- CDP
Select new SECONDARY sort key ------------------ OFFSET
Maximum traces per output ensemble ---------------------16
Number of traces in buffer ---------------------------------- 7000
Eigen Stack
Mode --------------------------------------------- Output Eigenstack
Get matrix design gates from DATABASE --------------- No
SELECT Primary gate header word --------------------- CDP
SELECT Secondary gate header word --------------- NONE
SPECIFY matrix gate parameters ----------- 1:400-1200/
Type of Computations? -------------------------------------- Real
Horizontal window width -----------------------------------------3
Number of iterations ------------------------------------------------0
Apply final datum statics after stack? ------------------- Yes
These files have NMO, mute, bandpass filter, and scaling applied, and have been resampled to an 8 millisecond sample period.

8. Enter the Eigen Stack parameters as shown.
9. In Trace Display Label, apply a header label.
10. In Disk Data Output, enter an output dataset.
11. Display a few lines to QC the model dataset using a Display Inlines or Inline comparison flow.
Chapter 12
(figure: survey map, crossline number vs. inline number, showing the velocity analysis locations)
Supergather Generation and Offset Distribution QC

In this exercise, we will build supergathers using the 3D Supergather Formation* process. Here we will QC the supergathers for trace content and offset distribution prior to running the velocity analysis. We will compare offset distributions for a 3x3 spatial supergather, a 5x5 spatial supergather, and a 9x1 inline supergather.
3D Supergather Formation*
Read data from other lines/surveys? --------------------- No
Select dataset ---------------------------------------------------- temp
Presort in memory or on disk? ----------------------- memory
Maximum CDP fold ------------------------------------------------16
Minimum center inline number --------------------------------- 5
Maximum center inline number ------------------------------ 35
Inline increment -----------------------------------------------------10
Inlines to combine ----------------------------------------------------3
Minimum center cross line number ------------------------- 20
Maximum center cross line number ------------------------ 60
Crossline increment -----------------------------------------------20
Crosslines to combine ----------------------------------------------3
Inline Sort
PRIMARY sort key ---------------------------------------- SG_CDP
SECONDARY sort key ----------------------------------- OFFSET
Maximum number of traces per output ----------------- 300
Trace Display
Number of ENSEMBLES per screen ------------------------ 15
Header Plot Parameter ---------------------------------- OFFSET
Primary trace LABELING ------------- ILINE_NO (3d inline)
Secondary trace LABELING ---------- XLINE_NO (3d xline)
2. Use Disk Data Input, Disk Data Insert and Disk Data Output to combine the two files that were built as input to velocity analysis into a single temporary file.
3. Select the 3D Supergather Formation* parameters. Although 3D Supergather Formation* is labeled as a stand-alone process (*), it does not output any data within the module. Therefore an output tool such as Disk Data Output, or a display tool such as Trace Display or Velocity Analysis, should follow the supergather formation in the same flow. The input file is the preprocessed file that we made specifically for the velocity analysis, in 8 bit format with filter and AGC applied. The Maximum CDP fold is 16. Define your supergathers to be centered at inlines 5, 15, 25, and 35 and at crosslines 20, 40, and 60. Combine 3 CDPs along inlines and crosslines. This will initially define a 3 by 3 box around the center CDP. Each spatial supergather generated by 3D Supergather Formation* is assigned a different value of the new header word, SG_CDP.
4. Sort the data into SG_CDP ensembles with a secondary sort of OFFSET. This will allow us to alter the display of the supergathers so that we can see the supergather effect in linear offset space. There will be approximately 300 traces per output ensemble after the supergather bin sort.
5. Display the supergathers for QC. Display 15 ensembles with ILINE_NO as the primary annotation and XLINE_NO as the secondary annotation. This will give a good idea of which inlines and crosslines belong to each SG_CDP ensemble. Use the Header plot option in the menu to plot the trace offsets above the supergathers. You should look for linearity of the offsets.
6. Experiment with different supergather parameters and see if any work better than others to improve the offset distributions.
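The spatial supergather logic, collecting all traces whose CDP falls inside a box around each center and tagging them with a common SG_CDP, can be sketched as follows. This is our own illustration, not the internals of 3D Supergather Formation*:

```python
def form_supergathers(traces, centers, inlines_comb, xlines_comb):
    """traces: list of (iline, xline, trace_id).
    centers: list of (center_iline, center_xline).
    Returns {sg_cdp: [trace_id, ...]}, numbering the centers from 1."""
    di, dx = inlines_comb // 2, xlines_comb // 2
    gathers = {}
    for sg_cdp, (ci, cx) in enumerate(centers, start=1):
        members = [tid for (il, xl, tid) in traces
                   if abs(il - ci) <= di and abs(xl - cx) <= dx]
        gathers[sg_cdp] = members
    return gathers

# A 3x3 box around center (5, 20) collects the 9 surrounding CDPs
traces = [(il, xl, (il, xl)) for il in range(3, 8) for xl in range(18, 23)]
sg = form_supergathers(traces, [(5, 20)], 3, 3)
print(len(sg[1]))  # 9
```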
Try using 1 inline and 9 consecutive CDPs at each location:

Minimum Line = 5
Maximum Line = 35
Increment = 10
Inlines to combine = 1
Minimum Xline = 20
Maximum Xline = 60
Increment = 20
Crosslines to Combine = 9
Run the Precompute

1. Expand the Velocity Analysis flow and edit it to include Velocity Analysis Precompute and Disk Data Output.
3D Supergather Formation*
Select dataset ---------------------------------------------------- temp
Use other parameters from before for lines and xlines
Inlines to combine ----------------------------------------------------1
Cross lines to combine ---------------------------------------------9
2. Parameterize the supergather formation macro. Define your supergathers to be centered at inlines 5, 15, 25, and 35 and at crosslines 20, 40, and 60 (minimum inline=5, maximum inline=35, inline increment=10, inlines to combine=1; minimum crossline=20, maximum crossline=60, crossline increment=20, crosslines to combine=9).
3. Parameterize the Velocity Analysis Precompute. Use 9 CDP gathers per CVS strip. Note: the gather is built from a 5x5 matrix, but the stack panels will be 3 traces from 3 consecutive lines. Use 0 for the offset of the first bin center and 110 for the offset bin size. Choose a velocity range from 8500 to 16000 ft/sec and create 21 panels.
4. Enter a new dataset name in Disk Data Output.
5. Execute this flow.
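With 21 panels spanning 8500 to 16000 ft/sec, evenly spaced constant-velocity panels step by 375 ft/sec. A quick check, assuming even spacing (the usual behavior; the helper function is ours):

```python
def cvs_velocities(vmin, vmax, n_panels):
    """Evenly spaced constant-velocity-stack panel velocities."""
    step = (vmax - vmin) / (n_panels - 1)
    return [vmin + i * step for i in range(n_panels)]

vels = cvs_velocities(8500.0, 16000.0, 21)
print(vels[0], vels[1] - vels[0], vels[-1])  # 8500.0 375.0 16000.0
```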
Velocity Analysis

1. While the job is running, edit the flow to toggle everything off and add Disk Data Input and Velocity Analysis.
Velocity Analysis
Select display DEVICE: ----------------------------- This Screen
Is the incoming data Precomputed?: ---------------------- Yes
Set which items are visible? ------------------------------------ No
Set semblance scaling and autosnap parameters?: -- No
Display horizon(s)?: ----------------------------------------------- No
Use neural network velocity picker?: ----------------------- No
Interact with other processes using PD?: ----------------- Yes
Get guide function from existing param. table? ------ Yes
Vel. guide function table: ------ imported from ascii file
Maximum stretch percentage for NMO: -------------------- 30
Long offset moveout correction?: ------------------------- NONE
Interval velocity below last knee: ------------------------------ 0
Table to store velocity picks: ---- after resid stat b4 DMO
Copy picks to next location -------------------------------------- No

2. In Disk Data Input, input the dataset from the Velocity Analysis Precompute execution of the flow. Turn IDA on by going to SORT mode. This will allow Velocity Analysis to communicate with the Volume Viewer for velocity field QC.
3. In Velocity Analysis, choose Yes for Is the data precomputed?.
4. Set the Velocity Analysis parameters. When you first parameterize the Velocity Analysis process, a subset of the parameters will be visible, so begin by setting the global parameters highlighted in the flow. Be sure to create a table to store velocity picks, such as after resid stat b4 DMO. Next, select Yes for Set semblance scaling and autosnap parameters to display the semblance submenu. The default settings will work fine, so turn off the semblance submenu by clicking No for Set semblance scaling and autosnap parameters. The submenu parameter settings will be retained and used even though they are not visible.
5. The parameter Set which items are visible works the same way. Both the visibility and semblance parameters can also be changed interactively from within the velocity analysis tool.
6. Use the previous velocity field as a guide.
7. Execute the flow and begin picking velocities in the Velocity Analysis display. The velocity semblance, a corresponding CDP supergather, and the CVS strips are displayed. Another available display is a peak semblance histogram with interval velocities derived from the RMS picks.
8. Begin picking a velocity table by using the Pick icon, and pick a function. As you pick velocities on the semblance plot, the picks are also displayed on the CVS strips and the interval velocity plot is modified. You may also pick velocities on the CVS strips. Note the CDP, ILN, and XLN values that appear in the upper left-hand corner of the display. These provide the center CDP value, the range of inlines, and the range of crosslines included in the current velocity analysis supergather. Your first velocity analysis center should have values of CDP 336, ILN 3-7, and XLN 18-23.
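The Maximum stretch percentage for NMO parameter in the menu above mutes samples whose NMO stretch exceeds the limit. The stretch follows from the hyperbolic moveout equation t(x) = sqrt(t0^2 + x^2/v^2): a wavelet recorded at t(x) is stretched by roughly the factor t(x)/t0 when mapped back to t0. An illustrative sketch (function names and example values are ours):

```python
import math

def nmo_time(t0_s, offset_ft, vel_ft_s):
    """Hyperbolic moveout: reflection time at a given offset."""
    return math.sqrt(t0_s ** 2 + (offset_ft / vel_ft_s) ** 2)

def stretch_percent(t0_s, offset_ft, vel_ft_s):
    """Approximate NMO stretch: a wavelet recorded at t(x) is
    stretched by about t(x)/t0 when mapped back to t0."""
    return (nmo_time(t0_s, offset_ft, vel_ft_s) / t0_s - 1.0) * 100.0

# Shallow event at far offset: ~41% stretch, muted by a 30% limit
print(round(stretch_percent(0.4, 4000, 10000), 1))  # 41.4
# Deep event at the same offset: ~2% stretch, kept
print(round(stretch_percent(2.0, 4000, 10000), 1))  # 2.0
```

This is why the stretch mute removes shallow, far-offset samples while leaving deeper data intact.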
Velocity Analysis Icons

Next ensemble: Proceed to and process the next ensemble in the dataset. If you are currently processing the last ensemble in the dataset, this button will rewind the data and bring up the first ensemble.

Previous ensemble: Step backward one ensemble and process the ensemble before the current ensemble. If you are currently processing the first ensemble of the dataset, this button is grayed out and does nothing.

Rewind: Rewind the dataset and go back to the first ensemble as specified in the sort order.
Point Dispatcher (PD): Save and send the velocity picks in the current ensemble to the Velocity Viewer/Editor.
9. After you pick the first location and move to the second, you may want to overlay the function that you just picked as a second guide. You can do this by clicking on View Functions Ghost functions Average of all CDPs. This will display the average of all of the functions that have been picked in the output table to date. Your velocity picks are automatically saved to an RMS velocity ordered parameter file when you move from one location to the next or exit the program. You also have the option to save picks using the Table/Save Picks option.
Using the Volume Viewer

As you pick velocities along a line using the Velocity Analysis tool, you may want to QC your new velocity field. This can be accomplished by simultaneously viewing a color isovelocity display of the entire velocity volume. The tool used for this is a standalone process called the Volume Viewer/Editor, and it should be executed while you are running Velocity Analysis, as outlined below.
1. After picking one or two locations, iconify the Velocity Analysis window.
2. Return to the ProMAX User Interface. Build a new flow to run the Volume Viewer/Editor.
Volume Viewer/Editor
Work in Time or Depth ---------------------------------------- Time
Unit System ------------------------------------------------ Database
Surface Coordinates -------------------------------------- ILN/XLN
Source of depth coordinate ------------------------------ Volume
Source of surface coordinate limits ------------------- Volume
Input Volume Type ------------------ Stacking (RMS) Velocity
Select input volume -------------- after resid stat b4 DMO
Velocity volume reference datum --------------------- Floating
Display poststack seismic data ------------------------------ No
Interact with Velocity Analysis ------------------------------ Yes
Display gather locations ---------------------------------------- Yes

3. Set the parameters for Volume Viewer/Editor. Make sure you input the same velocity volume (table) that you are currently using in Velocity Analysis. Also, make sure you select Yes for Interact with Velocity Analysis? This will enable the PD (pointing dispatcher) to communicate with the Velocity Analysis session already running.
4. Execute the flow containing the Volume Viewer/Editor, and return to the Velocity Analysis display. Two Volume Viewer/Editor windows will appear: ProMAX/Volume Viewer: Map and ProMAX/Volume Viewer: Cross Section. You will
want to try different ways of arranging the windows on the screen until you find an arrangement that works for you. The following diagram shows one possible way to arrange the windows on the screen:
Possible Window Arrangement

If you have not yet picked any velocities in Velocity Analysis, the velocity displays will contain zero values; the screen will be all blue and the velocity and space scales will be very large. If you have picked at least one velocity function, you will only see a vertical color variation in the Cross Section window.

The Map window displays a time slice through the current velocity volume at the position of the heavy gray line that appears in the Cross Section window. You can change the time slice by activating the Select a horizontal slice icon in the Cross Section window and clicking MB1 at the appropriate time in the Cross Section window. The Map window also displays an outline of your 3D survey grid.

The Cross Section window displays a vertical cross section through the current velocity volume at the position of the heavy gray line
that appears in the Map window. You can quickly change to a vertical cross section oriented 90 degrees to the current Cross Section display by clicking on the Select the perpendicular slice icon. Clicking MB1 will alternately display perpendicular vertical cross sections at the position of your cursor. You can also change the vertical cross section in the Cross Section window by activating either of the Select a vertical slice icons in the Map window. Click MB1 in the Map window at the appropriate position and the Cross Section window will be updated with the velocity cross section at that position.

5. From the Cross Section window, click View, and then Volume Display. A Volume Controls window will appear. Click Cross-section Nodes and Map Nodes, then Ok. This will display symbols in the Map window and vertical lines in the Cross Section window indicating the positions of the Velocity Analysis centers already saved to the velocity table. The locations of these symbols and lines are referred to as nodes. You may also want to reduce the search radius for functions to display in the cross section view. For this project, set the search radius to about 200 ft.

6. In the Velocity Analysis window, pick or modify the velocity function for the current location. In the Velocity Analysis display, click on the bow-and-arrow icon to send the information to the Volume Viewer/Editor. The velocity displayed in Volume Viewer/Editor updates in response to picks made in Velocity Analysis. You should now see a vertical line in the Cross Section window and a circular symbol in the Map window at the X, Y location of the velocity function just picked.

7. In the Velocity Analysis window, click on the Process next ensemble icon, and pick the next analysis location. When you are finished picking this new analysis location, click on the Process next ensemble icon again.
This will not only move you to the next analysis location, but will automatically send the velocity picks just made to the Volume Viewer/Editor displays.
You should now have many symbols/lines in the Map and Cross Section windows. Your velocity field should also have changed color slightly based on the velocity changes just added. In either the Map window or the Cross Section window, click on the PD icon. Any Velocity Analysis CDP location can be easily retrieved or deleted from Volume Viewer/Editor with the mouse. This allows random access to any of the precomputed and picked locations.
Velocity Analysis PD Tool: By activating this icon, you can select a CDP and send it to Velocity Analysis. This icon does not appear if No was selected for Interact with Velocity Analysis? in the Volume Viewer/Editor menu. With the PD icon activated, position the mouse cursor over a node. The cursor should change from an x to an o. Click MB1 to retrieve that velocity function into the Velocity Analysis display. Clicking MB2 deletes that analysis location.

8. Continue picking velocities in Velocity Analysis until you finish all of the locations for this project. Remember, you may either use the bow-and-arrow icon to send the picks from Velocity Analysis to the Volume Viewer/Editor displays for QC before moving to the next analysis location, or you may move directly to the next ensemble and your previous picks will be automatically sent to the Volume Viewer/Editor displays.

9. To finish picking, click on the File Exit/Stop Flow pull down menu in Velocity Analysis and the File Exit pull down menu in the Volume Viewer.
Chapter 13
3D Dip Moveout
3D Dip Moveout (DMO) is a dip-dependent partial migration that transforms nonzero-offset seismic data into zero-offset seismic data. This yields improved (dip-independent) velocity estimates, attenuates coherent noise, and improves lateral resolution. In this chapter we discuss how to run DMO to Gathers 3D and DMO to Stack 3D.
Offset Binning Parameter Determination

We can combine the features of XDB and DBTools to help identify the proper parameters for the offset binning. To visualize the problem, let's first generate two displays from DBTools.

1. Generate a predefined CDP fold map. Change to a monochrome color map and choose a color of your choice.
2. Generate a 2D Crossplot from the TRC order of OFFSET, CDP, ILN, ILN.
3. Using the rectangular selection icon, select a rectangle over all CDPs but for a very narrow range of offset.
4. Project this group of selected points to the CDP order. Notice that the population is poor. Play with the width of the selection polygon
in offset until you get good population in CDP space within an offset bin.
From this exercise it is clear that we are in severe trouble here. It is virtually impossible to get good population in CDP space for this project without using very wide offset bins. A typical workflow here is to use DBTools to get an estimate of the offset bin width, and then use XYGraph to determine the offset bin minima and maxima and the number of offset bins required.

5. From XDB, generate a 3D: XYGraph from the TRC OPF of OFFSET, CDP, ILN. This graph will show you the offset distribution of your entire 3D dataset. It can be used to help you determine an appropriate number of bins, and the minimum and maximum offset. You can use these values when selecting DMO to Gathers 3D parameters.
6. Change the color table to the table called contrast.rgb using the Color Edit and File Open pull down menus.
7. Zoom in and display 5 or 6 lines on the screen.
Typically for land 3d surveys this analysis should be done on more than one line.
Offset vs. CDP plot for 5 lines

From this plot we need to confirm how wide the offset bins need to be to get good population in the offset volumes, and to determine the minimum and maximum offsets to enter into the DMO to Gathers 3D menu. Remember that our goal is to get at least 1 trace per CDP per offset bin. You can overlay a grid on this display and resize the bins to your proposed DMO offset bin width to start analyzing the data for proper bin width determination.

8. Click on the Grid Display pull down menu. Nothing much happens except that the bin editing icons appear on the left side of the screen.
9. Click on the Grid Parametrize pull down menu.
A small dialog box will appear where you can change the characteristics of the grid.
10. Change the value of the Y origin to a value near the bottom of your zoomed window. In this example we can set it to 1660, as shown in the previous diagram.
11. Click the Green Light icon. You should get some lines drawn on the XYGraph.
How wide does the offset bin need to be to get 1 trace per CDP?
Half Diamond Explanation

Here we must decide on the offset bin width. One way to do this is to measure the half diamond distance using the double fold icon MB3 function. You should measure values around 600 or 700 ft. Given that our group interval is 110 ft., let's try an offset bin width of 660 ft. and see what results we get, i.e., how many offset bins and what population in the bins.

13. Click on the Grid Parametrize pull down menu. Set the Cell Size across Azimuth to 660 ft. and click the green light icon.

14. Move the grid so that the edge of the first bin is at 0 offset.
This gives us reasonable coverage within the offset bins, but the far offset volume is very sparsely populated and we only have 6 live offset planes. This will yield 6 traces per CDP after 3D DMO to Gathers.
6 Offset Bins of 660 ft each

It is obvious now that it would be geophysically incorrect to run DMO to Gathers with these parameters, and that a width of 880 ft. would actually be required. For our purposes, however, we will set the offset bin increment to 440 ft. in order to run through the mechanics of DMO to Gathers.

15. Reset the grid parameters by clicking on the Grid Parametrize pull down menu. Set the Cell Size across Azimuth to 440 ft. and press the green light icon.
16. Move the grid so that the edge of the first bin is at 0 offset.
17. Move the cursor to the first line and read off the offset. It should be ZERO. Move the cursor to the line past the end of the live data. It should be 4400 ft.

18. Count the number of bins. There should be 10. The minimum offset, maximum offset, offset bin width, and number of bins are the parameters that we need for DMO to Gathers 3D.

19. Exit from the XYgraph using the File Exit Confirm pull down menu.
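The bin bookkeeping from steps 17 and 18 can be checked with a few lines (a toy helper, not a ProMAX function):

```python
# Sketch of the offset-bin arithmetic from this exercise.
# The function name is illustrative, not a ProMAX parameter.

def count_offset_bins(min_offset, max_offset, bin_width):
    """Number of offset bins needed to span [min_offset, max_offset]."""
    if (max_offset - min_offset) % bin_width:
        raise ValueError("offset range is not a whole number of bins")
    return (max_offset - min_offset) // bin_width

# The tutorial values: 0 to 4400 ft in 440 ft bins -> 10 bins,
# i.e. 10 output traces per CDP after DMO to Gathers 3D.
print(count_offset_bins(0, 4400, 440))  # 10
```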
DMO to Gathers 3D
DMO to Gathers 3D is a constant velocity DMO process that accumulates the DMO response of each trace in a partial stack indexed by CDP, inline location, crossline location, and offset. This process uses an integral (Kirchhoff) method, with care taken to avoid spatial aliasing of the DMO operator. There are many different alternatives that you can use at this point in the flow. The following questions will help us choose one:

Which velocity lines do you want to resolve in this job? Lines 5, 15, 25, and 35.
Which input data files do you need to resolve these lines? Both superswaths.
Are you running on an SMP machine and do you need to parallelize the pre-processing as well as the DMO? This will vary depending on where the course is being taught.
Have you adequately allocated scratch space? This project is small and you should have enough scratch allocated to output 4 lines of 79 crosslines and 10 traces per CDP.
Are you going to parallelize the DMO? No parallelization will be used (depends on available hardware at the class site).
Trace Muting
SELECT mute parameter file ------ post nmo mute (brute)
DMO to Gathers 3D
Trace Display Label
Disk Data Output
Normal Moveout Correction
Trace Display Label
Disk Data Output
2. In Disk Data Input and Disk Data Insert, input the two shot-organized files with refraction statics applied.

3. Apply the residual statics that were generated by the Gauss-Seidel External Model Decomposition process. These are the SGEMFX01 values for the shots and receivers.
DMO to Gathers 3D
Enter name of host -----------------------------
Number of worker threads --------------------------------------- 1
Minimum in-line number ------------------------------------------ 5
Maximum in-line number ---------------------------------------- 35
In-line number output sampling interval ------------------ 10
Minimum x-line number -------------------------------------------- 1
Maximum x-line number ----------------------------------------- 79
X-line number output sampling interval --------------------- 1
Number of consecutive gathers to output at each loc -- 1
Typical CDP spacing in ensembles --------------------------- 55
Minimum offset to retain ------------------------------------------- 0
Maximum offset to retain ------------------------------------ 4400
Number of offset bins --------------------------------------------- 10
Offset sampling --------------------------------------------- OFFSET
Typical mute time at largest offset ---------------------- 1000
Typical RMS velocity at early times --------------------- 9000
Apply v(z) correction? --------------------------------------------- No
Amplitude and phase balancing mode --- use exponent..
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Re-kill dead traces and apply stack mutes ------------- No
Size of input trace memory buffer (MB) ---------------------- 4
Size of stack trace memory buffer (MB) --------------------- 4

4. Apply the post-NMO mute.

5. Select the DMO to Gathers parameters. In DMO to Gathers 3D, leave host name blank. Host name refers to the name of the hosts, or nodes, where you would like to run the program. This program is set up to run in
parallel (on more than one machine). If no host name is specified, the process executes on the same node as the ProMAX executive. Make the crossline number sampling interval 1, and set the inline number sampling interval to increment by the interval at which you want to perform velocity analysis. In this case, use 10 to output every tenth inline, starting at 5 and ending at 35.

6. Set the CDP spacing to 55 ft.

7. Set the number of offset bins to 10 and the offset range to start at 0 ft. and end at 4400 ft. The number of offset bins parameter governs the number of output traces per CDP.

8. Values of 1000 ms mute time at far offset and 9000 ft./sec shallow velocity are satisfactory.

9. Specify 100 for the number of fold normalization scalars per trace. The number of fold normalization scalars parameter controls the time interval for which a record of the time-variant fold at each CDP is maintained. Greater numbers preserve relative amplitudes better, but the program runs longer.

10. Do not rekill or remute the output traces.

11. Default the size of memory buffers and the number of ensembles per page parameters.
The size of memory buffers parameters are available to maximize the efficiency of the program. Never allow (stack buffer + 2*input buffer) to exceed half of machine memory.
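The rule of thumb can be written as a quick check (illustrative only; the names are not ProMAX parameters):

```python
# Hedged sketch of the buffer-sizing rule quoted above:
# (stack buffer + 2 * input buffer) should stay under half of machine memory.

def buffers_fit(stack_mb, input_mb, machine_mb):
    """True if the DMO buffer sizes respect the half-memory rule."""
    return stack_mb + 2 * input_mb <= machine_mb / 2

# The tutorial's 4 MB + 4 MB buffers: 4 + 2*4 = 12 MB of budget needed.
print(buffers_fit(4, 4, 24))     # True  (12 <= 12)
print(buffers_fit(64, 64, 256))  # False (192 > 128)
```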
For this exercise, we will not be using the output data. You may elect to QC the output data by running a quick flow to display a few CDP ensembles.
Notice that all CDPs after DMO to Gathers have the same offset distribution. This poses a couple of questions:

What are the proper parameters for minimum offset, maximum offset, and offset increment in the velocity analysis programs?

How do you decide on a supergather for velocity analysis? Offset distribution is no longer an issue.
We have been using the Manual Parallel approach in all of the stack jobs that we have run. The following table summarizes the types of parallelization that exist in ProMAX 3D.
                          PVM    Threaded (SMP)    Massive
Stack 3D                  OK          OK             OK
DMO 3D                    OK          OK             OK
Migrations FK and PS      OK          OK
Migrations FD and PSPC    OK
Mixing Panel Tools
The Parallel Executive (PVM) allows you to run most ProMAX tools in parallel, the exception being mixing panel tools (i.e. Trace Mixing, 3D Mix, and other tools that require a trace to be in two places at the same time).
A distributed, or networked, parallel session is a heterogeneous network, such as SGI, Sun, and IBM machines, appearing as a single concurrent computational resource. Parallel processing is typically classified into three types:

PVM Parallel: This mode begins with a Parallel Begin and ends with a Parallel End. The tools in between are run concurrently on the defined hosts. Multiple Parallel Begins and Ends can be nested if required. The output data from all of the different execs running on the various machines are brought back together to the master ProMAX flow on the master machine.

Threaded (or SMP) Parallel: This mode may be run on machines that support the shared memory architecture. Typically these machines have multiple nodes, or CPUs, that all use the same memory. There are tremendous performance advantages to these machines. Currently only the 3D Stack, 3D DMO, and 3D Migration programs fully utilize the SMP architecture.

Massive Parallel: This generally refers to running more than one multi-node SMP machine simultaneously in the same job. Currently only the 3D Stack and 3D DMO programs support massive parallelism.

For the PVM (Parallel Begin/End) path there are two methods of distributing the work amongst the selected machines: round robin and first-come-first-served. The round-robin allocation guarantees that the sort order is preserved. However, it may not be optimal for heterogeneous environments with a mix of machine speeds, or with machines being utilized by other processes, since faster machines will tend to wait for slower ones to catch up. The first-come-first-served strategy allocates work on a request basis. Faster machines generate more requests than slower ones, and all machines remain busy until the task is finished. The first-come-first-served strategy does not guarantee that the primary sort order will be maintained.
If it is important that the primary sort order be maintained in a parallel job, a good strategy is to place an Inline Sort after a parallel sequence using the first-come-first-served policy.
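The two allocation policies can be illustrated with a toy assignment (illustrative Python, not the PVM scheduler):

```python
# Toy model (not ProMAX/PVM code) of the round-robin policy: work unit i
# always goes to host i mod n, so the master exec can merge the result
# streams back in the original (primary sort) order.

def round_robin(n_units, n_hosts):
    """Deterministic, order-preserving host assignment."""
    return [i % n_hosts for i in range(n_units)]

print(round_robin(6, 3))  # [0, 1, 2, 0, 1, 2]

# First-come-first-served instead hands the next unit to whichever host
# requests work first, so a fast host takes more units and the merged
# output order depends on relative machine speed -- the primary sort
# order is not guaranteed, hence the Inline Sort advice above.
```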
User environment In order to use the Parallel Executive on remote hosts, you must have trusted login privileges on all nodes of the virtual machine. Trusted status can be enabled system wide by modifying the /etc/hosts.equiv file, or on an individual basis by adding the relevant entries to your .rhosts file. You can verify trusted status by attempting to rlogin to a remote host. If you do not need to supply a password, you have trusted status. An alternate test is to check whether rsh [hostname] date returns the date string without requiring other input, or without echoing other strings. Each thread of the Parallel Executive inherits all ProMAX and PVM environment variables, as well as the DISPLAY environment variable, from the initiating environment. The primary data area must be available on all participating hosts, either through hard mounts or by using automount. Only the primary storage partition must be declared in
this manner; secondary storage partitions will follow the config_file entries.
NOTE: Currently, the path to the primary data area must be the same on all the parallel hosts. This can be accomplished by ensuring that the mount points for primary storage ($PROMAX_DATA_HOME) are the same on all the participating parallel machines. If the common primary storage partition is already mounted at a different location, you must provide a link to that mount point which mimics the mount point on the other machines.
The DISPLAY environment variable need only be set if remote threads will create displays that will be viewed on the local host. In this case you must explicitly enable display on the local host, either by enabling all hosts (xhost +), or by enabling specific hosts (xhost +host1 +host2), from a console on the local host.
Dipvels Theory
Estimates velocity and dip from the stacking response of the data. Useful for obtaining correct stacking velocities for datasets which have considerable dip and need residual statics.
Stacking Velocity = Vrms / sqrt(1 - sin^2(alpha) * cos^2(theta - phi))

where alpha = dip angle, theta = dip direction, and phi = shot-receiver azimuth.

Note: Stacking velocity is trace dependent if the shot-to-receiver azimuth varies within a CDP gather.
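Assuming the standard dipping-reflector (Levin) relation, Vstack = Vrms / sqrt(1 - sin^2(alpha) * cos^2(theta - phi)), the azimuth dependence can be checked numerically (a sketch under that assumption, not ProMAX code):

```python
from math import sqrt, sin, cos, radians

# Numerical sketch of the dip-dependent stacking velocity, assuming the
# standard dipping-reflector relation:
#   Vstack = Vrms / sqrt(1 - sin^2(alpha) * cos^2(theta - phi))

def stacking_velocity(vrms, alpha_deg, theta_deg, phi_deg):
    a = radians(alpha_deg)
    az = radians(theta_deg - phi_deg)
    return vrms / sqrt(1.0 - sin(a) ** 2 * cos(az) ** 2)

# Flat reflector: stacking velocity equals the medium velocity.
print(stacking_velocity(9000.0, 0.0, 0.0, 0.0))

# Shooting along strike (azimuth 90 deg from dip direction) also removes
# the dip effect, which is why Vstack becomes trace dependent when the
# shot-receiver azimuth varies within a CDP gather.
print(stacking_velocity(9000.0, 30.0, 0.0, 90.0))
```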
3D NMO and Bin Center Corrections

[Diagram: trace mid point, CDP bin, and bin center, with zero-offset times T(0,0) at the bin center and T(0,R) at the trace mid point]

3D NMO equation:

T(X,R) = ( (T(0,0) + R*D)^2 + (X/Vrms)^2 - (X/VDMO)^2 )^(1/2)

R = {Rx, Ry}: position vector from bin center to the trace mid point
D = {Dx, Dy}: X and Y component dip

The NMO, DMO, and bin center corrections (R*D) require knowledge of Vrms, Dx, and Dy versus time and space.
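The travel-time equation can be evaluated directly (toy Python; the function and variable names are illustrative):

```python
from math import sqrt

# Sketch of the 3D NMO + bin-center travel-time equation quoted above:
#   T(X,R) = ((T00 + R.D)^2 + (X/Vrms)^2 - (X/VDMO)^2) ** 0.5
# where R.D is the dot product of the bin-center offset vector R = {Rx,Ry}
# and the component dip D = {Dx,Dy}.

def t_nmo(t00, rx, ry, dx, dy, x, vrms, vdmo):
    rd = rx * dx + ry * dy  # bin-center correction R.D (time shift)
    return sqrt((t00 + rd) ** 2 + (x / vrms) ** 2 - (x / vdmo) ** 2)

# Zero offset, trace mid point exactly at the bin center: T = T(0,0).
print(t_nmo(1.0, 0.0, 0.0, 0.001, 0.001, 0.0, 9000.0, 9000.0))  # 1.0
```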
DMO Stack 3D
DMO Stack 3D is a constant velocity DMO process that accumulates the DMO response of each trace in a stack indexed by CDP, inline, and crossline location. Like DMO to Gathers 3D, this process uses an integral (Kirchhoff) method, with care taken to avoid spatial aliasing of the DMO operator. Unlike DMO to Gathers 3D, DMO Stack 3D stacks all offsets into a zero-offset stack trace. To compensate for variations in coverage, the process maintains time-dependent fold information for each stack trace, and scales each stack trace by that fold. The fold information is maintained in the stack trace headers and can be accessed by other processes.

We will also combine the preparation and DMO in a single flow. If the DMO is to be run in parallel on an SMP type machine, you may want to parallelize the pre-processing as well. If the preprocessing is not parallelized, the DMO may not run efficiently, because it will be waiting for traces to come down the pipe. Parallelizing the preprocessing may help the pipe keep up with DMO's demand for traces. This should be tested. The alternative would be to run the preprocessing in a separate flow and output a new prestack dataset. This dataset would then be read into the DMO flow.

You will run this exercise differently relative to the stack flows that have been run previously. In this case you will run two flows with slight parameterization changes to demonstrate a point. You will run DMO Stack 3D to generate a fully resolved output volume. The parameterization comparison will involve examining the difference between rekilling and remuting vs. not rekilling and remuting. The reason for this comparison is to help demonstrate the procedures involved in the cases where partially resolved data volumes are generated and then merged.
1. Build the following flow by copying your residual statics stack flow for the first half of the data as a starting point:
Trace Muting
SELECT mute parameter file ----- post nmo mute (brute)
Automatic Gain Control
DMO Stack 3D
Trace Display Label
Disk Data Output
2. In Disk Data Input, input your shot-organized data with refraction statics applied for the first half.

3. In Disk Data Insert, input the shot-organized data with refraction statics for the second half.

4. Apply the residual statics that were generated by the Gauss-Seidel External Model Decomposition process. These are the SGEMFX01 values for the shots and receivers.
5. In Normal Moveout Correction, enter the best current RMS velocity field. Here we would normally use the velocities that were picked using the data output from the DMO to Gathers 3D process. We did not run these velocities, so you can use one of the other tables.

6. Apply the post-NMO mute.

7. The flow as described shows an Automatic Gain Control; you can toggle this process active. The output from the Stack DMO process is more aesthetically appealing (on this dataset) if you apply an AGC, or other scaling function, prior to the DMO process, although it is not required.

8. Select DMO Stack 3D parameters. Host name refers to the name of the nodes where you would like to run the program. This program is set up to run in parallel (on more than one machine). If no host name is specified, the process executes on the same host as the ProMAX executive.

9. Leave the restart parameter defaulted to No. The restart parameter allows you to alter an existing 3D DMO stack with the DMO stack response from another prestack dataset.

10. Specify your minimum and maximum inline and crossline numbers to include the entire dataset.

11. Set the CDP spacing to 55 ft.

12. Set the estimated maximum offset to 4500 ft., and the mute time at the largest offset to 1000 ms. The estimated maximum offset parameter and the typical mute time at largest offset parameter are used to determine the number of time shifts required in the DMO operator. A shallow velocity of 9000 ft./sec is satisfactory.

13. Set the exponent of fold normalization scalars to 0.5. Each stack trace will be divided by the fold normalization scalar raised to this power. A value of 0.5 is
recommended for noisy data, while 1.0 is recommended for synthetic data.
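The fold scaling described here can be sketched as follows (a toy helper under stated assumptions, not the ProMAX implementation):

```python
# Hedged sketch (not ProMAX source) of fold normalization: each stack
# sample is the sum over contributing traces, divided by the local
# time-variant fold raised to the chosen exponent.

def normalize_stack_sample(stack_sum, fold, exponent):
    """Divide a summed stack sample by fold ** exponent."""
    if fold <= 0:
        return 0.0
    return stack_sum / fold ** exponent

# Exponent 1.0 is a true mean (recommended for synthetics); 0.5 divides
# by sqrt(fold), which favors signal-to-noise on noisy data.
print(normalize_stack_sample(10.0, 4, 1.0))  # 2.5
print(normalize_stack_sample(10.0, 4, 0.5))  # 5.0
```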
Disk Data Input
Disk Data Insert
Apply Residual Statics
Normal Moveout Correction
Trace Muting
Automatic Gain Control
DMO Stack 3D
Enter name of host -----------------------------
Operating system of host ----------------- (as per instructor)
Restart with an existing stack? ------------------------------ No
Minimum in-line number ------------------------------------------ 1
Maximum in-line number ---------------------------------------- 42
Minimum x-line number -------------------------------------------- 1
Maximum x-line number ----------------------------------------- 79
Typical CDP spacing in ensembles --------------------------- 55
Maximum offset to retain ------------------------------------ 4500
Typical mute time at largest offset ---------------------- 1000
Typical RMS velocity at early times --------------------- 9000
Exponent of normalization factor -------------------------- 0.5
Number of normalization scalars per trace ----------- 100
Apply final datum statics after stack? ------------------- Yes
Rekill dead traces and apply stack mutes ------------- Yes
Size of input trace memory buffer (MB) ---------------------- 4
Size of stack trace memory buffer (MB) --------------------- 4
14. Specify 100 for the number of fold normalization scalars per trace. The number of fold normalization scalars parameter controls the time interval for which a record of the time-variant fold at each CDP is maintained. Greater numbers for this parameter preserve relative amplitudes better, but make the program run longer. When using the restart option to merge datasets together, these scalars will help determine the relative weight of each dataset based on the fold of that dataset.

15. Apply the CDP mean static (final datum static).

16. Set the rekill and mute apply switch to Yes. This is one parameter that will generally need to be tested. As a general rule, set this to YES, especially if fully resolved output is being generated. If you detect shallow amplitude anomalies, you may find that setting this to NO will help. When working in a partial stack and stack merge sequence, you will generally want to set this to NO and then rekill and remute after the stack merge. (To be demonstrated later in this chapter.)

17. Set the size of memory buffers to 4 MB. The size of memory buffers parameters are available to maximize the efficiency of the program. Never allow stack buffer + 2*input buffer to exceed half of machine memory. In this case, we will be running many jobs at the same time, so we will reduce the memory requirements for each job.

18. In Trace Display Label, label the dataset as being a stack after DMO.

19. In Disk Data Output, output the stacked-after-DMO dataset.

20. Execute the flow.
Run DMO Stack 3D with the rekill switch set to NO

1. Copy the DMO stack flow for the first half of the data to make this flow:
Disk Data Input
Disk Data Insert
Apply Residual Statics
Normal Moveout Correction
Trace Muting
Automatic Gain Control
DMO Stack 3D
Rekill dead traces and apply stack mutes ---------------No
Compare the two DMO Stack Volumes

1. This flow is a copy of the Compare Inlines flow, built earlier. You may want to copy that flow to save yourself some work.
Inline Sort
PRIMARY sort key ---------- (ILINE_NO) 3D inline number
SECONDARY sort --------- (DS_SEQNO) Input dataset sequence number
TERTIARY sort key ---- (XLINE_NO) 3D crossline number
Maximum traces per output ensemble -------------------- 79
Number of traces in buffer ------------------------------------ 160
Buffer type ---------------------------------------------------- Memory
Sort key which controls End-of-Ensemble ----- Secondary
You have a couple of options available regarding how to handle this issue. When the traces are output from the DMO process, the TLIVE_S and TFULL_S header words are set to the earliest values encountered for a trace that originated in the output bin. In the scenario where you are creating fully resolved output, the TLIVE_S and TFULL_S header words should match those of a conventionally stacked trace. The Rekill dead traces and reapply trace mutes menu option determines whether the samples above TLIVE_S are zeroed. Normally, for fully resolved output you will want to set
this to YES. The following diagram shows a comparison between setting this option to NO and YES.
Notice that TLIVE_S is identical for each dataset, but the trace amplitudes are very different in time and space. The stack volume with the switch set to YES will very closely resemble a conventional stack in space and time. In the event that partially resolved stacks are being generated by the DMO process, it is advised to set the switch to NO, and then reset the trace amplitudes after the partial stacks are merged into a single fully resolved stack volume.
Reset Stack DMO Output

1. Build the following flow by copying the stack merge flow as a starting point:
Disk Data Input
Database/Header Transfer
Disk Data Input
Database/Header Transfer
Trace Kill/Reverse
Trace Muting
Trace Display Label
Disk Data Output
2. In the first Disk Data Input, read one of your conventional stack files.

3. In the first Database/Header Transfer, move one header word to the database. Remember, we are working on stack traces; therefore, each CDP number is different. Write the TRC_TYPE trace header word from the alternate list of header words to a new attribute in the CDP database called TRC_TYPE. These are integers.

4. In the second Disk Data Input, input the DMO stack data volume.

5. In the second Database/Header Transfer, move one header word from the database to the DMO stack trace headers.
Write the TRC_TYPE attribute from the CDP database to a new trace header word called REKILL.
Database/Header Transfer
Direction of transfer --- Load FROM traces TO database
Number of parameters --------------------------------------------- 1
First database parm ------- CDP GEOMETRY TRC_TYPE
First header entry ----------- TRC_TYPE (Trace type) - INT
Database/Header Transfer
Direction of transfer --- Load TO traces FROM database
Number of parameters --------------------------------------------- 1
First database parm ------- CDP GEOMETRY TRC_TYPE
First header entry ------------------------------------------ REKILL
Trace Kill/Reverse
Trace Muting
Trace Display Label
Disk Data Output
Disk Data Input
Database/Header Transfer
Disk Data Input
Database/Header Transfer
Trace Kill/Reverse
Trace Editing MODE -------------------------------------------- Kill
Get edits from the DATABASE? ------------------------------ No
PRIMARY edit list header word ----------------------- REKILL
SECONDARY edit list header word ------------------- NONE
SPECIFY traces to be edited ------------------------------------- 2
Trace Muting
Re-apply previous mutes ------------------------------ Re-ramp
Chapter 14
[Diagram: example CDP positions - on the edge, one line in from the edge, and dead in the corner]
If there are no zero fold CDPs in the array, the taper value is 1. If there are zero fold CDPs in the array and the center CDP has non-zero fold, the taper number is calculated as:
The taper value is calculated using the equation:

Taper value = (INT(array area / 2) - number of zero fold CDPs) / INT(array area / 2)

If the center CDP is on the edge (first/last line or first/last xline):
Taper value = 0

Example: CDP not on the edge
Total number of CDPs in window = 25
Total number of zero fold CDPs = 10
Taper value = (12 - 10) / 12 = 0.17

Example: Center CDP has fold > 0
Total number of CDPs in window = 25
Total number of zero fold CDPs = 5
Taper value = (12 - 5) / 12 = 0.58

Example: Center CDP has fold > 0
Total number of CDPs in window = 25
Total number of zero fold CDPs = 6
Taper value = (12 - 6) / 12 = 0.50
Examples of Taper Calculation

If the center CDP is on a dataset edge, the fold of the center CDP is zero, or the number of zero fold CDPs is greater than half the array area, the taper number is 0.
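The taper rule and the worked examples above can be sketched as (illustrative Python, not ProMAX code):

```python
# Sketch of the CDP taper rule; argument names are illustrative.

def taper_value(array_area, n_zero_fold, center_fold, center_on_edge):
    """Taper scalar for the center CDP of an analysis array."""
    half = int(array_area / 2)
    # Edge CDP, dead center CDP, or mostly dead array: full kill.
    if center_on_edge or center_fold == 0 or n_zero_fold > half:
        return 0.0
    if n_zero_fold == 0:
        return 1.0
    return (half - n_zero_fold) / half

# The worked examples: a 5x5 array (area 25, INT(25/2) = 12).
print(round(taper_value(25, 10, 1, False), 2))  # 0.17
print(round(taper_value(25, 5, 1, False), 2))   # 0.58
print(round(taper_value(25, 6, 1, False), 2))   # 0.5
```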
CDP Taper
Top number of inline CDPs ------------------------------------- 5
Top number of cross line CDPs -------------------------------- 5
Bottom number of inline CDPs --------------------------------- 9
Bottom number of cross line CDPs --------------------------- 9
Make sure that you turn off the AGC, or you will remove the effect of the taper. You will also want to set the trace scaling in Trace Display to Entire Screen instead of Individual, since individual scaling will also partially remove the effects of the tapering.

7. With these parameters it will be difficult to see much difference, since we essentially have a two-trace taper at shallow times and a four-trace taper at maximum time.
NOTE: Notice that the first and last lines are completely dead after the CDP Taper. This is the expected behavior. Therefore, you may elect to pad the CDP grid by one CDP in all directions, since CDP Taper will kill any trace on the first and last inline and crossline of the project.
QC Plots from XDB

1. Open the XDB database display.

2. Select your area, line, and the CDP order.

3. Display an XYgraph of CDP: X, Y, FOLD.

4. Display an XYgraph of CDP: X, Y, TOPTAPER.

5. Display an XYgraph of CDP: X, Y, BOTTAPER.

6. Edit the colorbar. Set the interpolation mode to MANUAL and change the color of the first color box to black. Here you can clearly see the original zero fold CDPs in the fold plot, and you can see the traces which have been assigned a taper scalar of zero.
Try other values for TOPTAPER and BOTTAPER

Rerun the flow using values like 11 and 21 for the top and bottom tapers, and regenerate the QC plots.
Chapter 15
3D Velocity Viewer/Editor
This stand-alone tool allows you to scan through a 3D velocity field, identify and edit velocity control points, and analyze the interpolation between the control points. This tool also lets you smooth the velocity field and convert stacking velocities to interval velocities. Typically, the tool is used to analyze velocities for anomalous points that you may want to edit. In particular, bad velocities are frequently created when converting stacking velocities to interval velocities. This tool ensures that a reasonable velocity field is being passed to a migration.
Move: Move the view forward and back, or up and down. Also used to flip to an inline view when in a crossline view, and vice versa.
Edit vel function: Pop up another screen to display and edit a selected velocity function.
Circles represent the locations of the velocity functions. Black lines indicate the triangulation for spatial interpolation of velocity functions. Solid white lines mark the last displayed inline and crossline views. Dashed white lines mark the width of the zone used to mark nearby velocity functions on the axis of an inline or crossline view.
3D Table Triangulation

The above time slice view shows the triangulation used for spatial interpolation by ProMAX tables. After values in a table are interpolated
vertically in time or depth, they are interpolated spatially using the three vertices of the triangle that encloses the location to interpolate. The triangulation of the function locations is defined via the Delaunay approach, which produces the most equilateral triangles possible.
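The spatial step can be illustrated with plain barycentric interpolation over one triangle (a sketch, not the ProMAX table code):

```python
# Illustration (not ProMAX code) of interpolating a value known at the
# three vertices of the enclosing triangle to an interior point, using
# barycentric weights.

def barycentric_interp(p, tri, values):
    """Interpolate vertex values of triangle tri to point p."""
    (x1, y1), (x2, y2), (x3, y3) = tri
    px, py = p
    det = (y2 - y3) * (x1 - x3) + (x3 - x2) * (y1 - y3)
    w1 = ((y2 - y3) * (px - x3) + (x3 - x2) * (py - y3)) / det
    w2 = ((y3 - y1) * (px - x3) + (x1 - x3) * (py - y3)) / det
    w3 = 1.0 - w1 - w2
    return w1 * values[0] + w2 * values[1] + w3 * values[2]

# At the centroid of a triangle the result is the mean of the vertex
# values (hypothetical velocities, in ft/sec).
tri = [(0.0, 0.0), (3.0, 0.0), (0.0, 3.0)]
print(barycentric_interp((1.0, 1.0), tri, [6000.0, 9000.0, 12000.0]))  # 9000.0
```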
3D Velocity Viewer/Editor
Select the type of field to edit --------- Stacking (RMS) vel
Do you wish to edit an existing table? ------------------- Yes
Select input velocity ----------------------------------- best vels
Do you wish to specify the bounds of the field? ------- No
Select output velocity database ---- smoothed for fk mig
Specify an alternative name of output INTV ---------- Yes
Select output interval velocity -------- for phase shift mig
Minimum depth (or time) ------------------------------------------- 0
Maximum depth (or time) ------------------------------------------ 0

2. Input one of the RMS velocity fields that are available. If you did not complete the velocity field picking, you may use the original field that we imported from the ASCII file.

3. Specify an output name for the edited RMS field.

4. Enter a new name for the output interval velocity table. We will output two tables from this program: one edited and smoothed in preparation for FK migration, and another that is an interval velocity as a function of time for Phase Shift 3D migration.

5. Execute the flow.
Edit and Smooth the RMS Velocity for FK Migration

1. Click on the Edit Icon and move the cursor into the display area. The screen will adjust to have two windows. On the left is the velocity contour, and on the right is the velocity function edit window.
[Screen annotations: Edit Icon; location of velocity function being edited; location of additional velocity function used as reference for the plot on the right side; conversion of the velocity being edited to interval velocity (two different conversion methods are shown)]
Editing Velocities

The Edit velocity function window will contain the function nearest to your mouse location. The right hand window shows the locations of the control points with blue circles. The mouse help at the bottom of the screen guides your mouse motions.

MB1: Edits the nearest velocity function. This edit function will appear in the right window in red. As you move your mouse, the blue function will still reflect the function nearest to your mouse location. In this way, you can compare two functions. To freeze a blue function, use MB2. Move your mouse to the right window and activate the Edit Function Icon. This lets you add/move/delete the red function locations marked by the circles. Use the mouse button helps at the bottom of the screen as a guide.

MB3: Deletes all points at a function location, and hence deletes the function.

Shift MB1: Adds a new function at a certain location.

Another way to think of this is to freeze the blue curve on a function that you like with MB2, and edit the questionable function with MB1. When you press UPDATE with your new velocity function, you will see its effect on the entire velocity field. If you don't like your changes, use the Modify/Undo button to remove the old function.

2. Move from line to line and change the display between inlines, crosslines, and time slices. Hand edit the major discontinuities.
Velocity Field Gridding and Smoothing

1. Select the Modify Smooth Velocity field pull down menu to smooth the RMS velocity field.
The first two entries ask about the sampling of the new smoothed field. We can enter values that are the same as our input field:

Crossline Sampling Interval of 20
Inline Sampling Interval of 10
The time sampling is up to the processor, and depends on how complex the velocity field is as a function of time. Our field is fairly well behaved, with no inversions and a relatively linear increase as a function of time. We can resample our field at 200 ms intervals without any problems:

Time Sampling Interval of 200 ms
The smoothing parameters may also need to be modified. Normally you would measure the anomaly size (in CDPs) that you want to smooth through on the inline or crossline displays and input these values. For our purposes, values of around 20 inlines and crosslines with about 200 ms of smoothing should be adequate:

Crossline Smoothing Operator Length of 20
Inline Smoothing Operator Length of 20
Time Smoothing Operator Length of 200
2. Click OK.

3. Review the smoothing operation by looking at inlines, crosslines, and time slices.

4. If the smoother was too harsh, you can use the Modify Undo last change pull down menu, reset the parameters, and repeat the process until satisfied.

5. Save this velocity field to disk using the File Save table to disk pull down menu. This will save the edited and smoothed RMS velocity field for FK migration.
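What an operator length does can be illustrated with a toy 1-D running mean (the actual ProMAX smoother is not specified here; this is only a sketch):

```python
# Toy 1-D running-mean smoother (not the ProMAX smoother): each output
# sample is the mean of the samples inside the operator window.

def smooth(vals, operator_len):
    half = operator_len // 2
    out = []
    for i in range(len(vals)):
        window = vals[max(0, i - half): i + half + 1]  # clipped at edges
        out.append(sum(window) / len(window))
    return out

# A single velocity spike is spread and reduced; longer operators
# smooth harder, which is the trade-off behind the operator lengths.
print(smooth([0, 0, 9, 0, 0], 3))  # [0.0, 3.0, 3.0, 3.0, 0.0]
```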
Convert to Interval Velocity
1. Select the Modify Convert RMS to Interval Velocity pull down menu. There are two choices: Constant Velocity Dix or Smoothed Gradient Dix conversion. For our purposes in making an interval velocity vs. time function, we will choose the Smoothed Gradient method.
2. Review some inlines, crosslines and time slices after the conversion and see if any additional smoothing or editing is required.
3. Use the File Save table to disk and exit pull down menu to save this table to disk and exit the program.
We now have two velocity fields:
A smoothed RMS field for FK migration and
A smoothed Interval Velocity field for Phase Shift 3D migration.
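The constant-velocity flavor of this conversion is based on the classical Dix equation, which recovers the interval velocity of each layer from the RMS velocities at its top and bottom. The sketch below shows the plain Dix arithmetic only; the Smoothed Gradient method chosen above is a variant whose details are not reproduced here.

```python
import math

def dix_interval(t, vrms):
    """Constant-velocity Dix conversion of an RMS velocity-time function.
    t: two-way times (s), increasing; vrms: RMS velocities (m/s) at those
    times. Returns one interval velocity per layer; the first layer's
    interval velocity equals its RMS velocity."""
    vint = [vrms[0]]
    for i in range(1, len(t)):
        num = vrms[i] ** 2 * t[i] - vrms[i - 1] ** 2 * t[i - 1]
        vint.append(math.sqrt(num / (t[i] - t[i - 1])))
    return vint

# Hypothetical two-layer function:
v = dix_interval([1.0, 2.0], [1500.0, 1800.0])
```

Note that the Dix numerator amplifies noise in the RMS field, which is why the RMS field is smoothed before conversion and reviewed afterwards.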
Chapter 16
Migration
The ProMAX 3D migrations include poststack time and depth migration algorithms. The available migrations are F-K, Finite Difference (FD), and Phase Shift. The goal is to migrate the stack section with the most appropriate migration process. To aid in selecting the appropriate migration, this chapter includes a comparison of the migrations and a brief description of each. The ProMAX 3D Reference manual and the online help system also provide additional detail about the migrations.
3D Migration Summary
The choice of poststack migration process can be difficult. You must weigh CPU time, accuracy of the velocity model, steepness of dip to be imaged, and other factors in choosing the most appropriate process. Often, a number of different migrations must be run in order to compare results. To help you decide on the optimal migration for a given situation, the migrations are summarized in the following table.
[Table: 3D Migration Summary — Stolt 3D, Phase Shift 3D, PSPC 3D Depth, Explicit FD 3D Time, and Explicit FD 3D Depth, rated (Poor/Fair/Good/Excellent) on lateral velocity handling V(x,y), vertical velocity handling V(t/z), and dip accuracy, with relative run times from 0.9 (Stolt, the fastest) to 18]
Some of the 3D migrations provide two important restart options. The first option is activated by choosing to checkpoint the process, which regularly saves migration workfiles to disk. If your migration process is abnormally terminated, the migration can be restarted from the last checkpoint. The second option is activated by choosing to save data at a specific depth for restart. In this case, data may be migrated down through the current, reliable velocity information and then subsequently be continued with new velocity information below. Data input to these migrations must be corrected to a flat datum. If your data is referenced to a floating datum, you will need to complete the application of datum statics to move your data to a flat datum. If your velocity field is referenced to a floating datum, you can modify the velocity field with Velocity Manipulation. The stacked data must also be sorted with the primary sort of inline and the secondary sort of crossline; use the Pad 3d Stack Volume process to pad the stacked data using ILINE_NO as the primary sort, and make sure the padded traces retain that sort order.
With all 3D Migrations, you should be aware of the potential need for extended scratch space. How much scratch space a particular migration will use may be determined in the View file. When running 3D Migrations in parallel, certain conventions should be followed for naming scratch space on these machines. Refer to the Extended Scratch Space section in the System Administration manual for a complete description of the extended scratch space setup and requirements.
Stolt 3D Migration
Stolt migration is computationally efficient, but has difficulty imaging steep dips in areas where there are large horizontal and vertical velocity variations. This algorithm uses Stolt's (1978) stretching technique to compensate for horizontal and vertical velocity variations, but it does not accurately handle strong variations of either kind. The F-K process requires RMS velocities as input and migrates common offset or stacked data. It is our fastest migration algorithm.
Phase Shift Migration The Phase Shift migration process uses an interval velocity vs. time field. It can migrate dips greater than 90 degrees (turning rays) and, unlike the 2D equivalent, this 3D migration can handle lateral velocity variations to a limited extent utilizing a modified stretching technique. The primary advantages of this approach are speed and accurate handling of high dips.
PSPC 3D Depth Migration The PSPC Depth migration process uses a spatially-variant interval velocity function in time, VINT(x,y,t). Vertical velocity variations are handled very well by this algorithm. Spatial velocity variations are accommodated with a first-order phase correction, applied to phaseshift migrated data. A phase-shift interpolation option is included for increased accuracy. With this option, the first-order phase correction is applied to the migrated data corresponding to the closest approximation to the required velocity value. The primary advantages of this approach are relative speed, accurate handling of high dips, and good compensation for spatial velocity variations.
Explicit FD 3D Time Migration This algorithm uses explicit F-XY spatially-variant extrapolators to perform time migration. This migration is designed to be accurate up to approximately 70 degrees of dip. This migration uses a vertical and spatially-variant interval velocity field in time, VINT(x,y,t), for input. To reduce run times for this algorithm you may specify a maximum dip of either 30 or 50 degrees, rather than the default of 70 degrees. Run times are dependent upon the maximum frequency for migration, so choose this value accordingly. A further option to enhance performance is to select the Split option, for two-pass migration, instead of the Full 3D option, for one-pass migration.
Explicit FD 3D Depth Migration
This algorithm uses explicit F-XY spatially-variant extrapolators to perform 3D depth migration. This migration is designed to be accurate up to approximately 70 degrees of dip. It uses a vertical and spatially-variant interval velocity field in depth, VINT(x,y,z), for input. You can choose from 30, 50, and 70 degree options; the higher maximum dip angles have longer run times. A further option to enhance performance is to select the Split option, for two-pass migration, over the Full 3D option, for one-pass migration. The primary advantages of this approach are efficiency, good handling of vertically-variant velocities and moderate dips, and fair handling of spatial velocity variations. Trace padding should be specified to reduce wrap-around effects in the frequency domain. Values in the range of 30 to 50 percent are generally adequate for normal amplitude-balanced datasets. Explicit FD 3D migration requires that the trace spacing of the input data be equal in the inline and crossline directions. If this is not the case in your 3D survey, use one of the trace interpolation techniques available in ProMAX. Create a new line and run 3D Poststack Geometry on the interpolated dataset to create a CDP database and have the appropriate trace spacings entered in the LIN database.
Velocity Manipulation*
Type of velocity table to input --- Interval Vel in Time
Get velocity table from database? ----------------- Yes
Select input velocity database entry -- for phase shift
Combine a second velocity table with the first ---- No
Resample the input velocity table(s)? ------------- No
Shift or stretch the input velocity table? -------- No
Adjust the velocities to final datum -------------- Yes
Type of parameter table to output --- Int Vel in Time
Select output velocity database entry --- for ps mig - flat datum
Output a single average velocity table? ----------- No
Vertically resample the output velocity table? ---- No
Adjust output velocities by percentages? ---------- No
3D Velocity Viewer/Editor
Select the type of field to edit - Interval Velocity in Time
Select input velocity database entry --- for ps mig - flat datum
You may elect to view the input and output fields using the 3D Velocity Viewer/Editor.
3D Migration Exercise
1. Build the following flow by copying the DMO stack flow:
4. Select the Interval Velocity vs. Time function that we generated specifically for Phase Shift migration using the velocity editor.
5. Set the frequency range to start at 0 and go to 80 Hz. This is a reasonable range for this dataset. You could improve the performance of the migration by reducing the frequency range.
6. Leave the pad parameters at 0 traces. Normally you would compute the migration aperture and add enough traces to prevent energy from wrapping from one side of the stack to the other. In this case, we are just interested in getting the flow to run quickly, so you can expect to see some spatial wrap on the output section.
7. Set the top taper to 100 ms and the bottom taper to 100 ms. Since you applied AGC prior to the DMO stack, the dataset is fairly well modulated in amplitude and does not require long tapers.
8. Reapply the original mutes and rekill any traces that were originally dead.
9. In Trace Display Label, label your dataset.
10. In Disk Data Output, output your dataset.
11. Execute the flow.
12. When complete, you have a new stack volume. You can compare this volume to previous volumes using the display comparison flows that were built earlier.
NOTE: Make sure your $PROMAX_HOME/etc/pvmhosts file is set up correctly. This is the first thing to check if a migration fails to run.
Chapter 17
You will load this single file to provide the SIN and TRC spreadsheets with data. You will then continue with binning, using the Calc-Dim option. This marine 3D survey was collected using a single source / single cable geometry.
Using the Marine 3D Geometry Spreadsheet
1. Make a new line, called 3D Marine.
2. Build the following flow:
Enter the directory name as described by your instructor and then click OK.
3. Choose the 3d_marine_ukooa file from the list.
4. From the Format pulldown menu, open a list of saved formats and choose STANDARD UKOOA 90 Marine 3D. Separate the windows.
5. Check the column definitions by clicking on the words in the Parameter column. Notice that there are two column definitions:
One for the R cards
One for the S and V cards
Also note that, if desired, the coordinates can be altered using the Math Op and Op Value columns.
6. Select Apply and then Overwrite to apply the format to all the data.
While the import is running, you will see a variety of Status windows. Eventually you will see a Successfully Completed window.
7. Quit from each of the column definition windows and select File Exit from the main import window.
8. From the main menu click Setup and input the following information:
25 m receiver station interval
25 m source station interval
50 m sail line (crossline) interval
Set the azimuths to 0° for the shots and receivers (the correct azimuth will be determined later).
9. Click OK. 10. Generate a basemap of the project by opening the Sources Spreadsheet and selecting View View All Basemap.
Determine Primary Azimuth for Binning 1. Use the Double Fold icon to measure the azimuth of the shot lines.
You should measure a value of approximately 32 degrees East of North.
2. Go back to the File Setup window and enter the 32° azimuth for both the shots and receivers.
Cable Feather QC
Using the same Basemap, you can generate a quick QC display showing the cable feather for individual shots or for entire shot lines.
1. Click on the Highlight Contributors to Trace Domain icon and then follow the mouse button helps to display the cables for the shots.
2. Click MB1 on any shot and its cable will highlight.
3. Click MB2 near a shot and all shots on the same shot line will highlight.
4. Click Shift-MB2 to clear the screen.
5. Repeat as desired.
6. Exit from the XYGraph by selecting File Exit Confirm.
2. In the 3D Binning and QC window, select Assign midpoints by: Existing index number mappings in the TRC and click OK.
Note:
You are assigning midpoints based on existing index number mappings in the TRC. You loaded the x,y positions for each shot, and for each receiver of each shot, from the UKOOA file.
3. Click the Binning checkbox, select Bin midpoints for Binning type, and click OK to open the bin definition window.
Grid Constants
4. Give the grid a name and set the following grid constants: Grid Azimuth = 32, dx = 50, dy = 12.5.
5. Click Calc Dim. A grid is automatically calculated to include all midpoints. The origin of the grid is computed, as well as the extents of the grid parallel and perpendicular to the azimuth.
Calculated Values
6. Click OK to dismiss the notification window.
7. Click Save to save the grid.
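The geometry behind a Calc Dim-style computation can be sketched as follows: rotate the midpoints into a coordinate frame aligned with the grid azimuth, then take the bounding box. This is a hypothetical illustration of the idea, not the actual ProMAX code, and the rotation convention is an assumption.

```python
import math

def calc_dim(midpoints, azimuth_deg):
    """Rotate (x, y) midpoints into a frame aligned with the grid azimuth
    and return the origin and extents of the enclosing rectangle.
    Sketch only; sign conventions are assumed."""
    a = math.radians(azimuth_deg)
    rot = [(x * math.cos(a) + y * math.sin(a),
            -x * math.sin(a) + y * math.cos(a)) for x, y in midpoints]
    us = [u for u, _ in rot]
    vs = [v for _, v in rot]
    origin = (min(us), min(vs))
    extent = (max(us) - min(us), max(vs) - min(vs))
    return origin, extent
```

With the origin and extents in hand, the grid is simply divided into dx by dy cells along the rotated axes.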
QC the Calculated Grid
1. Select Define binning grid from the main binning window and click OK.
This will bring up an XY Graph display window.
2. Select Display Midpoint Control Points Black (depending on the color of the background).
Subsurface Point Scatter 3. Select Grid Open and the grid name that you saved from the Calc Dim operation.
Overlay the Grid and the Shot Locations 4. You may elect to display shot locations by selecting Display Source Control Points White.
Interactive Grid QC and Alteration
1. You may alter the grid using any of the interactive grid editing icons if desired. You may choose to have one subsurface line for each surface sail line. In this case, you may elect to turn off the midpoint and shot plots and redisplay the shots only in black:
Views Remove Shot based Posting of Position
Views Remove Midpoint based Posting of Position
Display Source Control Points Black
Grid Display
2. Next, zoom in on an area and reposition the proposed CDP binning grid so that the centers of the grid cells follow the sail lines and the shot locations coincide with CDP bin centers.
One CDP Line per Sail Line 3. It may help to delete one inline from the calculated grid and then adjust the inline extents by redisplaying the Midpoint Control Points (Display Midpoint Control Points Black).
4. You may end up with a final CDP bin grid similar to that shown in the following diagram:
5. When satisfied with the CDP grid, make sure that you save it before exiting from the XYgraph. Select Grid Save to, enter a new grid name in the dialog box, and click OK.
Load Final Grid and Perform CDP Binning
1. Return to the 3D Marine Midpoint Binning window and click Load to apply the final grid parameters to this menu.
2. Select the bin space name that was saved in the XYgraph session.
3. Click OK.
Critical Parameters During CDP Binning Even if you know that you are going to run the Flex Binning processes prior to Velocity Analysis, Stack, DMO and Migration, it is very important to get the conventional CDP binning and Offset Binning Parameters correct.
The CDP Binning parameters, even after Flex Binning, still control how many inlines and crosslines exist for the project. The often overlooked parameters pertaining to offset binning are extremely important in the case of Flex Binning. In the Flex Binning assignment, one of the most critical parameters is the number of traces per offset bin that should contribute to each CDP. Typically, this will be set to 1 in order to stabilize the offset contribution to each CDP. If the original offset binning is not done correctly, there is no way to stabilize the Flex Binning output. The goal of offset binning is to achieve one trace per CDP per offset bin, the same requirement as for DMO processing. For a typical marine case you would specify the offset bin increment as twice the shot interval. In this case the shot interval and the group interval are the same at 25 meters, which means an offset bin width of 50 meters.
[Figure: Offset binning geometry — 25 m group interval; near offset = 207 m, next offset = 232 m, maximum offset = 3182 m; first bin center = 219.5 m; minimum offset to bin = 194.5 m; maximum offset to bin > 3219.5 m (use 3300); offset bin increment = 50 m]
CDP Binning Parameters for Marine 3D
In this case we will use offset bins with bin centers at 50 meter increments, a near offset bin center at 219.5 meters, and a far offset of 3300 meters. You can use a display from the database to QC these parameters after the final binning step. If you plot a 3D XYgraph from the TRC order, with OFFSET in X, CDP in Y, and color coded by OFB, you can see the offset distributions on the CDP gathers. After some selective zooming you can overlay the proposed offset binning grid for QC. You may also find that the contrast.rgb color table in the $PROMAX_HOME/port/misc directory is useful.
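The offset bin arithmetic can be sketched with the tutorial's marine parameters (first bin center 219.5 m, 50 m increment): each trace falls in the bin whose center is nearest its offset. This is a hypothetical helper for illustration, not a ProMAX function.

```python
import math

def offset_bin(offset, first_center=219.5, inc=50.0):
    """Return the (index, center) of the offset bin nearest to `offset`.
    Uses floor(x + 0.5) for deterministic nearest-bin rounding.
    Parameters default to this tutorial's marine survey values."""
    index = int(math.floor((offset - first_center) / inc + 0.5))
    center = first_center + index * inc
    return index, center

# The near trace (207 m) and next trace (232 m) both land in bin 0;
# the far trace (3182 m) lands near the last bin.
```

Because successive shot-to-receiver offsets step by the 25 m group interval, a 50 m bin width accepts one trace per CDP per bin in the ideal case, as the text requires.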
Zoom of Offset vs CDP plot with Offset Bins Overlay
You will also notice on this plot that in some areas there are duplicate offsets at given CDPs, making it impossible to reach the goal of 1 trace per CDP per bin.
4. Make sure that Inlines parallel to grid Y axis is selected.
5. Click Apply.
After the Binning is complete, click Cancel on the 3D Marine Midpoint Binning window.
Receiver Binning
This step is optional. It is only required if you intend to run surface consistent processing such as surface consistent deconvolution or residual statics.
1. Click the checkbox for Binning, select Bin the receivers and click OK.
2. Load the information from the CDP grid as a starting point. 3. Change the Y dimension of the receiver grid to match the group interval of 25 meters. 4. Enter a name for the Bin space such as receiver grid 32 degrees 50 by 25. 5. Automatically compute the extents of the receiver grid by clicking Calc Dim. 6. Perform the binning by clicking Apply. A Binning receiver locations.... window appears. 7. Select Cancel on the 3D Marine Receiver Binning window when the binning is complete.
QC the CDP Binned Data using a Fold Plot 1. Select QC Midpoint Bin data, Coordinate Space Fold display, then select your bin grid from QC Bin Space list and click OK.
2. When complete, click Cancel on the 3D Binning and QC window. 3. Exit the Spreadsheet window by selecting File Exit.
Inline and Crossline Overlap Factors
The Inline and Crossline overlap factors define the total search radii for all traces that may contribute to an output CDP. The input values are in numbers of original CDP bins.
[Figure: Inline and Crossline Overlap Factors, shown in the inline and crossline directions]
Typical values for these parameters would be 1 in the inline direction and somewhere between 2 and 5 in the crossline direction. For our case we will run with 3 in the crossline direction, which means that we will search the current line and 1 line on each side for contributors.
3. We will allow 1 trace per offset bin.
4. Allow traces with zero weight to contribute. This means that if, for some reason, the only trace available for a particular offset bin in a CDP has a weight of zero, it is used anyway.
5. Do NOT limit the Flex Binning to a subset of the survey. Process the entire project.
Inline and Crossline distance Weighting 6. Parameterize the Crossline and Inline weighting. Typically, these weighting functions will be offset variant. We will generally want to keep the near traces from the original stack track but we will generally need to change the weight function parameters for the far offsets.
[Figure: Offset-variant distance weighting functions —
XLine --> offset: distance - weights 0: 0-1, 53-0 / 3000: 0-1, 75-0
Inline --> offset: distance - weights 0: 0-1, 6.25-0]
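The distance-weight parameter pairs can be read as: weight 1 at distance 0, falling linearly to 0 at a cutoff distance that itself grows with offset (53 m at zero offset, 75 m at 3000 m, for the crossline function). A sketch of that interpretation, with the form of the interpolation assumed:

```python
def flex_weight(distance, offset, near=(0.0, 53.0), far=(3000.0, 75.0)):
    """Offset-variant linear distance weight, sketched from the tutorial's
    crossline values. `near` and `far` are (offset, zero-weight distance)
    pairs; the cutoff is interpolated linearly in offset and clamped."""
    o0, d0 = near
    o1, d1 = far
    frac = min(max((offset - o0) / (o1 - o0), 0.0), 1.0)
    cutoff = d0 + frac * (d1 - d0)
    return max(0.0, 1.0 - distance / cutoff)
```

This captures the intent described in the text: near traces are held close to their original bin, while the far offsets are allowed to borrow from a wider crossline range.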
Azimuth Weighting 7. Parameterize the Azimuth weighting. Again you may elect to vary the azimuth weights as a function of offset where you would weight the traces with the prime sail line
azimuth higher than others. You may find that you will have to open the weight range for the far offsets where feathering is greater.
[Figure: Azimuth weighting schematic — pass weight inside the pass azimuth range, reject weight outside; pass weight if Reciprocal Traces = Yes, otherwise reject weight; linearly tapered weight at the edges if Reciprocal Traces = Yes, otherwise reject weight]
For our example you may use different weighting schemes for different offset-azimuth pairs:
0:30,34,1.0,0.0,1 / 3000:22,42,1.0,0.0,3
This gives us +/- 2 degrees at the near offsets and +/- 10 degrees at the far offsets. Note also that we are increasing the taper length from 1 to 3 degrees as the offset increases.
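The pass/reject/taper behavior of one azimuth weighting rule can be sketched as follows. The parameter order follows the tutorial string (low azimuth, high azimuth, pass weight, reject weight, taper in degrees); how the taper is placed relative to the pass range is an assumption on my part.

```python
def azimuth_weight(az, lo, hi, pass_w=1.0, reject_w=0.0, taper=1.0):
    """Weight for a shot-receiver azimuth `az` (degrees): full pass weight
    inside [lo + taper, hi - taper], reject weight outside [lo, hi], and a
    linear ramp across each edge. Sketch of the schematic's rule."""
    if az <= lo or az >= hi:
        return reject_w
    if lo + taper <= az <= hi - taper:
        return pass_w
    edge = min(az - lo, hi - az)  # distance into the taper zone
    return reject_w + (pass_w - reject_w) * (edge / taper)

# Near-offset rule from the example: 30,34,1.0,0.0,1
w = azimuth_weight(32.0, 30.0, 34.0)
```

With the far-offset rule (22,42,1.0,0.0,3), traces feathered well away from the 32 degree sail azimuth still receive partial weight, matching the advice to open the range where feathering is greater.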
In the marine case we may elect to weight traces by a single sail line. In this case, all of the traces that contribute to the CDP line are examined by their S_line trace header word. The sail line with the highest number of contributors is the prime sail line, and traces from this sail line have the highest weight. You may elect to apply an offset variant weight function based on the dominant sail line represented in the traces. In this example we will use an offset variant weighting function that weights the sail line highly for the near offsets and relaxes the weighting toward the far offsets:
0: 1.0, 0.1 / 3000: 1.0, 0.9
9. Set the number of user-defined rules to NONE. You will have to cycle from ONE through FIVE back to NONE.
10. Use the default of No so as not to request a verbose printout.
11. Execute the flow.
QC Plots
The Assign CDP Flex Binning process writes a number of values to the CDP Order Parameter (database) Files. You may elect to generate QC plots using either DBTools (the default) or XDB Database Display. Try some from each and decide which works best for you. In DBTools, simply double-click the attribute to generate the display. XDB Database Display allows you to view more than one attribute on the same display (Database XDB Database Display, and then Database Get). The values that are available for QC are: CDP: GEOMETRY: FLEXFOLD
The number of traces contributing to each CDP after flex binning. One of the goals of flex binning is to provide uniform fold. The uniformity of this value indicates how well the flex binning worked. CDP: FLXBINQC: MINOFF CDP: FLXBINQC: MAXOFF
If the short or long offsets are missing from a flex binned CDP, these values can be too high or too low, respectively. CDP: FLXBINQC: MEANOFF CDP: FLXBINQC: RMEANOFF
These values can show if long or short offsets are missing. At CDPs where the values are high, short offsets are missing. If the values are low, long offsets are missing. The expected mean offset is half the sum of the first offset bin center and the last offset bin center. Ideally, RMEANOFF, the ratio of the mean to the expected mean, should be 1. In our case we specified 50 as our near offset and 3225 as our far offset. This yields a predicted mean of 1725 meters. CDP: FLXBINQC: STDDOFF CDP: FLXBINQC: RSTDDOFF
These plots illustrate missing offsets if they occur without changing the mean offset. The ratio of the standard deviation to the expected standard deviation will be 1 for an even offset distribution. If the ratio is less than 1, short and/or long offsets are missing. If it is greater than 1, middle offsets are missing.
Indicates how well the Apply azimuth weighting rule in Flex Binning worked.
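The mean-offset QC described above can be sketched in a few lines. This is a hypothetical helper mimicking the FLXBINQC MEANOFF/RMEANOFF definitions, not ProMAX code: the expected mean is half the sum of the first and last offset bin centers, and RMEANOFF is the observed mean divided by that expectation.

```python
def offset_qc(offsets, first_center, last_center):
    """Return (mean offset, RMEANOFF-style ratio) for one CDP's offsets.
    A ratio of 1.0 indicates a well-balanced offset distribution; below 1
    suggests missing long offsets, above 1 missing short offsets."""
    expected = 0.5 * (first_center + last_center)
    mean = sum(offsets) / len(offsets)
    return mean, mean / expected
```

A CDP holding only short offsets, for example, returns a ratio well below 1, which is exactly the pattern the text says to look for on the RMEANOFF plot.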
Produce QC plots from the database
1. Open the Database and use DBTools to generate the following pairs of displays.
Fold:
View 2D Matrix CDP order: Xcoord, Ycoord, FOLD, CDP (or use the predefined plot)
View 2D Matrix CDP order: Xcoord, Ycoord, FLEXFOLD, CDP
Offsets View 2D Matrix CDP order: Xcoord, Ycoord, MINOFF, CDP View 2D Matrix CDP order: Xcoord, Ycoord, MAXOFF, CDP
Mean Offsets View 2D Matrix CDP order: Xcoord, Ycoord, MEANOFF, CDP View 2D Matrix CDP order: Xcoord, Ycoord, RMEANOFF, CDP
Standard Deviation of Offsets View 2D Matrix CDP order: Xcoord, Ycoord, STDDOFF, CDP View 2D Matrix CDP order: Xcoord, Ycoord, RSTDDOFF, CDP
Notice that by using the recommended parameters we have done a reasonable job of stabilizing the fold and offset distributions for all the CDPs.
CDP Contribution and Null QC
There is one more set of QC plots that might be useful. We already know that we have good offset distribution and fold, but we don't know how many traces we have used more than once and how many we have thrown away.
2. Using XDB Database Display, generate 2D (simple) plots of the FLEXCDP#1, #2 and #3 attributes from the Trace database. Any trace that is NULL for FLEXCDP#1 did not contribute to any CDP. Traces with non-NULL values in #2 and #3 contributed to more than one CDP.
We do not have the trace data for this example so we cannot run Expand Flex Binning.
Chapter 18
[Basemap annotations: shots 50001-50153; receivers 8001-8154]
Note the numbering sequence as described on the basemap. In this case we have single station numbers that can be divided into line and station numbers. We will choose this option in order to make the pattern management easier. For the cable stations we will divide the 1001 - 1154
stations into line 1 stations 1 - 154. The remaining receiver lines will be handled similarly. We will also divide the shot stations 10001-10153 into line 10 stations 1 - 153. The remaining shot lines will be handled similarly as well. This exercise will present a couple of logistical problems in how to handle survey files and also how to roll the spread on and off at the end of the swaths using a single pattern definition per swath.
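The line/station split described above peels the thousands digits off the combined station number, e.g. receiver 1154 becomes line 1, station 154, and shot 10153 becomes line 10, station 153. A one-line sketch of that arithmetic (a hypothetical helper, not a ProMAX function):

```python
def split_station(full_station):
    """Split a combined station number into (line, station) by dividing
    out the thousands: 1154 -> (1, 154), 10153 -> (10, 153)."""
    return divmod(full_station, 1000)
```

This is why choosing the line/station option makes pattern management easier: the per-line station numbers restart at 1 for every cable and shot line.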
Prepare the Line and run the Spreadsheet
1. Make a new line, called 3D Land 5 swath zig-zag.
2. Build the following flow:
5. Click OK.
Receivers Spreadsheet
1. Open the Receivers Spreadsheet.
2. Use the File Import pulldown to open the ASCII file import window.
3. Use the File Open pulldown menu to select the 5swath_recs file from the directory that your instructor describes for you.
4. Using the Format pulldown, enter a new columnar format description such as 5 swath receivers.
5. In the Column Import Definition window, click Station in the Parameter column and then paint the 3 columns that we will use as the station numbers.
Note: We are splitting the station number into two numbers, one for the line and the remaining for the station along the line.
6. Select the remaining column definitions for Line, and the X and Y coordinates.
Filter the file and remove unwanted comment cards
7. Click Filter and then respond to the window asking you to delete any card that does not match the columnar format that was defined.
8. When the filtering is complete you should get a Filtering Complete window, and there should be a card at the top of the file that says Ignore Record for Import.
9. Click OK to dismiss the window. 10. Select Apply the format and Overwrite the values with the new import values.
11. Click OK and the receivers spreadsheet should become populated with the selected values.
Sources Spreadsheet
1. Open the Sources Spreadsheet by clicking Sources in the main spreadsheet menu window.
2. Select the File Import and File Open pulldown menus and select the 5swath_shots file from the same directory where you found the receivers file.
3. Using the Format pulldown, enter a new columnar format description such as 5 swath shots.
4. In the Column Import Definition window, click Source in the parameter column and then select all the columns used to store the source numbers.
Note: We are splitting the station number into two numbers, one for the line and the remaining for the station along the line.
5. Click Line in the parameter column and then select the first two columns to be used as the line number (include a blank before the first column).
6. Complete the column definitions for the Station and X and Y coordinates.
7. Click Filter to ignore any unwanted cards.
8. Select Apply the format and Overwrite the values in the database. The Sources spreadsheet should now be populated with the selected information.
9. Generate a Basemap from either the Sources or Receivers spreadsheet using the View View All Basemap pulldown menu.
10. Use the Cross Domain Contribution (Double Fold) icon MB3 function to measure the Azimuth of the cable lines. You should measure approximately 25 degrees.
Click Setup from the main menu and input 25 degrees for the shot and receiver azimuth.
Note: In this mode the assignment mode is set to use the method of matching pattern number in the SIN and PAT spreadsheets. This is correct since we did not import patterns and did not run an extraction. In this case we will have to specify patterns in the Patterns Spreadsheet.
Patterns Spreadsheet As an example we will define a pattern that is typical for a swath shooting geometry. We will define a basic bi-symmetric split geometry where we will have for any given shot 4 live cables and 60 traces on each cable. The shots will be between the center two cables and between traces 30 and 31 on each cable. There will be no gap in the split spread. 1. Open the Patterns Spreadsheet by clicking Patterns in the main spreadsheet menu window. Two windows will appear.
2. In the smaller window, specify that there are a maximum of 240 traces per shot and that the number of traces per shot varies.
3. Click OK in the smaller window to dismiss it.
4. We can now specify the first pattern. Since we are using a line/station relationship, we will need a separate pattern for each swath.
For the first pattern, mark a block of 4 cards and then fill the columns as shown in the next diagram.
Fill Pattern starting at 1 and increment of 1
Fill Chan From starting at 1 and increment by 60
Fill Chan To starting at 60 and increment by 60
Fill Chan Inc starting at 1 and increment by 0
Fill Rcvr Line starting at 1 and increment by 1
Fill Rcvr From starting at 1 and increment by 0
Fill Rcvr To starting at 60 and increment by 0
Fill Rcvr Inc starting at 1 and increment by 0
5. Copy this pattern 4 times (one time for each remaining swath) and then change the cable numbers to match the pattern numbers on a per swath basis. Use the Edit Copy pull down to copy the pattern to a new one.
6. Exit from the Patterns Spreadsheet selecting File Exit from the pulldown.
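The Fill operations for one swath's pattern cards can be sketched programmatically. This is a hypothetical helper, not part of ProMAX; it holds the pattern number fixed across the four cards of one swath (an assumption) while the channel and receiver-line numbers follow the fill increments: channels run continuously 1-240 across four 60-channel cables, each spanning receiver stations 1-60.

```python
def swath_pattern(pattern_no, first_cable=1):
    """Generate the four pattern cards for one swath: 4 live cables of 60
    channels each. Later swaths use the same shape with different cable
    (receiver line) numbers, per the tutorial's copy-and-edit step."""
    cards = []
    for i in range(4):
        cards.append({
            "pattern": pattern_no,
            "chan_from": 1 + 60 * i,
            "chan_to": 60 * (i + 1),
            "chan_inc": 1,
            "rcvr_line": first_cable + i,
            "rcvr_from": 1,
            "rcvr_to": 60,
            "rcvr_inc": 1,
        })
    return cards
```

Generating the cards this way makes the regularity obvious: only the channel block and the cable number change from card to card, which is exactly what the spreadsheet Fill increments encode.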
Complete the Sources Spreadsheet
Now that the patterns have been defined, we can assign each shot to use the appropriate pattern and then SHIFT the pattern for each shot.
1. Return to the Sources Spreadsheet.
2. You may elect to reorder the columns of the spreadsheet so that the pattern and pattern shift cards appear near the Line and Station columns for convenience. Use the Setup Order pulldown and then click on the column headers in the order you want them to appear. Use MB2 on the last column heading of interest.
3. For all of the shots in the first swath (on line 10), we will use pattern number 1, shift the pattern by -29 for the first shot, and increment the pattern shift by 1 for each shot.
4. Complete the pattern number and pattern shift entries for all shots in all 5 swaths using multiple Find and Fill operations.
5. When complete, exit from the Sources spreadsheet using the File Exit pulldown.
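The rolling-spread arithmetic of step 3 can be sketched as a tiny helper (hypothetical, for illustration only): starting at -29 for the first shot and incrementing by 1 per shot slides the fixed split-spread pattern along the cables as the source advances.

```python
def pattern_shift(shot_index, first_shift=-29):
    """Pattern shift for the n-th shot of a swath (0-based): -29 for the
    first shot, incrementing by 1 per shot so the live spread rolls
    along the receiver lines with the source."""
    return first_shift + shot_index
```

By the 30th shot the shift reaches 0, i.e. the spread is centered on its nominal position, and it continues rolling off the far end of the swath in the same fashion.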
Trace Assignment This exercise illustrates CDP binning procedures. For this example we will automatically compute a CDP grid based on some initial known values and then apply the grid using the batch CDP Binning* process.
1. In the main menu, click Bin. A submenu appears with options for assigning the traces to midpoints, defining the bin grid, binning the data, quality controlling the binning, and finalizing the database.
2. Select to Assign midpoints by Matching pattern number in the SIN and PAT spreadsheets, and click Ok. In this case the Assignment step is performing the following calculations: Computes the SIN and SRF for each trace and populates the TRC OPF Computes the Shot to Receiver Offset (Distance) Computes the Midpoint coordinate between the shot and receiver. Computes the Shot to Receiver Azimuth.
An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click Proceed.
A number of progress windows will flash on the screen as this step runs. A final Status window should notify you that you Successfully completed geometry assignment. Click Ok. If this step fails, you have an error in your spreadsheets somewhere. Not much help is given to you, but the problems are usually related to the spread and/or pattern definitions.
Spread QC after Trace Assignment
1. Open the Receiver Spreadsheet and generate a basemap using the View View All Basemap pulldown menu.
2. Use the Cross Domain Contribution (Double Fold) icon MB1 and MB2 functions to view which receivers have been defined to be live for each shot, and also to see which shots contribute to each receiver. You should observe a symmetric split spread of four cables that rolls on and off the spread at the ends of the swath.
3. Exit from the XYgraph and the Spreadsheet using the File Exit Confirm and File Abort pulldown menus, respectively.
Automatic Bin Calculation and QC 1. Select Bin midpoints and click Ok. You should get the following window:
2. Set Azimuth=25, Grid Size in X=55, Grid Size in Y=55, a Bin Space Name, Minimum Offset to Bin=0.0, and Offset Bin Increment=110, and select Inlines to be parallel to grid Y axis, which is parallel to the defined azimuth. In our case, this is parallel to the cable.
3. Click Calc Dim. The Calc Dim operation computes the origin of the grid and the Maximum X and Y dimensions.
4. Save the grid definition by clicking Save.
5. Click Cancel to exit this window.
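Conceptually, binning rotates each midpoint into the grid's local coordinate frame and divides by the bin dimensions to get inline and crossline indices. The sketch below illustrates this with the parameters set above (55 x 55 bins); the rotation sign convention and function names are illustrative assumptions, not the ProMAX implementation:

```python
import math

def bin_indices(mx, my, ox, oy, azimuth_deg, dx=55.0, dy=55.0):
    """Sketch: map a midpoint (mx, my) into inline/crossline bin
    indices for a grid with origin (ox, oy) rotated by azimuth_deg.
    Inlines run parallel to the grid Y axis, matching the menu
    selection above. Rotation convention is an assumption."""
    a = math.radians(azimuth_deg)
    # rotate the midpoint into grid-aligned coordinates
    gx = (mx - ox) * math.cos(a) + (my - oy) * math.sin(a)
    gy = -(mx - ox) * math.sin(a) + (my - oy) * math.cos(a)
    crossline = int(gx // dx)  # bin count along grid X
    inline = int(gy // dy)     # bin count along grid Y (along an inline)
    return inline, crossline
```

The Calc Dim step effectively determines the origin (ox, oy) and the grid extents so that every midpoint lands in a valid bin.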
QC the Calculated Grid
1. Select Define binning grid from the main binning window and click Ok.
This will bring up a small map window.
2. Select Display Midpoint Control Points Black (or another color, depending on the color of the background).
Mid-point Scattergram for CDP Binning
3. Click Grid Open and select the grid name that you saved from the Calc Dim operation. This step overlays the bin grid on your subsurface data.
Build Geometry from SPS files for Land 3D
1. Make a new line called 3D Land SPS input example.
2. Build the following flow:
6. From the Format pulldown menu, open a list of saved formats and choose STANDARD SHELL SPS Land 3D.
7. Check the column definitions by clicking on the words in the Parameter column. Notice that there are two column definitions: one for the S and R cards, and one for the X cards.
Also note that, if desired, the coordinates can be altered using the Math OP and Op Value columns.
8. Click Apply and then select Overwrite all the data. Click OK.
While the import is running, you will see a variety of Status windows. Eventually you will see a Successfully Completed window.
There are still two more files to read. We have read the R file but still need to read the S and X files.
9. Use the File Open pulldown menu from the UKOOA File Import window and select the sps.s file.
10. Click Apply and select Overwrite all the data. Click OK.
11. Use the File Open pulldown menu from the UKOOA File Import window to access the sps.x file.
12. Click Apply and select Overwrite all the data. Click OK.
13. Quit from each of the column definition windows and select File Exit from the main import window.
Setting Project Constants
1. From the main menu, click Setup and input the following information:
- 50 m receiver station interval
- 60 m source station interval
- 360 m crossline separation
The source stations are not based on the receiver station numbers. These data were recorded using a surface source, and the measurement system is metric. We will measure the azimuths on a basemap generated from the Receivers Spreadsheet.
Note: The Assignment mode is set to the third option, Matching line and station numbers in the SIN and PAT spreadsheets. This mode is generally reserved for SPS input, where every shot gets a separate pattern defined for it.
2. Click Ok.
3. Generate a basemap of the project by opening the Receivers Spreadsheet and selecting View View All Basemap.
Determine Primary Azimuth for Binning 4. Use the Double Fold icon to measure the azimuth of the receiver lines.
5. Click Setup and enter the 97.5° azimuth for both the shots and receivers.
6. Exit from the Setup window by clicking OK.
7. Exit from the Receivers Spreadsheet by selecting File Exit.
8. Exit from the XYgraph using File Exit Confirm.
Trace Assignment This exercise illustrates CDP binning procedures. For this example we will automatically compute a CDP grid based on some initial known values and then apply the grid using the batch CDP Binning* process.
1. In the main menu, click Bin. A submenu appears with options for Assigning the traces to midpoints, defining the bin grid, binning the data, quality controlling the binning, and finalizing the database.
2. Select to Assign midpoints by using Matching line and station numbers in the SIN and PAT spreadsheets, and click Ok. In this case the Assignment step performs the following calculations:
- Computes the SIN and SRF for each trace and populates the TRC OPF
- Computes the Shot to Receiver Offset (Distance)
- Computes the Midpoint coordinate between the shot and receiver
- Computes the Shot to Receiver Azimuth
An Assignment Warning window will pop up warning that some or all of the data in the Trace spreadsheet will be overwritten. Click Proceed.
A number of progress windows will flash on the screen as this step runs. A final Status window should notify you that you Successfully completed geometry assignment. Click Ok. If this step fails, you have an error in your spreadsheets somewhere. Not much help is given to you, but the problems are usually related to the spread and/or pattern definitions.
Spread QC after Trace Assignment
1. Open the Receiver Spreadsheet and generate a basemap using the View View All Basemap pulldown menu.
2. Use the Cross Domain Contribution (Double Fold) icon MB1 and MB2 functions to view which receivers have been defined to be live for each shot, and also to see which shots contribute to each receiver. You should observe a symmetric split spread of four cables that rolls on and off the spread at the ends of the swath.
3. Exit from the XYgraph and the Spreadsheet using the File Exit Confirm and File Abort pulldown menus, respectively.
Automatic Bin Calculation and QC 1. Select Bin midpoints and click Ok. You should get the following window:
2. Set Azimuth=97.5, Grid Size in X=30, Grid Size in Y=25, a Bin Space Name, and Offset Bin Increment=50, and select Inlines to be parallel to grid Y axis, which is parallel to the defined azimuth. In our case, this is parallel to the cable.
3. Click Calc Dim. The Calc Dim operation computes the origin of the grid and the Maximum X and Y dimensions.
QC the Calculated Grid
1. Select Define binning grid from the main binning window and click Ok.
This will bring up a small map window.
2. Select Display Midpoint Control Points Black (or another color, depending on the color of the background).
Mid-point Scattergram for CDP Binning
3. Select Grid Open and the grid name that you saved from the Calc Dim operation. This step overlays the bin grid on your subsurface data. Because of the density of the display, a zoom will help show and QC the results. You may elect to alter the grid using any of the interactive grid editing icons if desired. (There should be no need to alter the grid.)
4. Exit the XYGraph by selecting File Exit Confirm.
5. Close the 3D Binning and QC window by clicking Cancel.
6. Select File Exit from the main spreadsheet menu to exit the Geometry Spreadsheet.
Complete CDP Binning using Batch CDP Binning
This exercise completes the CDP binning and database finalization steps.
1. Build and execute the following flow:
CDP Binning*
Binned Space Name ------- your grid
This process will perform the CDP Binning and Finalization steps in a batch job instead of interactively using the spreadsheet.
2. Once the binning is complete, you can generate the QC plots using the database display tools.