Next: Real Data Results
Up: Applications to Data Regularization
Previous: Applications to Data Regularization
The frequency-wavenumber log-stretch PS-AMO operator works on
a regularly sampled cube. However, the data are usually recorded
on an irregular mesh. To overcome this obstacle, I first map
the data from the irregular mesh to a regular 5-D grid
with dimensions $(t, m_x, m_y, h_x, h_y)$.
For migration efficiency I reduce the dimensionality
of the dataset by creating a common-azimuth cube oriented along
the inline direction, that is, by eliminating the crossline-offset
($h_y$) axis.
I use a nearest-neighbor interpolation operator $\mathbf{L}$ to map the data from the irregular mesh onto the regular mesh.
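A minimal sketch of such a nearest-neighbor binning operator, assuming a NumPy implementation; the function and argument names are hypothetical, since the original operator's interface is not given in the text:

```python
import numpy as np

def nearest_neighbor_bin(coords, values, grid_origin, grid_spacing, grid_shape):
    """Map irregularly sampled traces onto a regular grid by
    nearest-neighbor assignment (a stand-in for the binning
    operator L described in the text)."""
    # Index of the nearest grid node for every irregular sample.
    idx = np.rint((coords - grid_origin) / grid_spacing).astype(int)
    # Keep only samples whose nearest node lies inside the grid.
    inside = np.all((idx >= 0) & (idx < grid_shape), axis=1)
    grid = np.zeros(grid_shape)
    fold = np.zeros(grid_shape)  # hit count per cell, useful for normalization
    for i, v in zip(idx[inside], values[inside]):
        grid[tuple(i)] += v
        fold[tuple(i)] += 1
    return grid, fold
```

Accumulating both the binned values and the fold allows a later normalization by hit count, which is one common way to tame acquisition-footprint amplitude bias.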
Then I use the PS-AMO operator $\mathbf{A}$ to transform data
from non-zero crossline offsets ($h_y \ne 0$) to
zero crossline offset ($h_y = 0$); I refer to this operation
as an operator $\mathbf{S}$, which is a summation over $h_y$. I allow some mixing
between inline offsets by expanding the summation to map all $h_x$ near a
target offset $a$ onto ($h_x = a$, $h_y = 0$),
as follows:

$$ \mathbf{S}\,d\,(a,\, h_y{=}0) \;=\; \sum_{h_y} \;\sum_{|h_x - a| \le \epsilon} \mathbf{A}\, d(h_x, h_y), \qquad (46) $$

where $\epsilon$ is small.
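The summation in equation (46) can be sketched as follows, assuming the mixing takes the form $|h_x - a| \le \epsilon$ and omitting the PS-AMO transform itself; the axis ordering and function name are illustrative only:

```python
import numpy as np

def stack_to_zero_hy(cube, hx_axis, a, eps):
    """Sum a regular cube over all crossline offsets h_y and over the
    inline offsets within eps of the target value a, producing one
    volume at (h_x=a, h_y=0).  The cube axes are assumed to be
    (h_x, h_y, t) purely for illustration; applying PS-AMO to each
    input panel before summing is omitted here."""
    mask = np.abs(hx_axis - a) <= eps   # inline offsets allowed to mix in
    return cube[mask].sum(axis=(0, 1))  # sum over selected h_x and all h_y
```

The boolean mask selects the band of inline offsets contributing to the output offset, so widening `eps` trades offset resolution for fold.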
I combine these two operators to estimate a 4-D model $\mathbf{m}$ (at $h_y = 0$) from
the 5-D irregular dataset $\mathbf{d}$ through the objective function

$$ 0 \;\approx\; \mathbf{d} - \mathbf{L}^T \mathbf{S}^T \mathbf{m}. \qquad (47) $$
In the context of least-squares inversion, a model regularization
term can be included. This regularization could be a derivative
operator $\mathbf{D}$ applied to data cubes that have been transformed to the same offset
using the PS-AMO operator. This regularization assumes that
the reflection amplitudes are a smooth function of the
reflection angle and azimuth. Therefore, the least-squares
problem (equation 47) becomes:

$$ \begin{aligned} 0 &\approx \mathbf{d} - \mathbf{L}^T \mathbf{S}^T \mathbf{m}, \\ 0 &\approx \lambda\, \mathbf{D}\, \mathbf{m}. \end{aligned} \qquad (48) $$
The adjoint solution for this inverse problem is

$$ \hat{\mathbf{m}} \;=\; \mathbf{S}\, \mathbf{L}\, \mathbf{d}. \qquad (49) $$
However, the adjoint solution is not ideal: the irregularity
of the data can lead to artificial amplitude artifacts.
A more accurate solution to this inverse problem is the
weighted adjoint solution.
This solution uses a diagonal operator $\mathbf{W}$,
a model-space weighting function, that
approximates the inverse Hessian of the least-squares
problem (Claerbout and Nichols, 1994; Rickett, 2001), such that

$$ \hat{\mathbf{m}} \;=\; \mathbf{W}\, \mathbf{S}\, \mathbf{L}\, \mathbf{d}, \qquad (50) $$

where the diagonal operator

$$ \mathbf{W} \;=\; \left[\operatorname{diag}\!\left(\mathbf{S}\, \mathbf{L}\, \mathbf{L}^T \mathbf{S}^T\right) + \epsilon\, \mathbf{I}\right]^{-1} \qquad (51) $$

approximates the inverse Hessian, and $\epsilon$ is a tiny value that avoids dividing
by zero.
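The weighted adjoint of equations (50)-(51) can be illustrated with a generic dense matrix standing in for the cascaded operator $\mathbf{L}^T\mathbf{S}^T$, which in practice would be applied matrix-free; the helper name is hypothetical:

```python
import numpy as np

def weighted_adjoint(F, d, eps=1e-6):
    """Approximate least-squares solution m ~= W F^T d, where the
    diagonal weight W is the inverse of diag(F^T F) + eps.
    F is a small dense matrix here purely for illustration."""
    hessian_diag = np.sum(F * F, axis=0)  # diag(F^T F) without forming F^T F
    w = 1.0 / (hessian_diag + eps)        # eps avoids division by zero
    return w * (F.T @ d)                  # weighted adjoint solution
```

Computing only the diagonal of the Hessian keeps the cost at a single extra pass over the operator, which is what makes this approximation attractive compared with a full iterative inversion.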
The solution of this inverse problem is not
feasible on a single computer. The computational
requirements are onerous, but potentially manageable.
However, the memory requirements are not.
A full regularized 5-D cube, created
after applying the nearest-neighbor mapping, can easily reach tens of gigabytes.
Data of this size make it almost impossible to practically implement any
3-D prestack seismic data-processing algorithm on a single machine.
Clapp (2004) introduces an efficient
Python library for handling parallel jobs.
The library makes it easy for the user to take
an already existing serial code and transform it
into a parallel code.
The library handles distribution, collection,
and node monitoring, commonly
onerous tasks in parallel processing.
The main prerequisites for using the Python library are
an efficient serial code and a description of
how the parallel job should be distributed on a cluster.
For this problem I chose to split along the $h_x$ axis.
I created a series of tasks, each assigned to produce a single
$(t, m_x, m_y)$ volume. Each task is passed
a range of $h_x$'s defined by equation 46. The resulting model
volumes are then recombined to form the regularized 4-D output space.
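The split-along-$h_x$ strategy can be sketched with Python's standard `multiprocessing` module as a stand-in for the SEP parallel library, whose interface is not shown in the text; `regularize_band` and its placeholder computation are hypothetical:

```python
from multiprocessing import Pool

def regularize_band(hx_range):
    """Hypothetical per-task worker: produce one model volume from the
    inline-offset band hx_range.  A real task would run the weighted
    adjoint over that band; a trivial reduction stands in for it here."""
    lo, hi = hx_range
    return sum(range(lo, hi))  # placeholder for the actual computation

def run_parallel(hx_bands, nproc=4):
    """Distribute the offset bands across processes and collect the
    results in band order, mimicking the split-along-h_x strategy."""
    with Pool(nproc) as pool:
        volumes = pool.map(regularize_band, hx_bands)
    return volumes  # recombined into the 4-D output space in practice
```

Because the bands are independent, no inter-process communication is needed until the final recombination, which is what makes this axis a convenient one to split on.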
Stanford Exploration Project
12/14/2006