Up to this point I have imposed only minimal constraints on the RMS velocity, namely those necessary to justify use of the convolutional model. Most velocity analysis imposes far more stringent constraints, either explicitly or implicitly, in the form of parsimonious parametrization or regularization. In the former case, the choice of parameters (e.g. how many spline nodes, where to place them) is ad hoc. In the latter, the type of regularization (first derivative, second derivative, ...) and the choice of penalty weight are also obscure.
In this section I suggest that the differential semblance objective itself supplies a mechanism for constraining the velocity to a parsimoniously parametrized space. I'll propose a choice of subspace within which the exact square slowness is the unique stationary point of J0.
Assume until further notice that the data is free of noise, so that it is exactly consistent with some RMS square slowness u*.
Now suppose that u* differs from a reference square slowness u0 (in practice, an initial estimate) by a member of a space W. Introduce an inner product on W based on second derivatives in t0, so that the associated operator is the fourth-derivative operator appearing in the Sturm-Liouville problem below.
Since the interval velocities, hence the RMS square slownesses, are supposed to vary over a bounded set of sufficiently smooth functions, membership in that set entails a bound on the W-norm of u - u0.
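To make the bookkeeping concrete, here is a minimal numerical sketch of a W-norm computation, assuming the second-derivative form of the inner product indicated above, namely <w1,w2>_W = \int (d^2 w1/dt0^2)(d^2 w2/dt0^2) dt0; the grid, the slowness profiles, and the name w_norm are illustrative choices, not definitions taken from this paper.

```python
# Illustrative only: a discrete version of the assumed W inner product,
# used to evaluate ||u - u0||_W for a trial perturbation.
import numpy as np

def w_norm(w, dt):
    """Discrete W-norm of a perturbation w sampled on a uniform t0 grid."""
    d2w = np.diff(w, n=2) / dt**2            # interior second differences
    return np.sqrt(np.sum(d2w**2) * dt)      # discrete L2 norm of d^2 w / dt0^2

t0 = np.linspace(0.0, 2.0, 201)              # t0 grid (made up for illustration)
dt = t0[1] - t0[0]
u0 = 0.25 + 0.05 * t0                        # reference square slowness (made up)
u = u0 + 0.01 * np.sin(np.pi * t0 / 2.0)     # u differs from u0 by a smooth member of W
print("||u - u0||_W =", w_norm(u - u0, dt))
```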
Let g(t01,t02) be the Green's function for the fourth-derivative operator defining the W inner product.
Next suppose that H[u] is uniformly positive definite for all u in u0 + W. That is, there exists a constant h_- > 0, independent of u, for which
\[
  \langle H[u]\,w,\, w \rangle \;\ge\; h_-\,\| w \|_W^2
  \qquad \text{for all } w \in W.
\]
Then a similar uniform lower bound holds for the Hessian of J0, since the latter differs from H[u] by a diagonal scaling operator with uniform upper and lower bounds over u0 + W. For the same reason, the derivative of J0 at any u in u0 + W, acting on u - u*, is bounded below by a positive multiple of the squared W-norm of u - u*.
That is: within u0+W, u* is the unique stationary point of J0.
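For the record, here is a minimal version of the argument, under my added assumption that the lower bound h_- carries over to the Hessian of J0 along segments in u0 + W:
\[
  \langle \nabla J_0[u],\, u - u^* \rangle
  \;=\; \int_0^1 \bigl\langle D^2 J_0[u^* + s\,(u - u^*)]\,(u - u^*),\, u - u^* \bigr\rangle\, ds
  \;\ge\; h_-\,\| u - u^* \|_W^2 ,
\]
where the equality uses $\nabla J_0[u^*] = 0$ for noise-free data; hence $\nabla J_0[u] = 0$ with u in u0 + W forces u = u*.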
Moreover, consulting the estimates of the last section, you see that if the search is limited to u0+W, then at a stationary point u those estimates control the W-norm of the error u - u*.
Finally, how does one lay hands on such a paragon of a function space as W with the properties supposed here? The operator H[u] is symmetric and positive semidefinite. An optimal choice for W is the direct sum of eigenspaces of H[u*] corresponding to the eigenvalues above the cutoff level h*. A computable estimate of this space is the corresponding direct sum of eigenspaces of H[u]. A basis consists of eigenfunctions of the Sturm-Liouville problem
\[
  H[u]\,w \;=\; \lambda\,\frac{d^4 w}{dt_0^4}
\]
(with boundary conditions matching the W inner product) whose eigenvalues $\lambda$ exceed the cutoff h*.
Note that if there is little data in a t0 interval, R[u] will be small in that interval and eigenfunctions of the 4th derivative operator will smoothly interpolate values to either side. Thus my suggested space implicitly ``picks events'' with significant energy, pins the RMS velocity down at those places, and interpolates between ``events'' - just as a human velocity analyst would.
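As a sketch of how this construction might be carried out numerically: the discretization below replaces H[u] by multiplication with a nonnegative data-energy weight r(t0) (a stand-in for the R[u] mentioned above), replaces the W inner product by a fourth-difference quadratic form with simple Dirichlet end conditions, and uses an arbitrary cutoff h*; all of these particulars are assumptions for illustration, not prescriptions from the text.

```python
# Illustrative discretization of the proposed construction of W.
import numpy as np
from scipy.linalg import eigh

n, T = 200, 2.0
t0 = np.linspace(0.0, T, n)
dt = t0[1] - t0[0]

# Second-difference matrix with Dirichlet ends; L @ L discretizes d^4/dt0^4,
# the operator assumed to define the W inner product.
L = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / dt**2
D4 = L @ L                              # symmetric positive definite with these BCs

# Stand-in for H[u]: large near two "events", nearly zero where there is little data.
r = 1e-3 + np.exp(-((t0 - 0.6) / 0.05) ** 2) + np.exp(-((t0 - 1.4) / 0.05) ** 2)
H = np.diag(r)

# Generalized (Sturm-Liouville type) eigenproblem  H w = lambda * D4 w.
lam, V = eigh(H, D4)                    # eigenvalues ascending, columns D4-orthonormal

# The computable estimate of W: span of eigenvectors with eigenvalue >= h*.
h_star = 1e-4                           # illustrative cutoff, not a recommended value
W_basis = V[:, lam >= h_star]
print(f"retained {W_basis.shape[1]} of {n} modes above the cutoff")
```

In regions where r is nearly zero the retained eigenfunctions satisfy d^4 w/dt0^4 approximately 0, i.e. they are locally close to cubics, which is the smooth interpolation between ``events'' described above.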
It remains to analyse this ``picking'' effect, and to devise good algorithms for choosing the eigenvalue cutoff as a function of data quality and success in fitting moveout (i.e. minimizing J0), so as to justify the assumption that u* - u0 actually lies in W. But that's another story...