
OPPORTUNITIES FOR SMART DIRECTIONS

Recall the fitting goals (10)  
\begin{displaymath}
\begin{array}{llllllcl}
\bold 0 &\approx& \bold r_d &=& \bold F \bold m - \bold d &=& \bold F \bold A^{-1} & \bold p - \bold d \\
\bold 0 &\approx& \bold r_m &=& \bold A \bold m &=& \bold I & \bold p
\end{array}\end{displaymath} (10)
Without preconditioning we have the search direction
\begin{displaymath}
\Delta \bold m_{\rm bad} \quad =\quad
\left[
\begin{array}{cc}
\bold F' & \bold A'
\end{array}\right]
\left[
\begin{array}{c}
\bold r_d \\ \bold r_m
\end{array}\right]
\end{displaymath} (11)
and with preconditioning we have the search direction
\begin{displaymath}
\Delta \bold p_{\rm good} \quad =\quad
\left[
\begin{array}{cc}
(\bold A^{-1})' \bold F' & \bold I
\end{array}\right]
\left[
\begin{array}{c}
\bold r_d \\ \bold r_m
\end{array}\right]
\end{displaymath} (12)
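As a concrete check on equations (11) and (12), the following is a minimal NumPy sketch (not part of the original text; the matrix sizes and variable names are illustrative assumptions) that builds a random modeling operator $\bold F$ and an invertible $\bold A$, forms the residuals of the fitting goals (10), and evaluates both search directions.
\begin{verbatim}
import numpy as np

rng = np.random.default_rng(0)
nd, nm = 6, 4                      # data and model sizes (arbitrary)
F = rng.standard_normal((nd, nm))  # modeling operator F
A = np.triu(rng.standard_normal((nm, nm))) + 5.0 * np.eye(nm)  # invertible A
d = rng.standard_normal(nd)        # observed data d
m = rng.standard_normal(nm)        # current model estimate m
p = A @ m                          # preconditioned variable, p = A m

# residuals of the fitting goals (10)
r_d = F @ m - d                    # data residual
r_m = A @ m                        # model residual; note r_m equals p

# search direction without preconditioning, equation (11)
dm_bad = F.T @ r_d + A.T @ r_m

# search direction with preconditioning, equation (12)
Ainv = np.linalg.inv(A)
dp_good = Ainv.T @ (F.T @ r_d) + r_m
\end{verbatim}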

The essential feature of preconditioning is not that we perform the iterative optimization in terms of the variable $\bold p$. The essential feature is that we use a search direction that is a gradient with respect to $\bold p'$, not $\bold m'$. Using $\bold A\bold m=\bold p$, we have $\bold A\,\Delta \bold m=\Delta \bold p$. This enables us to define a good search direction in model space.
\begin{displaymath}
\Delta \bold m_{\rm good} \quad =\quad \bold A^{-1} \Delta \bold p_{\rm good}
\quad =\quad \bold A^{-1} (\bold A^{-1})' \bold F' \bold r_d + \bold A^{-1} \bold r_m
\end{displaymath} (13)
Define the gradient by $\bold g=\bold F'\bold r_d$ and notice that $\bold r_m=\bold p$, so that $\bold A^{-1}\bold r_m=\bold m$.
\begin{displaymath}
\Delta \bold m_{\rm good} \quad =\quad
\bold A^{-1} (\bold A^{-1})' \ \bold g + \bold m
\end{displaymath} (14)
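Continuing the variables from the sketch after equation (12) (again an illustration, not the book's code), one can verify numerically that mapping $\Delta\bold p_{\rm good}$ back to model space with $\bold A^{-1}$, equation (13), produces the same vector as the form (14) built from $\bold g=\bold F'\bold r_d$ and the current model $\bold m$.
\begin{verbatim}
# equation (13): map the good p-space direction back to model space
dm_good = Ainv @ dp_good            # = A^{-1}(A^{-1})' F' r_d + A^{-1} r_m

# equation (14): same vector, written with g = F' r_d and A^{-1} r_m = m
g = F.T @ r_d
print(np.allclose(dm_good, Ainv @ Ainv.T @ g + m))   # True
\end{verbatim}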

The search direction (14) shows a positive-definite operator scaling the gradient. The components of a gradient vector are independent of one another; each independently points in a direction of descent, and each can obviously be scaled by any positive number. We have now found that we can also scale the gradient vector by a positive-definite matrix and still expect the conjugate-direction algorithm to descend, as always, to the ``exact'' answer in a finite number of steps. This is because modifying the search direction with $\bold A^{-1} (\bold A^{-1})'$ is equivalent to solving a conjugate-gradient problem in $\bold p$.
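To ground the descent claim: for any nonzero $\bold g$ we have $\bold g' \bold A^{-1}(\bold A^{-1})' \bold g = \vert(\bold A^{-1})'\bold g\vert^2 > 0$, so $\bold A^{-1}(\bold A^{-1})'$ is positive definite, and comparing (11) with (13) shows that $\Delta\bold m_{\rm good}$ is exactly $\Delta\bold m_{\rm bad}$ rescaled by this matrix. The continuation below of the earlier sketch (illustrative only) checks both facts numerically.
\begin{verbatim}
# M = A^{-1}(A^{-1})' is positive definite: its quadratic form is a squared norm
M = Ainv @ Ainv.T
print(np.allclose(g @ M @ g, np.linalg.norm(Ainv.T @ g) ** 2))   # True (and positive)
print(np.all(np.linalg.eigvalsh(M) > 0))                         # True: all eigenvalues positive

# the good model-space direction is the plain gradient (11) rescaled by M
print(np.allclose(dm_good, M @ dm_bad))                          # True
\end{verbatim}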

