
Backsolving, polynomial division and deconvolution

Ordinary differential equations often lead us to the backsolving operator. For example, the damped harmonic oscillator leads to a special case of equation (23) where $(a_3,a_4,\cdots)=0$. There is a huge literature on finite-difference solutions of ordinary differential equations that lead to equations of this type. Rather than derive such an equation on the basis of many possible physical arrangements, we can begin from the filter transformation $\bold B$ in (4) but put the matrix on the other side of the equation so our transformation can be called one of inversion or backsubstitution. Let us also force the matrix $\bold B$ to be a square matrix by truncating it with $\bold T = \left[ \bold I \quad \bold 0 \right]$, say $\bold A =\left[ \bold I \quad \bold 0 \right] \bold B = \bold T \bold B$. To link up with applications in later chapters, I specialize to 1's on the main diagonal and insert some bands of zeros.
 \begin{displaymath}
\bold A \bold y
\eq
\left[
\begin{array}{ccccccc}
 1   & 0   & 0   & 0   & 0   & 0   & 0 \\
 a_1 & 1   & 0   & 0   & 0   & 0   & 0 \\
 a_2 & a_1 & 1   & 0   & 0   & 0   & 0 \\
 0   & a_2 & a_1 & 1   & 0   & 0   & 0 \\
 0   & 0   & a_2 & a_1 & 1   & 0   & 0 \\
 a_5 & 0   & 0   & a_2 & a_1 & 1   & 0 \\
 0   & a_5 & 0   & 0   & a_2 & a_1 & 1
\end{array} \right]
\left[
\begin{array}{c}
 y_0 \\
 y_1 \\
 y_2 \\
 y_3 \\
 y_4 \\
 y_5 \\
 y_6
\end{array} \right]
\eq
\left[
\begin{array}{c}
 x_0 \\
 x_1 \\
 x_2 \\
 x_3 \\
 x_4 \\
 x_5 \\
 x_6
\end{array} \right]
\eq
\bold x\end{displaymath} (23)
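For readers who like to experiment, here is a minimal Python/NumPy sketch of building such a banded lower triangular matrix $\bold A$; the coefficient values and the helper code are illustrative assumptions, not part of the text.

import numpy as np

# Sketch of the square matrix A in equation (23): 1's on the main diagonal,
# bands a1 and a2 just below it, a band of zeros (a3 = a4 = 0), and then a5.
# The numerical values are placeholders chosen only for illustration.
n = 7
bands = {1: -0.5, 2: 0.1, 5: 0.02}      # a_tau for tau = 1, 2, 5; a3 = a4 = 0

A = np.eye(n)
for tau, a_tau in bands.items():
    A += a_tau * np.eye(n, k=-tau)      # add the subdiagonal band at offset tau

print(A)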
Algebraically, the operator in equation (23) goes under the various names, ``backsolving'', ``polynomial division'', and ``deconvolution''. The leaky integration transformation (19) is a simple example of backsolving when $a_1=-\rho$ and $a_2=a_5=0$. To confirm this, you need to verify that the matrices in (23) and (19) are mutually inverse.
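As a quick numerical check of that claim, here is a sketch assuming the leaky-integration matrix of (19) is the lower triangular matrix whose $(t,\tau)$ entry is $\rho^{t-\tau}$ for $t \ge \tau$:

import numpy as np

n, rho = 7, 0.9

# A of equation (23) with a1 = -rho and a2 = a5 = 0: a bidiagonal matrix.
A = np.eye(n) - rho * np.eye(n, k=-1)

# Leaky integration, assumed here to be the lower triangular matrix with
# entries rho**(t - tau) for t >= tau, as in (19).
L = np.array([[rho**(t - s) if t >= s else 0.0 for s in range(n)]
              for t in range(n)])

print(np.allclose(A @ L, np.eye(n)))    # True: the two matrices are mutually inverse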

A typical row in equation (23) says  
 \begin{displaymath}
x_t \eq y_t \ +\ \sum_{\tau > 0} \ a_\tau\ y_{t-\tau}\end{displaymath} (24)
Change the signs of all terms in equation (24) and move some terms to the opposite side  
 \begin{displaymath}
y_t \eq x_t \ -\ \sum_{\tau > 0} \ a_\tau\ y_{t-\tau}\end{displaymath} (25)
Equation (25) is a recursion to find $y_t$ from the values of $y$ at earlier times.
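A minimal sketch of recursion (25) in Python follows; the function name and argument layout are my own choices for illustration, not the module of the text.

import numpy as np

def backsolve(a, x):
    """Solve A y = x by recursion (25): y_t = x_t - sum_{tau > 0} a_tau * y_{t - tau}.
    Here a = (a_1, a_2, ...) are the subdiagonal filter coefficients; a_0 = 1."""
    x = np.asarray(x, dtype=float)
    y = np.zeros_like(x)
    for t in range(len(x)):
        y[t] = x[t]
        for tau in range(1, min(t, len(a)) + 1):
            y[t] -= a[tau - 1] * y[t - tau]
    return y

# Example: backsolve([-0.9], x) is leaky integration with rho = 0.9.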

In the same way that equation (4) can be interpreted as $Y(Z)=B(Z)X(Z)$, equation (23) can be interpreted as $A(Z)\,Y(Z) = X(Z)$, which amounts to $Y(Z) = X(Z)/A(Z)$. Thus, convolution amounts to polynomial multiplication, while the backsubstitution we are doing here is called deconvolution, and it amounts to polynomial division.
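To make the polynomial view concrete, here is a small check with arbitrary illustrative coefficients; note that NumPy's polydiv orders coefficients from the highest power of $Z$ down, hence the reversals.

import numpy as np

a = np.array([1.0, -0.5, 0.1])        # A(Z) = 1 - 0.5 Z + 0.1 Z^2  (illustrative)
y = np.array([1.0, 2.0, 0.0, -1.0])   # Y(Z)
x = np.convolve(a, y)                 # X(Z) = A(Z) Y(Z): convolution = multiplication

# Deconvolution = polynomial division.
q, r = np.polydiv(x[::-1], a[::-1])
print(q[::-1])                        # recovers Y(Z): approximately [1, 2, 0, -1]
print(np.allclose(r, 0))              # remainder is zero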

A causal operator is one that uses its present and past inputs to make its current output. Anticausal operators use the future but not the past. Causal operators are generally associated with lower triangular matrices and positive powers of $Z$, whereas anticausal operators are associated with upper triangular matrices and negative powers of $Z$. A transformation like equation (23) but with the transposed matrix would require us to run the recursive solution in the opposite direction in time, as we did with leaky integration.
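Here is a sketch of that transposed (anticausal) solve, using the same illustrative coefficient layout as the causal recursion above:

import numpy as np

def backsolve_transpose(a, x):
    """Solve A' y = x, with A as in equation (23).  A' is upper triangular
    (anticausal), so the recursion runs from the last sample back to the first:
    y_t = x_t - sum_{tau > 0} a_tau * y_{t + tau}."""
    x = np.asarray(x, dtype=float)
    n = len(x)
    y = np.zeros(n)
    for t in range(n - 1, -1, -1):
        y[t] = x[t]
        for tau in range(1, min(n - 1 - t, len(a)) + 1):
            y[t] -= a[tau - 1] * y[t + tau]
    return y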

A module to backsolve (deconvolve) by polynomial division is polydiv1.

The more complicated an operator, the more complicated is its adjoint. Given a transformation from $\bold x$ to $\bold y$ that is $\bold T\bold B \bold y = \bold x$, we may wonder if the adjoint transform really is $(\bold T\bold B)' \hat{\bold x}= \bold y$. It amounts to asking if the adjoint of $\bold y = (\bold T\bold B)^{-1}\bold x$ is $\hat{\bold x} = ((\bold T\bold B)')^{-1}\bold y$. Mathematically we are asking if the inverse of a transpose is the transpose of the inverse. This is so because in $\bold A\bold A^{-1} = \bold I = \bold I' = (\bold A^{-1})'\bold A'$ the parenthesized object must be the inverse of its neighbor $\bold A'$.
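A quick numerical confirmation, with an arbitrary well-conditioned lower triangular matrix standing in for $\bold T \bold B$:

import numpy as np

rng = np.random.default_rng(0)
A = np.tril(rng.normal(size=(5, 5))) + 5 * np.eye(5)   # stand-in for T B

# The transpose of the inverse equals the inverse of the transpose.
print(np.allclose(np.linalg.inv(A).T, np.linalg.inv(A.T)))   # True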

The adjoint has a nonphysical meaning. It is like the forward operator, except that we must begin at the final time and work backward toward the first. The adjoint pendulum damps as we compute it backward in time, but that, of course, means that the adjoint pendulum diverges when viewed moving forward in time.

