Ordinary differential equations often lead us to the backsolving operator.
For example, the damped harmonic oscillator leads to
a special case of equation
(23)
in which
$(a_3, a_4, \dots) = 0$.
There is a huge literature on finite-difference solutions
of ordinary differential equations
that lead to equations of this type.
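As a concrete sketch (this particular discretization is my own illustration, not taken from the text), a backward-difference approximation of the damped oscillator $\ddot y + 2\epsilon \dot y + \omega_0^2 y = x$ gives

$$
\frac{y_k - 2y_{k-1} + y_{k-2}}{\Delta t^2}
\;+\; 2\epsilon\,\frac{y_k - y_{k-1}}{\Delta t}
\;+\; \omega_0^2\, y_k
\;=\; x_k
$$

Collecting terms and dividing through by the coefficient of $y_k$ puts this in the form $y_k + a_1 y_{k-1} + a_2 y_{k-2} = \tilde x_k$, a three-term recursion with $(a_3, a_4, \dots) = 0$.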
Rather than derive such an equation on the basis
of many possible physical arrangements,
we can begin from the filter transformation
in (4)
but put the matrix on the other side of the equation
so our transformation can be called one of inversion or backsubstitution.
Let us also force the matrix
to be a square matrix
by truncating its extra rows.
To link up with applications in later chapters,
I specialize to 1's on the main diagonal and insert some bands of zeros.
$$
\begin{bmatrix} x_0 \\ x_1 \\ x_2 \\ x_3 \\ x_4 \\ x_5 \end{bmatrix}
\;=\;
\begin{bmatrix}
1   & 0   & 0   & 0   & 0   & 0 \\
a_1 & 1   & 0   & 0   & 0   & 0 \\
a_2 & a_1 & 1   & 0   & 0   & 0 \\
0   & a_2 & a_1 & 1   & 0   & 0 \\
0   & 0   & a_2 & a_1 & 1   & 0 \\
a_5 & 0   & 0   & a_2 & a_1 & 1
\end{bmatrix}
\begin{bmatrix} y_0 \\ y_1 \\ y_2 \\ y_3 \\ y_4 \\ y_5 \end{bmatrix}
\tag{23}
$$

A typical row in equation (23) says

$$
x_k \;=\; y_k \;+\; \sum_{i>0} a_i\, y_{k-i}
\tag{24}
$$

Solving for the output $y_k$ in terms of earlier outputs gives the recursion

$$
y_k \;=\; x_k \;-\; \sum_{i>0} a_i\, y_{k-i}
\tag{25}
$$
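The recursion of equation (25) can be sketched in a few lines of Python (the function names and sample coefficients below are my own illustration, not from the text):

```python
def forward(a, y):
    """Equation (24): x_k = y_k + sum_{i>0} a_i * y_{k-i}.
    a holds (a_1, a_2, ...); the unit diagonal is implicit."""
    x = []
    for k, yk in enumerate(y):
        t = yk
        for i, ai in enumerate(a, start=1):
            if i <= k:
                t += ai * y[k - i]
        x.append(t)
    return x

def backsolve(a, x):
    """Equation (25): y_k = x_k - sum_{i>0} a_i * y_{k-i}.
    Each output uses only previously computed outputs."""
    y = []
    for k, xk in enumerate(x):
        t = xk
        for i, ai in enumerate(a, start=1):
            if i <= k:
                t -= ai * y[k - i]
        y.append(t)
    return y
```

Running `backsolve` and then `forward` with the same coefficients returns the original input, which is the sense in which (25) inverts (24).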
The most rudimentary form of mathematics in which
this recursion occurs is in connection with multiplying
and dividing polynomials.
The transformation (23) can be derived
by multiplying two polynomials,
$Y(Z) = y_0 + y_1 Z + y_2 Z^2 + y_3 Z^3 + y_4 Z^4 + y_5 Z^5$ times
$A(Z) = 1 + a_1 Z + a_2 Z^2 + a_5 Z^5$.
Identifying the $k$-th power of $Z$ in the product
$X(Z) = A(Z)\,Y(Z)$ gives the $k$-th row of the transformation
(23).
Thus the operator in (23) is $Y(Z) = X(Z)/A(Z)$.
The polynomials in Z are called Z transforms.
They may be recognized as Fourier transforms
where
$Z = e^{i\omega\,\Delta t}$.
Convolution corresponds to multiplying polynomials
(convolving the coefficients)
and deconvolution (or backsolving) corresponds to polynomial division.
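This correspondence is easy to see in code. In the minimal Python sketch below (the helper names are mine), multiplying coefficient lists convolves them, and the recursion of equation (25) divides them:

```python
def polymul(a, b):
    """Convolution: coefficients of A(Z) * B(Z)."""
    c = [0.0] * (len(a) + len(b) - 1)
    for i, ai in enumerate(a):
        for j, bj in enumerate(b):
            c[i + j] += ai * bj
    return c

def polydiv(x, a):
    """Deconvolution: leading len(x) coefficients of X(Z)/A(Z),
    assuming a[0] != 0 (the text normalizes a[0] = 1)."""
    y = []
    for k in range(len(x)):
        t = x[k]
        for i in range(1, min(len(a), k + 1)):
            t -= a[i] * y[k - i]
        y.append(t / a[0])
    return y
```

Dividing a product by one of its factors recovers the other factor, padded with zeros.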
A causal operator is one that uses its present and past inputs to make its current output. Anticausal operators use the future but not the past. Causal operators are generally associated with lower triangular matrices and positive powers of Z, whereas anticausal operators are associated with upper triangular matrices and negative powers of Z. A transformation like equation (23) but with the transposed matrix would require us to run the recursive solution the opposite direction in time, as we did with leaky integration.
A module to backsolve, that is, to perform polynomial division, is polydiv1.
module polydiv1 {                  # Polynomial division (recursive filtering)
  real, dimension (:), pointer :: aa
  #% _init ( aa)
  #% _lop ( xx, yy)
  integer ia, ix, iy
  real tt
  if( adj)
       do ix= size(xx), 1, -1 {
            tt = yy( ix)
            do ia = 1, min( size(aa), size(xx) - ix) {
                 iy = ix + ia
                 tt -= aa( ia) * xx( iy)
                 }
            xx( ix) = xx( ix) + tt
            }
  else
       do iy= 1, size(xx) {
            tt = xx( iy)
            do ia = 1, min( size(aa), iy-1) {
                 ix = iy - ia
                 tt -= aa( ia) * yy( ix)
                 }
            yy( iy) = yy( iy) + tt
            }
  }
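For readers without the Ratfor90 toolchain, here is a line-by-line Python transcription of polydiv1 (my own, with the adj/add conventions of the operator interface handled explicitly rather than by the `_lop` wrapper). The forward branch runs the recursion forward in time; the adjoint branch runs it backward:

```python
def polydiv1(adj, add, aa, xx, yy):
    """Polynomial division (recursive filtering) and its adjoint.
    aa holds (a_1, a_2, ...); the unit leading coefficient is implicit.
    Forward (adj=False): yy += xx / A(Z).
    Adjoint (adj=True):  xx += yy backsolved through A(Z) transposed."""
    n = len(xx)
    if not add:                       # zero the output unless accumulating
        if adj:
            xx[:] = [0.0] * n
        else:
            yy[:] = [0.0] * n
    if adj:
        for ix in range(n - 1, -1, -1):          # backward in time
            tt = yy[ix]
            for ia in range(1, min(len(aa), n - 1 - ix) + 1):
                tt -= aa[ia - 1] * xx[ix + ia]   # uses later outputs
            xx[ix] += tt
    else:
        for iy in range(n):                      # forward in time
            tt = xx[iy]
            for ia in range(1, min(len(aa), iy) + 1):
                tt -= aa[ia - 1] * yy[iy - ia]   # uses earlier outputs
            yy[iy] += tt
    return xx, yy
```

A dot-product test, $\langle \mathbf{y}, \mathbf{d} \rangle = \langle \mathbf{x}, \mathbf{\tilde d} \rangle$ with $\mathbf{y}$ from the forward branch and $\mathbf{\tilde d}$ from the adjoint branch, confirms that the two recursions are truly adjoint to one another.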
The more complicated an operator, the more complicated is its adjoint.
Given a transformation from $\mathbf{x}$ to $\mathbf{y}$
that is
$\mathbf{y} = \mathbf{B}^{-1}\mathbf{x}$,
we may wonder if the adjoint transform really is
$\tilde{\mathbf{x}} = (\mathbf{B}^{-1})^T \mathbf{y}$.
It amounts to asking if the adjoint of
$\mathbf{B}^{-1}$
is
$(\mathbf{B}^T)^{-1}$.
Mathematically we are asking if
the inverse of a transpose is the transpose of the inverse.
This is so because in
$\mathbf{I} = \mathbf{I}^T = (\mathbf{B}\mathbf{B}^{-1})^T = (\mathbf{B}^{-1})^T\,\mathbf{B}^T$
the parenthesized object must be the inverse of its neighbor
$\mathbf{B}^T$.
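The identity is easy to confirm numerically; here is a tiny Python check (my own example, with the band value chosen arbitrarily):

```python
# Verify that the transpose of the inverse equals the inverse of the
# transpose for a small unit lower-triangular matrix A = [[1, 0], [a, 1]].
# Its inverse is [[1, 0], [-a, 1]] (multiply to see A @ Ainv = I).
a = 0.5
Ainv_T = [[1.0,  -a],
          [0.0, 1.0]]   # transpose of A's inverse
A_T    = [[1.0,   a],
          [0.0, 1.0]]   # transpose of A
# Their product is the identity, so Ainv_T is indeed (A^T)^{-1}.
prod = [[sum(A_T[i][k] * Ainv_T[k][j] for k in range(2))
         for j in range(2)] for i in range(2)]
```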
The adjoint has a meaning which is nonphysical. It is like the forward operator except that we must begin at the final time and revert towards the first. The adjoint pendulum damps as we compute it backward in time, but that, of course, means that the adjoint pendulum diverges as it is viewed moving forward in time.