In Burg's algorithm, the backward residuals do not form
an orthogonal basis of the space of the regressors.
In effect, we minimize the energies of the forward and backward
residuals jointly rather than separately. Thus, the recursions implied
for the backward residuals do not correspond to a Gram-Schmidt
orthogonalization of the regressor space.
Therefore, my idea is to use two reflection coefficients, $K^r_k$ and
$K^{\varepsilon}_k$, as in the LSL algorithm, so that the forward and backward
residuals are minimized separately. This is done simply by writing:
\begin{displaymath}
\left\{\begin{array}{l}
\varepsilon_k(t) = \varepsilon_{k-1}(t) - K^r_k\, r_{k-1}(t-1) \\
r_k(t) = r_{k-1}(t-1) - K^{\varepsilon}_k\, \varepsilon_{k-1}(t)
\end{array}\right. \eqno{(12)}
\end{displaymath}
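As a concrete sketch of how recursion (12) operates, the following Python fragment advances one lattice stage with two given reflection coefficients. The function and array names (lattice_stage, eps_prev, r_prev) are mine, for illustration only; this is a sketch assuming the residuals are stored as plain arrays.

```python
import numpy as np

def lattice_stage(eps_prev, r_prev, Kr, Keps):
    """One lattice stage, recursion (12), with two reflection coefficients.

    eps_prev, r_prev : forward/backward residuals of order k-1 (1-D arrays)
    Kr, Keps         : reflection coefficients K^r_k and K^eps_k
    """
    # r_{k-1}(t-1): delay the backward residual by one sample
    r_delayed = np.concatenate(([0.0], r_prev[:-1]))
    # eps_k(t) = eps_{k-1}(t) - K^r_k r_{k-1}(t-1)
    eps_k = eps_prev - Kr * r_delayed
    # r_k(t) = r_{k-1}(t-1) - K^eps_k eps_{k-1}(t)
    r_k = r_delayed - Keps * eps_prev
    return eps_k, r_k
```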
Now, to compute the reflection coefficients $K^r_k$ and $K^{\varepsilon}_k$, we
minimize separately the energies of the forward and backward residuals:
\begin{displaymath}
\left\{\begin{array}{l}
{\cal E}^{\varepsilon}(K^r_k) = \sum_t \left[\varepsilon_{k-1}(t) - K^r_k\, r_{k-1}(t-1)\right]^2 \\
{\cal E}^{r}(K^{\varepsilon}_k) = \sum_t \left[r_{k-1}(t-1) - K^{\varepsilon}_k\, \varepsilon_{k-1}(t)\right]^2
\end{array}\right.
\end{displaymath}
This yields the following time-varying formulations of the reflection
coefficients:
\begin{displaymath}
K^r_k(t) = \frac{\sum_{s \le t} \varepsilon_{k-1}(s)\, r_{k-1}(s-1)}
{\sum_{s \le t} r^2_{k-1}(s-1)} \eqno{(13)}
\end{displaymath}
\begin{displaymath}
K^{\varepsilon}_k(t) = \frac{\sum_{s \le t} \varepsilon_{k-1}(s)\, r_{k-1}(s-1)}
{\sum_{s \le t} \varepsilon^2_{k-1}(s)} \eqno{(14)}
\end{displaymath}
Both coefficients have the same numerator; in fact, Burg's coefficient is
their harmonic mean. Once again, these coefficients can be computed easily
if we split their numerators and denominators into past and future summations.
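To make the relation concrete, here is a small Python check, a sketch with illustrative names only (eps and r stand for $\varepsilon_{k-1}(t)$ and the delayed $r_{k-1}(t-1)$, already aligned): it evaluates formulas (13) and (14) over a block of samples and verifies numerically that Burg's coefficient equals their harmonic mean.

```python
import numpy as np

rng = np.random.default_rng(0)
eps = rng.standard_normal(1000)   # forward residuals eps_{k-1}(t)
r = rng.standard_normal(1000)     # delayed backward residuals r_{k-1}(t-1)

num = np.sum(eps * r)             # common numerator of (13) and (14)
Kr = num / np.sum(r ** 2)         # K^r_k, formula (13)
Keps = num / np.sum(eps ** 2)     # K^eps_k, formula (14)

# Burg's coefficient agrees with the harmonic mean of K^r_k and K^eps_k
K_burg = 2.0 * num / (np.sum(eps ** 2) + np.sum(r ** 2))
K_harm = 2.0 * Kr * Keps / (Kr + Keps)
assert np.isclose(K_burg, K_harm)
```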
Indexing the denominators with $r$ and $\varepsilon$, as we did with the
reflection coefficients, we now have:
\begin{displaymath}
\left\{\begin{array}{l}
N_k(t) = N_k(t-1) + \varepsilon_{k-1}(t)\, r_{k-1}(t-1) \\
D^r_k(t) = D^r_k(t-1) + r^2_{k-1}(t-1) \\
D^{\varepsilon}_k(t) = D^{\varepsilon}_k(t-1) + \varepsilon^2_{k-1}(t)
\end{array}\right. \eqno{(15)}
\end{displaymath}
so that $K^r_k(t) = N_k(t)/D^r_k(t)$ and
$K^{\varepsilon}_k(t) = N_k(t)/D^{\varepsilon}_k(t)$.
These recursions, combined with formulas (13) and
(14), give my modified version of Burg's adaptive algorithm.
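A minimal sketch of the resulting per-sample algorithm, assuming the causal running-sum form of recursions (15) given above; the names num, den_r, den_eps and the regularization constant delta are mine, not part of the original formulation.

```python
import numpy as np

def adaptive_two_coeff_stage(eps_prev, r_prev, delta=1e-6):
    """Adaptive lattice stage: time-varying K^r_k(t), K^eps_k(t) via (15).

    eps_prev : forward residuals eps_{k-1}(t)
    r_prev   : backward residuals r_{k-1}(t-1), already delayed/aligned
    delta    : small constant keeping the denominators positive at start-up
    """
    n = len(eps_prev)
    eps_k, r_k = np.zeros(n), np.zeros(n)
    num, den_r, den_eps = 0.0, delta, delta
    for t in range(n):
        # recursions (15): update the shared numerator and both denominators
        num += eps_prev[t] * r_prev[t]
        den_r += r_prev[t] ** 2
        den_eps += eps_prev[t] ** 2
        Kr, Keps = num / den_r, num / den_eps   # formulas (13) and (14)
        # lattice recursion (12) with the current, time-varying coefficients
        eps_k[t] = eps_prev[t] - Kr * r_prev[t]
        r_k[t] = r_prev[t] - Keps * eps_prev[t]
    return eps_k, r_k
```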
Burg's algorithm has the advantage of being more stable, because its
reflection coefficients are always bounded by one in magnitude. This version,
however, allows me to construct an orthogonal basis for the space of regressors of
the general prediction problem, and thus leads to the corresponding algorithm.
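The boundedness claim is easy to verify numerically: by the Cauchy-Schwarz inequality, the product $K^r_k K^{\varepsilon}_k$ never exceeds one, and hence neither does Burg's coefficient, while either coefficient alone may leave the unit interval. A small illustrative check (all names mine):

```python
import numpy as np

rng = np.random.default_rng(1)
for _ in range(1000):
    eps = rng.standard_normal(20)
    r = rng.standard_normal(20)
    num = np.sum(eps * r)
    Kr, Keps = num / np.sum(r ** 2), num / np.sum(eps ** 2)
    K_burg = 2.0 * num / (np.sum(eps ** 2) + np.sum(r ** 2))
    assert abs(K_burg) <= 1.0       # Burg's coefficient is always bounded
    assert Kr * Keps <= 1.0         # Cauchy-Schwarz bounds the product
    # Kr or Keps individually may exceed one in magnitude
```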