To fill empty bins by optimization, we need a filter. That filter should be a prediction-error filter (PEF). Thus, finding missing data is a two-stage process using linear least squares (LS) at each stage.
Our subroutines for finding the PEF would suffer a pitfall: they would use the missing data to estimate the PEF, and hence would get the wrong PEF. Instead, we can use a PEF finder based on weighted least squares, with a weighting function that vanishes on those fitting equations that involve missing data (and is unity elsewhere).
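As a concrete illustration of this weighting idea, here is a minimal Python sketch (not the book's code; `estimate_pef` and its arguments are illustrative names). Dropping a fitting equation is equivalent to giving it zero weight:

```python
import numpy as np

def estimate_pef(y, missing, nlag=1):
    """Fit PEF coefficients a so that y[i] + sum_t a[t]*y[i-t] ~ 0,
    keeping only fitting equations whose samples are all known."""
    rows, rhs = [], []
    for i in range(nlag, len(y)):
        if any(missing[i - nlag : i + 1]):  # zero weight: skip the equation
            continue
        rows.append([y[i - t] for t in range(1, nlag + 1)])
        rhs.append(-y[i])
    a, *_ = np.linalg.lstsq(np.array(rows), np.array(rhs), rcond=None)
    return a  # PEF is (1, a[0], ..., a[nlag-1])

# Synthetic data obeying y[i] = 0.5*y[i-1]; one sample is corrupted but flagged.
y = 0.5 ** np.arange(10)
y[4] = 999.0                        # bad value
missing = np.zeros(10, dtype=bool)
missing[4] = True
a = estimate_pef(y, missing)        # recovers a[0] = -0.5 despite the bad sample
```

Had the bad sample not been flagged, equations involving `y[4]` would have pulled the estimate far from the true coefficient.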
The hconest module below is the convolution module we will use. Instead of multiplying a convolution output by a zero weight, it simply does not compute that output.
module hconest {  # masked helix convolution, adjoint is the filter.
  real,    dimension( :), pointer :: x
  logical, dimension( :), pointer :: bad   # output depends on bad inputs
  integer, dimension( :), pointer :: lag
  #% _init( bad, x, lag)
  #% _lop( a, y)
  integer ia, ix, iy
  do ia = 1, size( a) {
     do iy = 1 + lag( ia), size( y) {  if( bad( iy)) cycle
        ix = iy - lag( ia)
        if( adj)
             a( ia) = a( ia) + y( iy) * x( ix)
        else
             y( iy) = y( iy) + a( ia) * x( ix)
        }
     }
  }
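Since the module's usefulness hinges on the forward operation and the adjoint being exact transposes of one another, a quick way to check a translation is the dot-product test. Below is a Python sketch of the same loop structure (0-based indices; the function name and test setup are illustrative, not from the book):

```python
import numpy as np

def hconest(adj, x, lag, bad, a, y):
    """Masked helix convolution: the forward accumulates into y the filter a
    applied to x; the adjoint accumulates into a the correlation of y with x.
    Outputs flagged bad are simply never computed."""
    for ia in range(len(a)):
        for iy in range(lag[ia], len(y)):
            if bad[iy]:
                continue                # skip, rather than weight by zero
            ix = iy - lag[ia]
            if adj:
                a[ia] += y[iy] * x[ix]
            else:
                y[iy] += a[ia] * x[ix]

# Dot-product test: <L a0, y0> should equal <a0, L' y0>.
rng = np.random.default_rng(0)
x   = rng.normal(size=20)
lag = [1, 2, 3]
bad = rng.random(20) < 0.3
a0  = rng.normal(size=3)
y0  = rng.normal(size=20)

y = np.zeros(20); hconest(False, x, lag, bad, a0, y)   # y = L a0
a = np.zeros(3);  hconest(True,  x, lag, bad, a, y0)   # a = L' y0
lhs, rhs = np.dot(y, y0), np.dot(a0, a)
# lhs and rhs agree to machine precision, confirming the adjoint pairing
```

Both inner products reduce to the same sum of terms a0[ia]*x[ix]*y0[iy] over the unmasked outputs, which is why the test passes exactly.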
Suppose that y_4 were a missing or bad data value in the fitting goal (28). That would spoil the 4th, 5th, and 6th fitting equations, so we would want to be sure that w_4, w_5, and w_6 were zero.
            [ y_1   .    .  ]
            [ y_2  y_1   .  ]   [  1  ]
0  ≈  r  =  diag( w_1, …, w_6 ) [ y_3  y_2  y_1 ]   [ a_1 ]        (28)
            [ y_4  y_3  y_2 ]   [ a_2 ]
            [ y_5  y_4  y_3 ]
            [ y_6  y_5  y_4 ]
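To see why exactly three equations are spoiled: with a three-coefficient filter (1, a_1, a_2), equation i involves the samples y_i, y_{i-1}, y_{i-2}, so sample y_k appears in equations k, k+1, and k+2. A tiny sketch (the helper name is hypothetical, chosen for illustration):

```python
def equations_touching(k, nfilt=3, neq=6):
    """Indices i of fitting equations r_i = w_i*(y_i + a1*y_{i-1} + a2*y_{i-2})
    that involve the data sample y_k."""
    return [i for i in range(1, neq + 1) if i - (nfilt - 1) <= k <= i]

spoiled = equations_touching(4)   # the equations a bad y_4 contaminates
```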