If we are not careful, our PEF calculation will use the missing data values themselves to estimate the PEF, and hence will get the wrong PEF. To avoid this pitfall, imagine a PEF finder that uses weighted least squares, where the weighting function vanishes on those fitting equations that involve missing data and is unity elsewhere. Instead of weighting bad results by zero, we simply do not compute them: the residual there is initialized to zero and never changed. Likewise for the adjoint, those components of the residual never contribute to a gradient. Module hconest assumes it knows the bad locations (where outputs depend on unknown inputs) and skips computation there.
module hconest {        # masked helix convolution, adjoint is the filter.
  use helix
  real, dimension( :), pointer :: x
  type( filter)                :: aa
#% _init( x, aa)
#% _lop( a, y)
  integer ia, ix, iy
  do ia = 1, size( a) {
    do iy = 1 + aa%lag( ia), size( y) {
      if( aa%mis( iy)) cycle
      ix = iy - aa%lag( ia)
      if( adj) a( ia) += y( iy) * x( ix)
      else     y( iy) += a( ia) * x( ix)
      }
    }
}
#$
#$=head1 NAME
#$
#$hconest - convolution using helix filters, adjoint is filter
#$
#$=head1 SYNOPSIS
#$
#$Initializer - C<call hconest_init(x,aa)>
#$
#$Operator - C<ierr=hconest_lop(adj,add,xx,yy)>
#$
#$=head1 PARAMETERS
#$
#$=over 4
#$
#$=item x - C<real(:)>
#$
#$ data
#$
#$=item aa - type(filter)
#$
#$ helix filter to perform convolution with
#$
#$=item adj,add,xx,yy -
#$
#$ standard operator parameters
#$
#$=back
#$
#$=head1 DESCRIPTION
#$
#$ Masked helix convolution, adjoint is the filter
#$
#$=head1 SEE ALSO
#$
#$L<helix>,L<nhconest>,L<helicon>,L<polydiv>
#$
#$=head1 LIBRARY
#$
#$B<geef90>
#$
#$=cut
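The masked-convolution logic of hconest can be mimicked in a short Python sketch (an illustrative stand-in, not the library routine: the `add` flag is dropped, indexing is 0-based, and the names `lags` and `mis` stand in for `aa%lag` and `aa%mis`):

```python
def hconest_lop(adj, a, y, x, lags, mis):
    """Masked helix convolution of filter a with fixed data x.

    Forward (adj=False): accumulate y[iy] += a[ia] * x[iy - lag].
    Adjoint (adj=True):  accumulate a[ia] += y[iy] * x[iy - lag].
    Outputs flagged in mis (dependent on unknown inputs) are never computed,
    so their residuals stay zero and contribute nothing to the gradient.
    """
    out_a = list(a)
    out_y = list(y)
    for ia, lag in enumerate(lags):
        for iy in range(lag, len(y)):
            if mis[iy]:
                continue          # skip fitting equations spoiled by missing data
            ix = iy - lag
            if adj:
                out_a[ia] += y[iy] * x[ix]
            else:
                out_y[iy] += a[ia] * x[ix]
    return out_a if adj else out_y
```

Because the same (ia, iy) pairs are skipped in both passes, the forward and adjoint remain exact transposes, which a dot-product test confirms.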
Suppose that y_4 were a missing or bad data value in the fitting goal (27). That would spoil the 4th, 5th, and 6th fitting equations. Thus we would want to be sure that the weights w_4, w_5, and w_6 were zero.
    0  ≈  r  =  W Y a,    W = diag( w_1, w_2, w_3, ... )        (17)
#$=head1 NAME
#$
#$misinput - find a mask of missing filter inputs
#$
#$=head1 SYNOPSIS
#$
#$C<call find_mask(known,aa)>
#$
#$=head1 PARAMETERS
#$
#$=over 4
#$
#$=item known - C<logical(:)>
#$
#$ Known data locations
#$
#$=item aa - type(filter)
#$
#$ Helix filter
#$
#$=back
#$
#$=head1 DESCRIPTION
#$
#$Given known locations of data, mark on the filter where the convolution
#$result is invalid.
#$
#$=head1 SEE ALSO
#$
#$L<lopef>,L<helicon>
#$
#$=head1 LIBRARY
#$
#$B<geef90>
#$
#$=cut
module misinput {        # find a mask of missing filter inputs
  use helicon
contains
  subroutine find_mask( known, aa) {
    logical, intent( in)           :: known(:)
    type( filter)                  :: aa
    real, dimension( size( known)) :: rr, dfre
    integer                        :: stat
    where(     known) dfre = 0.
    elsewhere         dfre = 1.
    call helicon_init( aa)
    aa%flt = 1.
    stat = helicon_lop( .false., .false., dfre, rr)
    aa%flt = 0.
    where( rr > 0.) aa%mis = .true.
  }
}
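The trick in find_mask can be sketched in plain Python (an illustrative translation, not the library code): set an indicator to one on the unknown inputs, convolve it with an all-ones filter, and flag every output the ones reach. Here `lags` (including lag 0 for the filter's leading "1") stands in for the helix filter:

```python
def find_mask(known, lags):
    """Mark outputs of a helix convolution that depend on unknown inputs.

    known : list of bools, True where the data value is known
    lags  : filter lags; output iy uses inputs iy - lag (0-based)
    Returns mis: True where the convolution result is invalid.
    """
    n = len(known)
    dfre = [0.0 if k else 1.0 for k in known]   # 1. marks a free (unknown) input
    rr = [0.0] * n
    for lag in lags:                            # all-ones filter, as in aa%flt = 1.
        for iy in range(lag, n):
            rr[iy] += dfre[iy - lag]            # spray the unknowns onto outputs
    return [r > 0.0 for r in rr]                # any touched output is invalid
```

With a three-term filter and only y_4 unknown (index 3 in 0-based terms), the 4th, 5th, and 6th outputs come back flagged, matching the weights w_4, w_5, w_6 that must vanish in the text above.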
to design a filter, the one-dimensional gradient along the track.
Choose suitable regularization.
The result should be a track-free map of the lake.
If the suspicious data points were all found by the pre-editor,
the spikes should be gone too.
Another description of this problem is at
http://sepwww.stanford.edu/sep/jon/trash/mad/