Hi Daniel,
Thank you for your detailed response. Indeed, that's what I do now: I
set the Sample Delay to the filter group delay divided by the
interpolation factor. One thing I don't understand in your response is:
"However, I don't think this is a very good solution, because it forces
the output delay to be a multiple of the interpolation factor.
Typically we'd like to set the output delay to the FIR group delay,
which is (len(taps)-1)/2. This need not be a multiple of the
interpolation factor."
But that is the current situation: to compensate for the incorrect
delay introduced by the FIR filter, you have to divide the Sample Delay
by the interpolation factor - that's how the block behaves right now.
Your proposal is correct IMO and works, but the current behavior of the
block is misleading and should at least be noted in the documentation
or wiki.
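For reference, the arithmetic behind this workaround can be sketched in
plain Python (the variable names are just illustrative, using the
41-tap RRC example from my first message):

```python
# Workaround: the GRC "Sample Delay" field must be set to the FIR group
# delay divided by the interpolation factor, because the block ends up
# scaling the declared delay by the interpolation factor at the output.
num_taps = 41                               # taps in the example RRC filter
interp = 2                                  # interpolation factor
group_delay = (num_taps - 1) // 2           # desired output delay: 20
sample_delay_field = group_delay // interp  # value entered in GRC: 10
observed_output_delay = sample_delay_field * interp  # back to 20
```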
Cheers,
Marcin
On Sun, 24 Apr 2022 at 10:45, Daniel Estévez <daniel@destevez.net> wrote:
On 22/4/22 11:32, Marcin Puchlik via GNU Radio, the Free &
Open-Source Toolkit for Software Radio wrote:
> Hello,
> I was playing with the Interpolating FIR Filter block and noticed
> that the filter does not delay tags properly. What I mean is that
> when the interpolation factor is different from 1, the filter delays
> the tags by Sample Delay * Interpolation factor. In my opinion this
> is not correct behavior. The group delay of the filter is constant,
> and the interpolation factor shouldn't affect the process of
> delaying the tags. Below, I am attaching the plot and GRC flowgraph
> which demonstrate the problem and where you can see the observed
> behavior.
> Number of taps in the RRC filter used in the simulation: 41
> Group delay: 20
> Actual group delay: 40
> What do you think?
Hi Marcin,
I've done a small flowgraph to test this and agree with what you've
found. Looking at how this is handled in the code, I see that the
sample delay in the GRC block is used to call the
declare_sample_delay() method of the block:
https://github.com/gnuradio/gnuradio/blob/main/gr-filter/grc/filter_interp_fir_filter_xxx.block.yml#L46
This sets the delay referred to the input:
https://github.com/gnuradio/gnuradio/blob/main/gnuradio-runtime/lib/block.cc#L68
Basically, delays for tags seem to be considered at the input always:
https://github.com/gnuradio/gnuradio/blob/main/gnuradio-runtime/lib/buffer_reader.cc#L138
So I can see why for an interpolator block you would get sample_delay *
interpolation_factor at the output.
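As a toy model of this (plain Python, no GNU Radio needed; the scaling
rule below is just my reading of the buffer_reader.cc code linked
above, and the function name is made up for illustration):

```python
def output_tag_offset(in_offset, sample_delay, interp):
    # Toy model, not the actual GNU Radio code: the declared sample
    # delay is applied at the input, and the resulting offset is then
    # scaled by the interpolation factor (the block's relative rate),
    # so the shift seen at the output is sample_delay * interp.
    return (in_offset + sample_delay) * interp

# 41-tap RRC, group delay 20, interpolation 2: declaring a sample
# delay of 20 shifts output tags by 40 (the behavior Marcin reported),
# while declaring 20 // 2 = 10 gives the desired shift of 20.
```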
A way to fix this would be to modify the GRC block to call
self.${id}.declare_sample_delay(${samp_delay}//${interp})
However, I don't think this is a very good solution, because it forces
the output delay to be a multiple of the interpolation factor.
Typically we'd like to set the output delay to the FIR group delay,
which is (len(taps)-1)/2. This need not be a multiple of the
interpolation factor.
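A quick numeric illustration (plain Python; the second tap count is a
hypothetical example, not from the flowgraph above):

```python
def fir_group_delay(num_taps):
    # Group delay of a linear-phase FIR filter, in samples
    return (num_taps - 1) // 2

# 41 taps -> 20, which happens to be a multiple of interp = 2, but
# e.g. 31 taps -> 15, which is not: 15 // 2 == 7, so declaring
# group_delay // interp would leave the tags one output sample off.
```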
I think that the proper solution would let us set the output delay with
a granularity of one output sample. It seems this would require more
work. I'm not sure whether the tag propagation schemes already
support this.
Best,
Daniel.