Reduce Phase of a filtered data set

Started by shaddoll 6 years ago · 3 replies · latest reply 6 years ago · 64 views

Hi :)

So when you filter a data set with a moving average, a median filter, or any other filter, the output is shifted in time and can't properly follow the original data.

I work with dynamic data, so I was wondering if there is any way to reduce the shift introduced by the filter. I've heard about FIR and IIR filters and about forward-backward filtering, but I don't know how to compute them on time series.
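To see the shift you describe, here is a minimal NumPy sketch (the function name is mine, not from any library): a causal length-n moving average lags the input by (n - 1) / 2 samples, which is its group delay.

```python
import numpy as np

def moving_average(x, n):
    """Causal length-n moving average: output at t averages samples t-n+1 .. t."""
    kernel = np.ones(n) / n
    return np.convolve(x, kernel, mode="full")[: len(x)]

# A ramp makes the lag easy to see: in steady state the output trails
# the input by exactly (n - 1) / 2 samples.
x = np.arange(20, dtype=float)
y = moving_average(x, 5)
print(x[10] - y[10])  # lag of (5 - 1) / 2 = 2 samples -> 2.0
```

That constant lag is exactly the "shift" you see when the filtered curve is plotted over the raw data.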


Reply by Pierre_Nowo · August 31, 2017

Take a look at this: https://fr.mathworks.com/help/signal/ref/filtfilt....

The drawback of doing this on an FPGA is that you need twice the resources of a single-pass filter. What you can do instead is keep the unfiltered data with the same delay, which costs only a little additional logic.
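The linked MATLAB page describes forward-backward (zero-phase) filtering; as an illustration, SciPy exposes the same idea as `scipy.signal.filtfilt`. The sketch below compares a single causal pass (delayed output) with the two-pass version (no net phase shift); the filter design here is just an example, not from the thread.

```python
import numpy as np
from scipy import signal

b, a = signal.butter(4, 0.1)          # example 4th-order low-pass, cutoff 0.1*Nyquist
t = np.linspace(0, 1, 500)
x = np.sin(2 * np.pi * 3 * t)         # slow sine well inside the passband

y_causal = signal.lfilter(b, a, x)    # one pass: output lags the input
y_zero = signal.filtfilt(b, a, x)     # forward-backward: phase shift cancels
```

After the two passes, `y_zero` sits on top of `x`, while `y_causal` trails it; the price, as noted above, is doing the filtering twice, and the method is inherently offline since it needs the whole record.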

Reply by kaz · August 31, 2017

filtfilt doubles the delay (and more) in the physical sense, but preserves phase for modelling in software. I don't know of any other use for it.

Any signal is delayed when it passes through a filter. Is this a rule of God?

Reply by asser · August 31, 2017

There is a special kind of filter called a minimum-phase filter.

These have a nonlinear phase characteristic, but their group delay is minimized. For example, where a 32nd-order linear-phase FIR filter has a group delay of 16 clock cycles, the equivalent minimum-phase filter has a group delay of about 7 clock cycles.

This is valuable for real-time systems with feedback.

Of course, forward-backward filtering does not meet the minimum-group-delay requirement at all, since it needs the whole record before it can run backwards.
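As a sketch of the minimum-phase idea above: SciPy's `scipy.signal.minimum_phase` converts a linear-phase FIR design into a minimum-phase one with roughly half the length and a much smaller passband group delay. The filter specification below is my own example, not from the thread, and the converted filter only approximates the original's magnitude response.

```python
import numpy as np
from scipy import signal

h_lin = signal.firwin(33, 0.2)       # 32nd-order linear-phase low-pass (example spec)
h_min = signal.minimum_phase(h_lin)  # minimum-phase version, shorter filter

# Compare group delay in the passband (cutoff is 0.2*pi rad/sample here).
w = np.linspace(0.01, 0.15, 50) * np.pi
_, gd_lin = signal.group_delay((h_lin, [1.0]), w=w)
_, gd_min = signal.group_delay((h_min, [1.0]), w=w)

print(np.mean(gd_lin))  # ~16 samples, constant (linear phase)
print(np.mean(gd_min))  # noticeably smaller, and frequency-dependent
```

The trade-off is exactly the one described above: the phase is no longer linear, so different frequencies are delayed by different amounts, but the overall latency drops, which is what matters in a feedback loop.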