Wednesday, May 12, 2010

Is it possible to get a causal smoothing filter?

Although I haven't been much of a fan of moving-average-based methods, I've followed some discussions and made some attempts to determine whether it's possible to get a truly smooth filter out of a causal model. Anyone who's worked on financial time series filters knows that the bane of filtering is getting a smooth response with very low delay. Ideally you would like a very short moving-average length to keep the lag of a causal filter small; in practice the sacrifice usually goes the other way, choosing a long window to obtain decent smoothing at the cost of lag.

I put together the following FIR-based filter using roughly one year of QQQQ daily data. It is completely causal and described by... gasp... 250 coefficients.
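
To make the mechanics concrete, here is a minimal sketch of how a 250-tap causal FIR filter would be applied to a price series (Python with NumPy/SciPy assumed; the equal-weight coefficient vector below is only a placeholder, not the weights used for the figures, which come from an undisclosed method):

```python
import numpy as np
from scipy.signal import lfilter

# Stand-in price series (the post uses roughly one year of QQQQ daily closes).
prices = 100.0 + np.cumsum(np.random.randn(500))

# Placeholder 250-tap coefficient vector (here just an equal-weight average);
# the post's actual method for deriving the weights is not disclosed.
h = np.ones(250) / 250.0

# Feed-forward (FIR) application: smoothed[t] uses only prices[t], ..., prices[t-249],
# so the output is fully causal and past values never change as new data arrive.
smoothed = lfilter(h, [1.0], prices)
```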

Does it appear smooth? You decide.



Fig 1. FIR 250-tap feed-forward filter



Fig 2. 250-weight impulse response determining the coefficients

The impulse response is approximately a sinc function, which is the inverse discrete-time Fourier transform of an ideal 'brick wall' low-pass filter.
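
For reference, the textbook way to obtain this kind of sinc-shaped impulse response is the windowed-sinc construction sketched below (a sketch only; the coefficients in Fig 2 were derived by a different, undisclosed method, so the cutoff and window choices here are arbitrary):

```python
import numpy as np

def windowed_sinc(num_taps=250, cutoff=0.05):
    """Windowed-sinc low-pass coefficients.

    cutoff is the normalized cutoff in cycles per sample (0 < cutoff < 0.5).
    """
    n = np.arange(num_taps)
    center = (num_taps - 1) / 2.0
    # Truncated ideal ('brick wall') low-pass impulse response: 2*fc*sinc(2*fc*(n - center))
    h = 2.0 * cutoff * np.sinc(2.0 * cutoff * (n - center))
    h *= np.hamming(num_taps)   # taper to reduce ripple caused by truncation
    return h / np.sum(h)        # normalize for unity gain at zero frequency

coeffs = windowed_sinc()
```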

I haven't actually verified much out of sample at the moment, so it's quite possible the model may not fare as well; that remains to be investigated. However, I thought I would share this work to give some idea of the potential of causal filtering methods.

9 comments:

  1. Looks smooth to me! I had done some smoothing like this in an entirely different context, ECG signal processing. I'm in biostat, so I'm not up on all the econ terminology: what does it mean to have an entirely causal model? Also, how are you computing the weighted average (in R, I presume, but with convolve()? fft()? custom functions?)

  2. Thanks Matt,

    I find a lot of good ideas in EEG/ECG processing, as I feel that, stationarity aside, there are a lot of similarities in the signal content (flicker noise in sleep patterns, for instance). Most often in EEG I've seen wavelets applied, which are non-causal. The basic idea behind causality with regard to DSP filtering is that the method relies only on past data, and its output does not get altered as new data arrive. Often, when designing 'smoothing' filters, you are required to have both past and future data to get true smoothness (which is obviously impossible in financial data at the hard right edge).

    I'm keeping the method proprietary right now, but the basic idea is a novel way of calculating the coefficients (I haven't seen it anywhere else).

  3. Hi, what do you mean by "described by... gasp... 250 coefficients"?
    Can't you just apply a "rolling low-pass filter" to get your causal smoothing?

  4. Do you use a high-order or low-order low-pass filter?

  5. x,

    If you have some familiarity with FIR-type filters, they are typically described by their filter tap coefficients (in this case, 250 of them). Most filters you are likely familiar with have far fewer coefficients.

    The problem with a causal rolling low-pass filter is that it is very difficult to get good smoothing properties and low delay at the same time, which this filter achieved. It is possible with non-causal low-pass filters, but at the cost of requiring knowledge of the future, a luxury we typically don't have in financial time series (a rough sketch of this trade-off appears after the comments).

  6. It's really intriguing. Could you please show the frequency response of the filter, especially the phase response? I reckon it cannot be a zero-phase filter, so it must still have some delay.

  7. Hi Anon,
    Thanks for the reply. It's been a while since I set up this simulation and I don't have the frequency response handy; apologies for that (a generic sketch of how one would compute it appears after the comments). Also, although I can conceptually show a perfectly zero-delay filter, we have to be careful about what that actually tells us... if we are using it to detect direction, any smoothing prior to the next change will be altered (i.e., in retrospect) in order to remain smooth.

  8. hello IT,

    I have a couple of doubts.
    If we use, for example, a loess to smooth the data, every time we get a new value the past smoothing curve changes. That means it is difficult to use the smoothed curve for a backtest.
    Does a FIR filter have the same problem? And is there any R package to use it?
    Thanks and best regards,
    Juan

  9. Hi Juan,
    That is a big issue with filters; if we use certain filters (e.g. loess), the data can and will change near the end points, since there is a dependency on forward data. It's sort of a trade-off for smoothness.

    FIR filters can have similar issues. I don't know of a signal processing package in R made just for FIR-type filters, but Python and Octave have many (a small Python example appears after the comments).

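
A rough sketch of the smoothing-versus-delay trade-off discussed in comment 5, assuming SciPy is available; the 50-tap moving-average coefficients below are placeholders, not the post's filter. lfilter applies the filter causally and shows the familiar delay, while filtfilt runs the same filter forward and backward for zero phase, but at the cost of needing future samples:

```python
import numpy as np
from scipy.signal import lfilter, filtfilt

x = 100.0 + np.cumsum(np.random.randn(500))   # toy price-like series
h = np.ones(50) / 50.0                        # placeholder 50-tap moving average

causal = lfilter(h, [1.0], x)        # past data only: smooth but delayed roughly 25 bars
zero_phase = filtfilt(h, [1.0], x)   # forward-backward pass: no delay, but needs future data
```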
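
For the frequency and phase response question in comment 7, anyone wanting to inspect their own coefficients can do so with scipy.signal.freqz; the coefficients below are again placeholders. A symmetric FIR of length N is linear phase, with a constant group delay of (N-1)/2 samples:

```python
import numpy as np
from scipy.signal import freqz

h = np.ones(250) / 250.0                  # placeholder coefficients, not the post's
w, H = freqz(h, worN=2048)                # w in radians/sample, H is the complex response
magnitude_db = 20.0 * np.log10(np.abs(H) + 1e-12)
phase = np.unwrap(np.angle(H))            # roughly linear in w for a symmetric (linear-phase) FIR
```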
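
And for comment 9, one example of the Python tooling mentioned above: scipy.signal.firwin designs a low-pass FIR and lfilter applies it causally. The tap count and cutoff here are arbitrary choices for illustration, not a recommendation:

```python
import numpy as np
from scipy.signal import firwin, lfilter

x = 100.0 + np.cumsum(np.random.randn(252))   # stand-in for a year of daily closes
h = firwin(numtaps=63, cutoff=0.10)           # cutoff in units of the Nyquist frequency
y = lfilter(h, [1.0], x)                      # causal application of the designed filter
```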