One advantage of LOESS is that it is quite robust against outliers: it iterates multiple times over the residuals of the fit.
Each residual is compared against the median absolute residual s: a residual r is normalized to x = r/(6s) and then weighted by the bisquare function B(x) = (1 - x^2)^2 for |x| < 1, and B(x) = 0 otherwise.
In the next iteration, the weights of the weighted least squares are then multiplied by these factors B, so points with large residuals contribute less and far outliers are ignored entirely.
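As a minimal sketch, the robustness weights described above can be computed like this (the function name `robustness_weights` is my own choice, not from any particular library):

```python
import numpy as np

def robustness_weights(residuals):
    """Bisquare robustness weights as used in LOWESS.

    s is the median absolute residual; any residual beyond 6s
    gets weight 0 and is effectively ignored in the next fit.
    """
    s = np.median(np.abs(residuals))
    x = residuals / (6.0 * s)
    # B(x) = (1 - x^2)^2 for |x| < 1, else 0
    return np.clip(1.0 - x**2, 0.0, None) ** 2
```

A residual of zero gets full weight 1, moderate residuals are down-weighted smoothly, and anything beyond six median absolute residuals is cut off completely.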
The downside of LOESS: it is expensive! The nice thing about Savitzky-Golay: it is fast - but it cannot include data-dependent weights, or it loses its advantage of being a convolution with fixed, uniform coefficients. So let's change the data iteratively instead:
- do a first round of Savitzky-Golay,
- calculate the fitted points and the residuals,
- do a weighted linear interpolation between the original data and the fitted values, i.e. use the residual-based bisquare weights to pull outlying data points toward the fit,
- and then iterate.
This should have a similar effect, but be much faster. (Disclaimer: this has not been tried yet...)
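The steps above could be sketched roughly as follows. This is an untested sketch of the idea, not a validated implementation; the function name `robust_savgol` and the default parameters are my own assumptions:

```python
import numpy as np
from scipy.signal import savgol_filter

def robust_savgol(y, window_length=11, polyorder=2, n_iter=3):
    """Iteratively reweighted Savitzky-Golay smoothing (untested sketch).

    Instead of putting weights into the least-squares fit (which would
    break the fast uniform convolution), the *data* are pulled toward
    the previous fit according to bisquare weights on the residuals.
    """
    y_work = np.asarray(y, dtype=float).copy()
    for _ in range(n_iter):
        fit = savgol_filter(y_work, window_length, polyorder)
        r = y_work - fit
        s = np.median(np.abs(r))
        if s == 0:  # perfect fit already, nothing left to reweight
            break
        w = np.clip(1.0 - (r / (6.0 * s)) ** 2, 0.0, None) ** 2
        # weighted blend: outliers (w -> 0) are replaced by the fit
        y_work = w * y_work + (1.0 - w) * fit
    return savgol_filter(y_work, window_length, polyorder)
```

On a smooth signal with a single spike, this should track the underlying curve much more closely at the spike than a plain Savitzky-Golay pass, while costing only a few extra convolutions.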