From: Matthias Brennwald <address@hidden>
Subject: Why is my Butterworth filter so noisy?
To: address@hidden
Date: Monday, September 22, 2008, 5:40 AM
Dear all,

I need to apply a Butterworth low-pass filter to a regularly sampled time series of data points. I used the butter.m and filter.m functions to do that (butter.m is in the signal package from Octave-Forge). If I calculate and plot the frequency response of the Butterworth filter using the Fourier transforms of the original and the filtered time series, the result looks like the expected Butterworth transfer function, but with a lot of unexpected noise. The amount of noise increases with the order of the Butterworth filter I use.

Am I doing something wrong, or am I missing something?
The following example reproduces the above with a 'faked' time series:
---------------------
N = 1000;                     % number of data points
t = [1:N] / N;                % time
x = 2*randn(1,N) - 1;         % fake time series (original data)
[b,a] = butter (6, 0.3);      % filter coefficients for a 6th-order Butterworth filter
y = filter (b, a, x);         % filter the original data
X = fft (x); X = X(1:N/2);    % Fourier transform of the original data (left half of the spectrum)
Y = fft (y); Y = Y(1:N/2);    % Fourier transform of the filtered data (left half of the spectrum)
trsf = 20*log10 (abs (Y./X)); % transfer function in dB
f = [1:N/2];                  % frequency
semilogx (f, trsf)            % plot the transfer function
---------------------
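(For cross-checking, here is the same computation translated to Python/SciPy -- this assumes scipy and numpy are available; it is a sketch, not the original Octave code. `freqz` evaluates the filter's exact transfer function on the same frequency grid as the FFT bins, so the smooth theoretical curve can be compared directly against the noisy FFT-ratio estimate:)

```python
import numpy as np
from scipy.signal import butter, lfilter, freqz

N = 1000
rng = np.random.default_rng(0)            # fixed seed so the run is reproducible
x = 2 * rng.standard_normal(N) - 1        # fake time series (original data)

b, a = butter(6, 0.3)                     # 6th-order Butterworth low-pass
y = lfilter(b, a, x)                      # filter the original data

X = np.fft.fft(x)[: N // 2]               # left half of the input spectrum
Y = np.fft.fft(y)[: N // 2]               # left half of the output spectrum
est = 20 * np.log10(np.abs(Y / X))        # noisy empirical transfer function (dB)

# freqz with worN = N//2 evaluates H on w = pi*k/(N//2), i.e. exactly the
# FFT bin frequencies used above, so `est` and `exact` line up bin by bin.
w, h = freqz(b, a, worN=N // 2)
exact = 20 * np.log10(np.abs(h))          # exact transfer function (dB)
```

Plotting `est` against `exact` shows the empirical curve scattering around the smooth theoretical one, with the scatter concentrated in the stopband where the true output spectrum is tiny.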
Matthias
_______________________________________________
Help-octave mailing list
address@hidden
https://www-old.cae.wisc.edu/mailman/listinfo/help-octave