Computing autocorrelation with FFT using the JTransforms library
I'm trying to calculate the autocorrelation of sample windows in a time series using the code below. I apply an FFT to the window, then replace each complex bin with its squared magnitude (setting the imaginary part to zero), and finally take the inverse transform to obtain the autocorrelation:
DoubleFFT_1D fft = new DoubleFFT_1D(magCnt);
fft.realForward(magFFT);
magFFT[0] = (magFFT[0] * magFFT[0]);
for (int i = 1; i < (magCnt - (magCnt % 2)) / 2; i++) {
    magFFT[2*i] = magFFT[2*i] * magFFT[2*i] + magFFT[2*i + 1] * magFFT[2*i + 1];
    magFFT[2*i + 1] = 0.0;
}
if (magCnt % 2 == 0) {
    magFFT[1] = (magFFT[1] * magFFT[1]);
} else {
    magFFT[magCnt/2] = (magFFT[magCnt-1] * magFFT[magCnt-1] + magFFT[1] * magFFT[1]);
}
autocorr = new double[magCnt];
System.arraycopy(magFFT, 0, autocorr, 0, magCnt);
DoubleFFT_1D ifft = new DoubleFFT_1D(magCnt);
ifft.realInverse(autocorr, false);
for (int i = 1; i < autocorr.length; i++)
    autocorr[i] /= autocorr[0];
autocorr[0] = 1.0;
The first question: as you can see, this code maps the autocorrelation result into the [0, 1] range, although correlation is supposed to lie between -1 and 1. Of course it would be easy to rescale the results into [-1, 1], but I'm not sure that mapping is correct. How should the values in the resulting autocorr array be interpreted?
Secondly, with this code I'm getting good results for some periodic series, that is, I get higher values at the autocorrelation indices corresponding to the signal's period. However, the results go weird when I apply it to non-periodic signals: all the values in the autocorr array appear to be very close to 1. What is the reason for that?
Answer
For FFT-based algorithms to work, you must pay careful attention to the definitions, including conventions of the library you are using. You seem to be confusing the "signal processing" convention for AC and the "statistical" one. And then there is FFT wrapping and zero padding.
Here is code that works for the even-N case, using the signal-processing convention. It's tested against a brute-force wrapped autocorrelation. The comments show how to convert it to the statistical convention. For the statistical ac, the mean of the data is subtracted out. This can be done merely by zeroing out the "0 Hz" component of the FFT. Then the zeroth element of the ac is the variance, and you can normalize by dividing through by this quantity. The resulting values will fall in -1..1 as you say.
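To see why zeroing the 0 Hz bin is the same thing as subtracting the mean, here is a small self-check using a naive DFT instead of JTransforms (the class and method names are made up for this sketch): bin 0 holds the sum of the samples, i.e. n times the mean, so removing that term from the inverse transform removes the mean from every sample.

```java
public class ZeroDcDemo {

    /** Naive forward DFT, zero out bin 0, naive inverse DFT.
     *  The result should equal x with its mean subtracted. */
    public static double[] zeroDcViaDft(double[] x) {
        int n = x.length;
        double[] re = new double[n], im = new double[n];
        // Forward DFT: X[k] = sum_t x[t] * e^{-2*pi*i*k*t/n}
        for (int k = 0; k < n; k++) {
            for (int t = 0; t < n; t++) {
                double a = -2 * Math.PI * k * t / n;
                re[k] += x[t] * Math.cos(a);
                im[k] += x[t] * Math.sin(a);
            }
        }
        re[0] = 0;      // bin 0 is n * mean; zeroing it subtracts the mean
        im[0] = 0;
        // Inverse DFT, real part only (input was real, so this is exact)
        double[] y = new double[n];
        for (int t = 0; t < n; t++) {
            for (int k = 0; k < n; k++) {
                double a = 2 * Math.PI * k * t / n;
                y[t] += re[k] * Math.cos(a) - im[k] * Math.sin(a);
            }
            y[t] /= n;
        }
        return y;
    }
}
```

For example, {1, 2, 3, 4} has mean 2.5, and the round trip with bin 0 zeroed returns {-1.5, -0.5, 0.5, 1.5}.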
Your code seems to be doing the dividing through, but not ignoring the 0 Hz component of the data. So it's computing some kind of mashup of the conventions.
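For reference, the statistical convention on its own might look like the sketch below. It is deliberately brute force (no JTransforms dependency), and the class name is made up: subtract the mean, compute the wrapped autocorrelation, and normalize by the lag-0 value. By Cauchy-Schwarz every normalized value then lies in [-1, 1].

```java
public class StatAutocorrelation {

    /** Wrapped autocorrelation under the statistical convention:
     *  mean-subtracted, normalized so ac[0] == 1 and |ac[j]| <= 1. */
    public static double[] statisticalAutocorrelation(double[] x) {
        int n = x.length;
        double mean = 0;
        for (double v : x) mean += v;
        mean /= n;
        double[] c = new double[n];          // mean-removed copy
        for (int i = 0; i < n; i++) c[i] = x[i] - mean;
        double[] ac = new double[n];
        for (int j = 0; j < n; j++)
            for (int i = 0; i < n; i++)
                ac[j] += c[i] * c[(i + j) % n];
        double denom = ac[0];                // n * variance (assumed nonzero)
        for (int j = 0; j < n; j++) ac[j] /= denom;
        return ac;
    }
}
```

A period-4 signal such as {0, 1, 0, -1, 0, 1, 0, -1} then gives ac[4] = 1 and ac[2] = -1, exactly the kind of signed peaks the question was expecting.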
import edu.emory.mathcs.jtransforms.fft.DoubleFFT_1D;

import java.util.Arrays;

public class TestFFT {

    void print(String msg, double[] x) {
        System.out.println(msg);
        for (double d : x) System.out.println(d);
    }

    /**
     * This is a "wrapped" signal processing-style autocorrelation.
     * For "true" autocorrelation, the data must be zero padded.
     */
    public void bruteForceAutoCorrelation(double[] x, double[] ac) {
        Arrays.fill(ac, 0);
        int n = x.length;
        for (int j = 0; j < n; j++) {
            for (int i = 0; i < n; i++) {
                ac[j] += x[i] * x[(n + i - j) % n];
            }
        }
    }

    private double sqr(double x) {
        return x * x;
    }

    public void fftAutoCorrelation(double[] x, double[] ac) {
        int n = x.length;
        // Assumes n is even. Note that realForward transforms x in place.
        DoubleFFT_1D fft = new DoubleFFT_1D(n);
        fft.realForward(x);
        ac[0] = sqr(x[0]);
        // ac[0] = 0;  // For statistical convention, zero out the mean
        ac[1] = sqr(x[1]);  // Nyquist bin, packed at index 1 for even n
        for (int i = 2; i < n; i += 2) {
            ac[i] = sqr(x[i]) + sqr(x[i + 1]);  // power spectrum |X[k]|^2
            ac[i + 1] = 0;
        }
        DoubleFFT_1D ifft = new DoubleFFT_1D(n);
        ifft.realInverse(ac, true);
        // For statistical convention, normalize by dividing through with variance
        // for (int i = 1; i < n; i++)
        //     ac[i] /= ac[0];
        // ac[0] = 1;
    }

    void test() {
        double[] data = {1, -81, 2, -15, 8, 2, -9, 0};
        double[] ac1 = new double[data.length];
        double[] ac2 = new double[data.length];
        bruteForceAutoCorrelation(data, ac1);
        fftAutoCorrelation(data, ac2);
        print("bf", ac1);
        print("fft", ac2);
        double err = 0;
        for (int i = 0; i < ac1.length; i++)
            err += sqr(ac1[i] - ac2[i]);
        System.out.println("err = " + err);
    }

    public static void main(String[] args) {
        new TestFFT().test();
    }
}
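The comment in bruteForceAutoCorrelation about zero padding can be made concrete with a small sketch (class name made up; brute force rather than FFT for clarity): padding the data with n zeros makes every wrap-around term multiply a zero, so the wrapped sum on the padded array equals the "true" linear autocorrelation.

```java
public class PaddedAutocorrelation {

    /** "True" (linear) autocorrelation via zero padding to length 2n:
     *  the wrapped sum over the padded array has no surviving
     *  wrap-around terms, so ac[j] = sum_{i=0}^{n-1-j} x[i] * x[i+j]. */
    public static double[] linearAutoCorrelation(double[] x) {
        int n = x.length;
        double[] p = new double[2 * n];      // second half stays zero
        System.arraycopy(x, 0, p, 0, n);
        double[] ac = new double[n];
        for (int j = 0; j < n; j++)
            for (int i = 0; i < 2 * n; i++)
                ac[j] += p[i] * p[(i + j) % (2 * n)];
        return ac;
    }
}
```

For {1, 2, 3} this yields {14, 8, 3}: 1+4+9, 1*2+2*3, and 1*3, with no wrap-around contributions.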