
I have heard that writing for loops in R is particularly slow. I have the following code, which needs to run through 122,000 rows of 513 columns each and transform them using the fft() function:

for (i in 2:100000) {
   Data1[i, 2:513] <- fft(as.numeric(Data1[i, 2:513]), inverse = TRUE) / 512
}

I have tried running this for 1,000 iterations and it took a few minutes... Is there a way to make this loop faster? Maybe by not using a loop at all, or by doing it in C?

Ben Bolker
Kamran

1 Answer


mvfft (documented on the fft help page) was designed to do this all at once. It's hard to imagine how you could do it any faster: less than three seconds (on an older Xeon workstation) for a dataset exactly your size.

n.row <- 122e3
X <- matrix(rnorm(n.row * 512), n.row)    # one series per row, 512 points each
system.time(
  Y <- mvfft(t(X), inverse = TRUE) / 512  # mvfft transforms each column, so transpose first
)

   user  system elapsed
   2.34    0.39    2.75

Note that the discrete FFT in this case has complex values.
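To see that the vectorized call really computes the same thing as a row-by-row loop, here is a small sanity check (a sketch on a toy matrix; the `apply` form stands in for the original for loop):

```r
set.seed(1)
X <- matrix(rnorm(10 * 8), nrow = 10)    # 10 rows, 8 points each

# Loop-style version: inverse FFT of each row, scaled by the row length
loop_result <- t(apply(X, 1, function(r) fft(r, inverse = TRUE) / 8))

# Vectorized version: mvfft works on columns, so transpose in and back out
mv_result <- t(mvfft(t(X), inverse = TRUE) / 8)

max(Mod(loop_result - mv_result))        # should be (numerically) zero
```

Both results are complex matrices of the original shape, so if you only need the real part, take `Re(mv_result)` afterwards.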


FFTs are fast. Typically they can be computed in less time than it takes to read data from an ASCII file (because the character-to-numeric conversions involved in the read take more time than the calculations in the FFT). Your limiting resources therefore are I/O throughput speed and RAM. But 122,000 vectors of 512 complex values occupy "only" about a gigabyte, so you should be ok.
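The "about a gigabyte" figure is easy to verify: R stores each complex value as two doubles, i.e. 16 bytes.

```r
# Back-of-envelope memory estimate: 122,000 rows x 512 complex values x 16 bytes
122e3 * 512 * 16 / 2^30   # roughly 0.93 GiB
```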

whuber