
I create a CSV file in Rails 3.1 by fetching an external data table (the SELECT takes about 70 seconds), then looping through that table and writing each line of the CSV. My data source usually has around 500,000 rows. The whole thing takes 10 minutes to download in the browser. I use the following code to generate the CSV:

<%=
  response.content_type = 'application/octet-stream'

  CSV.generate(:col_sep => ?; ) do |csv|
    csv << ['date', 'open', 'high', 'low', 'close']
    @sql_testa.each do |row|
      csv << row
    end
  end
%>

dev.log

500 000 rows: Rendered forex/index.csv.erb (965590.2ms) Completed 200 OK in 965621ms (Views: 965598.2ms | ActiveRecord: 3.0ms) CSV benchmark 963245.2ms

50 000 rows: Rendered forex/index.csv.erb (4986.3ms) Completed 200 OK in 5022ms (Views: 4994.3ms | ActiveRecord: 5.0ms) CSV benchmark 4413.3ms

How can I optimize it? Should I use Objective-C to extend Ruby? (And how?)

I deleted CSV.generate and now use just this code in index.csv.erb:

$sql_testa.map{ |i|  %Q('#{i}') }.join(",").delete("'[").gsub("],", "\n").delete('""').gsub(",", ";").gsub(/]\Z/,'')

Surprisingly, the CSV file is generated correctly without CSV.generate as well, and it now takes 5 seconds. So I solved it.
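
For reference, roughly the same semicolon-separated output could be built more directly than with that chain of gsub/delete calls (a sketch only, assuming each element of @sql_testa is a plain array of values):

header = %w[date open high low close].join(';')
rows   = @sql_testa.map { |row| row.join(';') }.join("\n")
"#{header}\n#{rows}"

The final string is the template's return value, so it is sent to the client just like the CSV.generate result was.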

Alex D.

2 Answers


Have you checked where the bottlenecks are? Are they in the CSV library, or in sending the data to the client?

I would reduce the scope to this method and check how fast it executes on its own.
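
A minimal way to time the CSV generation in isolation (a sketch, assuming @sql_testa is already loaded; Benchmark and CSV are both in the standard library):

require 'benchmark'
require 'csv'

# Time only the CSV generation, independent of ERB rendering and network transfer
time = Benchmark.measure do
  CSV.generate(:col_sep => ';') do |csv|
    csv << ['date', 'open', 'high', 'low', 'close']
    @sql_testa.each { |row| csv << row }
  end
end
Rails.logger.info("CSV generation took #{time.real} s")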

If it's fast enough, I would try generating the data into a temporary file and sending it to the client in a faster way (for example, as a static file).
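
Roughly like this (a sketch only; the file path, filename, and the index action are assumptions, not your actual code):

require 'csv'

def index
  path = Rails.root.join('tmp', 'forex_export.csv')

  # Write the CSV straight to disk instead of building one huge string in memory
  CSV.open(path, 'w', :col_sep => ';') do |csv|
    csv << ['date', 'open', 'high', 'low', 'close']
    @sql_testa.each { |row| csv << row }
  end

  # Let Rack / the web server stream the finished file to the client
  send_file path, :type => 'text/csv', :filename => 'export.csv'
end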

KARASZI István

It's probably not the file-creation time so much as the download time. Try using the FasterCSV gem, and also try the rubyzip gem to zip the file prior to download.
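
The zipping step could look roughly like this (a sketch using the current rubyzip API; older 0.9.x versions of the gem used require 'zip/zip' and Zip::ZipFile instead, and the paths and filenames here are assumptions):

require 'zip'

csv_path = Rails.root.join('tmp', 'forex_export.csv')
zip_path = Rails.root.join('tmp', 'forex_export.zip')

# Package the already-generated CSV so the browser downloads a much smaller file
Zip::File.open(zip_path.to_s, Zip::File::CREATE) do |zipfile|
  zipfile.add('export.csv', csv_path.to_s)
end

send_file zip_path, :type => 'application/zip', :filename => 'export.zip'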

RubyRedGrapefruit
  • Do I need to use the FasterCSV gem with Ruby 1.9.2? I found [here](http://stackoverflow.com/questions/6090799/fastercsv-error-with-ruby-1-9-2) that in 1.9 it is the standard library's CSV. – Alex D. Nov 26 '11 at 20:32