
I have a webcrawl script and a scraper script. When I integrate the two, I can get the array I want from every webpage, but I can't save it to a CSV file. I tried this code, but it doesn't work:

CSV.open("scraper.csv", "wb") do |csv|
  csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"]
  csv << $race_data = [date, venue, time, race_number, race_name, track, purse]
end

Well, it did save a CSV file, but the file keeps overwriting itself instead of producing one file with everything in it.

heartless
    I believe this question has the answer you're looking for: http://stackoverflow.com/questions/4822422/output-array-to-csv-in-ruby – Lucas Moulin Feb 19 '16 at 01:55
  • I did this and the csv file isn't the compilation of the arrays I get. :( the csv file is only getting one row and then keeps on overwriting itself. I'm sorry, I just learned about ruby last week. – heartless Feb 19 '16 at 02:24

1 Answer


Found an answer to this. The problem was that my CSV.open used the "wb" mode, which truncates the file and overwrites it with new data on every call. So, from

CSV.open("scraper.csv", "wb") do |csv|
  csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"]
  csv << $race_data = [date, venue, time, race_number, race_name, track, purse]
end

I changed it to this, using the append mode "ab":

CSV.open("scraper.csv", "ab") do |csv|
  csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"]
  csv << $race_data = [date, venue, time, race_number, race_name, track, purse]
end
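
One caveat with this fix: in "ab" mode the header row is also appended on every call, so the file ends up with a repeated header between data rows. A minimal sketch (using made-up example values in place of the scraped variables) that writes the header only when the file doesn't exist yet:

```ruby
require "csv"

# Hypothetical stand-in for one scraped race (the question's variables).
race_data = ["2016-02-19", "Santa Anita", "13:00", 1,
             "Maiden Claiming", "dirt", "$28,000"]

# Append the header only on the first write, then append the data row.
write_header = !File.exist?("scraper.csv")
CSV.open("scraper.csv", "ab") do |csv|
  if write_header
    csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"]
  end
  csv << race_data
end
```

Each run of the scraper then adds exactly one data row, and the header appears only once at the top of the file.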
heartless