I have a web crawl script and a scraper script. When I integrate the two, I can get the array I want from every webpage, but I can't save it to a CSV file. I tried this code, but it doesn't work:
require "csv"

CSV.open("scraper.csv", "wb") do |csv|
  csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"]
  $race_data = [date, venue, time, race_number, race_name, track, purse]
  csv << $race_data
end
Well, it does save a CSV file, but the file keeps overwriting itself on every page instead of producing a single file with everything in it.
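I'm guessing the problem is that CSV.open(..., "wb") truncates scraper.csv each time the block runs for a new page, so I'd probably need to either open the file once before the crawl loop or append on each page. Here is a rough sketch of the append idea I had in mind (the save_race_row method name and the header check are just my own guesses, not code from my scripts), though I'm not sure this is the right approach:

require "csv"

# Rough idea: append a row for each scraped page instead of rewriting the file.
# Only write the header row if the file doesn't exist yet or is still empty.
def save_race_row(race_data)
  write_headers = !File.exist?("scraper.csv") || File.zero?("scraper.csv")

  CSV.open("scraper.csv", "ab") do |csv|
    csv << ["date", "venue", "time", "race_number", "race_name", "track", "purse"] if write_headers
    csv << race_data
  end
end

Is appending like this the right way to collect everything into one file, or should I restructure things so the crawler opens the CSV once and passes the open handle to the scraper?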