I am extracting data from the server in R and want to export that data (a data frame) to CSV. Since the data is very large, write.csv is taking too long or throwing an error. Is there a faster way to write the data to CSV?
- Try `data.table::fwrite()` – Rohit May 02 '19 at 11:44
- Possible duplicate of [write.csv for large data.table](https://stackoverflow.com/questions/12013953/write-csv-for-large-data-table) – Hector Haffenden May 02 '19 at 11:46
- `fwrite` is the fastest way to write a data table to CSV; it is available in the `data.table` package. https://www.r-bloggers.com/fast-csv-writing-for-r/ – Shijith May 02 '19 at 11:48
- From the server? If it is a database server, consider keeping big data in a DBMS rather than a flat file. – Parfait May 02 '19 at 12:48
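Following the suggestion in the comments above, a minimal `data.table::fwrite()` sketch (the data frame name `df` and output file name are assumptions, not given in the question):

library(data.table)

# fwrite() is multi-threaded and buffers its output, so it is usually
# much faster than write.csv() on large data frames
fwrite(df, file = "df.csv")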
1 Answer
You could separate your data into chunks and append them to your CSV file one by one in a for loop, for example:

write.table(chunk, file = "df.csv", sep = ",", col.names = FALSE, row.names = FALSE, append = TRUE)
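
A minimal sketch of that loop, assuming the data frame is named `df` and using a chunk size of 100,000 rows (both are assumptions, not taken from the answer):

chunk_size <- 100000                          # rows written per iteration (assumed value)
starts <- seq(1, nrow(df), by = chunk_size)   # first row index of each chunk

for (i in seq_along(starts)) {
  rows <- starts[i]:min(starts[i] + chunk_size - 1, nrow(df))
  write.table(df[rows, , drop = FALSE],
              file = "df.csv", sep = ",",
              col.names = (i == 1),           # write the header only with the first chunk
              row.names = FALSE,
              append = (i > 1))               # overwrite on the first chunk, append afterwards
}

Writing the header only with the first chunk and switching to append mode afterwards avoids write.table's warning about appending column names to an existing file.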

Derek van Tilborg