I have a big CSV with 116+ million observations. I need to break it down into many smaller CSVs so that I can run it through a different program that has a restrictive file size limit. Is there a way to:
read df <- bigfile.csv
split df into 30 pieces by rows, keeping the header in each piece
write out all 30 pieces as littlefile_1.csv, littlefile_2.csv, etc.
I know this is pretty rudimentary, but I am pretty new to R.
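
For reference, here is a minimal base R sketch of roughly what I'm picturing (assuming the whole file fits in memory; I've read that data.table::fread might be faster for a file this size, but I'm not sure this is the right approach):

    # read the big file (slow for 116M rows, but keeping it simple)
    df <- read.csv("bigfile.csv")

    # assign each row to one of 30 roughly equal chunks
    n_chunks <- 30
    chunk_id <- cut(seq_len(nrow(df)), breaks = n_chunks, labels = FALSE)

    # split into a list of 30 data frames and write each out with its own header
    pieces <- split(df, chunk_id)
    for (i in seq_along(pieces)) {
      write.csv(pieces[[i]], paste0("littlefile_", i, ".csv"), row.names = FALSE)
    }

Is something along these lines reasonable, or is there a better way to do this?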