I am trying to use this as a learning exercise. I have a dataset of 343,345 rows that I need to geocode in ArcGIS, but the full dataset is too large to process in one run. I need to break the data into chunks of 50,000 rows, with the last chunk being a bit smaller.
Currently I am doing this manually, in individual chunks of 50,000, as seen here:
gen_gis_test_1_50 = gen_gis[c(1:50000),]
gen_gis_test_51_100 = gen_gis[c(50001:100000),]
Then I use write.csv() for each new variable I created.
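For reference, each of those manual writes currently looks roughly like this (the file names are just examples):

write.csv(gen_gis_test_1_50, "gen_gis_test_1_50.csv", row.names = FALSE)
write.csv(gen_gis_test_51_100, "gen_gis_test_51_100.csv", row.names = FALSE)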
I've done many of these rote exercises and wanted to see what the best method for writing a function would be. Ideally the function would read the main data frame and spit out a new chunk for every 50,000 rows. Then I'd like a second function that goes through each of those chunks and writes it out as a CSV.
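Here is a rough sketch of the sort of thing I have in mind (the function names split_into_chunks and write_chunks, and the file prefix, are placeholders I made up), assuming gen_gis is already loaded as a data frame:

# Return a list of data frames, one per chunk of chunk_size rows;
# ceiling() puts the leftover rows into a final, smaller chunk.
split_into_chunks <- function(df, chunk_size = 50000) {
  chunk_id <- ceiling(seq_len(nrow(df)) / chunk_size)
  split(df, chunk_id)
}

# Write each element of the list to its own CSV file:
# prefix_1.csv, prefix_2.csv, ...
write_chunks <- function(chunk_list, prefix = "gen_gis_chunk") {
  for (i in seq_along(chunk_list)) {
    write.csv(chunk_list[[i]],
              file = paste0(prefix, "_", i, ".csv"),
              row.names = FALSE)
  }
}

# Usage:
chunks <- split_into_chunks(gen_gis)
write_chunks(chunks)

Is something like this a reasonable approach, or is there a better/more idiomatic way?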
Many thanks in advance!