I have a large CSV file, around 6 GB with 37,000,000 lines. I need to upload all of these lines using the sample request below:
curl --location --request POST 'http://localhost:7234/feedback/ingest/csv' \
--header 'charset: UTF-8' \
--form 'file=@"/home/new_file_1.csv"'
The destination API has a limit of 2 MB per request (roughly 12,000 lines). I also have a disk constraint (at most 400 MB free), so I can't split the file into many small files up front. The only way I have found in Python is to iterate over the rows, dump each chunk (~10,000 lines) into a temp file, and fire a POST request for it, as sketched below. Is there a better way?
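For reference, here is a minimal sketch of the chunk-and-temp-file approach I described, assuming the requests library is available and that the endpoint accepts each chunk as its own multipart upload. The URL, form field name, and charset header mirror the curl example above; the temp path, chunk size, and repeating the CSV header per chunk are my assumptions.

import os
import requests

URL = "http://localhost:7234/feedback/ingest/csv"  # from the curl example
SOURCE = "/home/new_file_1.csv"                    # from the curl example
CHUNK_LINES = 10_000                               # stays under the ~2 MB / ~12,000-line limit
TMP_PATH = "/tmp/feedback_chunk.csv"               # assumed scratch location, reused for every chunk

def post_chunk(lines):
    # Write the current chunk to a single reusable temp file, then upload it.
    with open(TMP_PATH, "w", encoding="utf-8") as tmp:
        tmp.writelines(lines)
    with open(TMP_PATH, "rb") as tmp:
        resp = requests.post(
            URL,
            headers={"charset": "UTF-8"},
            files={"file": ("chunk.csv", tmp, "text/csv")},
        )
    resp.raise_for_status()

def main():
    buffer = []
    with open(SOURCE, "r", encoding="utf-8") as src:
        header = src.readline()  # assuming the header row must be repeated in each chunk
        for line in src:
            buffer.append(line)
            if len(buffer) >= CHUNK_LINES:
                post_chunk([header] + buffer)
                buffer.clear()
    if buffer:  # flush the final partial chunk
        post_chunk([header] + buffer)
    if os.path.exists(TMP_PATH):
        os.remove(TMP_PATH)

if __name__ == "__main__":
    main()

Since the same temp file is overwritten on every iteration, disk usage stays at one chunk (~2 MB), but it still feels wasteful to round-trip each chunk through the filesystem just to hand it to requests.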