I've written a script that downloads a CSV file using cURL and then parses the CSV into an array as follows:
    $rows = array_map(function ($line) use ($delimiter) {
        // parse one line of the downloaded CSV into an array of fields
        return str_getcsv($line, $delimiter);
    }, explode("\n", $result));
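For context, $result holds the raw CSV body returned by cURL, roughly like this (a minimal sketch; the URL and delimiter are placeholders):

    // download the CSV into a string; the URL is a placeholder
    $ch = curl_init('https://example.com/export.csv');
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true); // return the body instead of printing it
    $result = curl_exec($ch);
    curl_close($ch);

    $delimiter = ';'; // whatever separator the feed uses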
Then I iterate over $rows with foreach to save each row into the database.
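The loop itself is nothing special; it looks roughly like this (a sketch assuming a PDO connection in $pdo, with a made-up table and columns):

    // prepare once, execute per row; table/columns are placeholders
    $stmt = $pdo->prepare('INSERT INTO items (name, price) VALUES (?, ?)');
    foreach ($rows as $row) {
        // $row[0], $row[1], ... are the CSV columns for this line
        $stmt->execute([$row[0], $row[1]]);
    }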
The script works fine, but with a larger CSV file (> 10,000 lines) it becomes rather slow and throws a lot more errors.
I would like to cut the CSV file into pieces so that the whole file isn't loaded into a single variable. I found the following solution, but it still processes the whole file at once.
Is there a way to cut the CSV into pieces and run the database function several times? Or is there a better way to process a large CSV file like this? For example, would something like the sketch below be the right direction?
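This is what I mean by pieces: writing the download to a file, reading it line by line with fgetcsv instead of exploding one big string, and flushing to the database every N rows (the batch size, file path, and helper function are made up):

    // read the CSV from a file written by cURL instead of a variable
    $handle = fopen('/tmp/import.csv', 'r');
    $batch = [];
    while (($row = fgetcsv($handle, 0, $delimiter)) !== false) {
        $batch[] = $row;
        if (count($batch) >= 1000) {      // flush every 1000 rows
            saveBatchToDatabase($batch);  // hypothetical helper doing the inserts
            $batch = [];
        }
    }
    if ($batch) {
        saveBatchToDatabase($batch);      // remaining rows
    }
    fclose($handle);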
I'm relatively new to processing large files, so please be kind!