I'm provided with a .txt file every day which contains semicolon-separated data. Users of my app are meant to upload this file to the database daily.
Currently, I'm reading and storing the information like this:
$array = array();
// Split the uploaded file into lines, then split each line on ";"
$csv = str_getcsv($request->file, "\n");
foreach ($csv as &$row) {
    $row = str_getcsv($row, ";");
    $array[] = $row;
}
// Drop the header row
array_splice($array, 0, 1);

foreach ($array as &$row) {
    // Find an existing record by (col2, col3) or start a new one,
    // then assign all 73 columns and save one row at a time
    $query = Table::firstOrNew(['col2' => $row[1], 'col3' => $row[2]]);
    $query->col1 = $row[0];
    $query->col2 = $row[1];
    $query->col3 = $row[2];
    $query->col4 = $row[3];
    $query->col5 = $row[4];
    // [...]
    $query->col72 = $row[71];
    $query->col73 = $row[72];
    $query->save();
}
The problem with this method is that it takes too long to run: the volume of data is about 5000 entries a day, and the code above needs roughly 2 minutes to complete. As you can see, the number of columns is large, the data has to be parsed as a CSV file, and I cannot discard any of the columns.
The problem gets much worse if, for some reason, one or more users have to upload several days' or even a month's worth of data at once.
I need to figure out a better way to handle this. I've searched around for a solution, but the best suggestion I could find was to use a for loop instead of foreach, which didn't really solve the issue.
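For what it's worth, the direction I've been considering is to stop saving one model at a time and instead hand the rows to the database in batches. The sketch below assumes Laravel 8+ (for Eloquent's upsert()) and a unique index covering (col2, col3) so the upsert can match existing rows; I'm not sure whether this is the right approach:

// Build the full list of column names once: col1 ... col73.
$columns = [];
for ($i = 1; $i <= 73; $i++) {
    $columns[] = 'col' . $i;
}

// Map each parsed CSV row onto the named columns.
$rows = [];
foreach ($array as $row) {
    $rows[] = array_combine($columns, array_slice($row, 0, 73));
}

// Insert/update in chunks so each statement stays a reasonable size.
// upsert() matches on (col2, col3) and updates every other column on conflict.
foreach (array_chunk($rows, 500) as $chunk) {
    Table::upsert(
        $chunk,
        ['col2', 'col3'],
        array_values(array_diff($columns, ['col2', 'col3']))
    );
}

The chunk size of 500 is arbitrary; the idea is just to keep each statement under the database's parameter limits. Is something along these lines the right way to go, or is there a better option?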