I receive a CSV file of members once a month containing ~6,000 rows. I'm (trying to) loop through the CSV file, check whether each record already exists in the members table, and if so, whether it holds the same data. I then insert it into the pending table (with an exists flag where appropriate). I'm using Laravel and League\CSV to read the file, which is saved in my storage folder:
class ImportController extends Controller
{
    public function import(Request $request)
    {
        $readDirectory = 'storage/csv/';
        $filename = $request->name;

        $stream = fopen($readDirectory . $filename, 'r');
        $reader = Reader::createFromStream($stream)->setHeaderOffset(0);
        $records = (new Statement())->process($reader);

        // Truncate the imported table prior to import
        Imported::truncate();

        foreach ($records as $record) {
            $email = $record['email'];
            $recordExists = $this->recordExists($email);
            if ($recordExists) {
                // Compare the md5 of the record array and the member array
                // and skip the record if it's the same.
                $memberArray = $this->getmemberArray($recordExists);
                $recordArray = $this->getRecordArray($record);
                if ($memberArray['hash'] === $recordArray['hash']) {
                    continue;
                }
                $record['exists'] = true;
                $this->write($record);
            } else {
                $record['exists'] = false;
                $this->write($record);
                Log::debug("missing: " . $record['URN']);
            }
        }

        return redirect()->route('upload.show');
    }
    public function recordExists($email)
    {
        $member = Member::where('email', 'LIKE', $email)->first();
        if ($member == null) {
            return false;
        }
        return $member;
    }
    public function getmemberArray($member)
    {
        $memberArray = [
            'email' => $member->email,
            'first_name' => $member->first_name,
            'last_name' => $member->last_name,
            'age_years' => $member->age_years,
            'gender' => $member->gender,
            'address_1' => $member->address_1,
            'address_2' => $member->address_2,
            'address_3' => $member->address_3,
            'town' => $member->town,
            'county' => $member->county,
            'postcode' => $member->postcode,
            'sport_1' => $member->sport_1,
            'sport_2' => $member->sport_2,
        ];
        $memberArray['hash'] = md5(json_encode($memberArray));
        return $memberArray;
    }
    public function getRecordArray($record)
    {
        $recordArray = [
            'email' => $record['email'],
            'first_name' => $record['first_name'],
            'last_name' => $record['last_name'],
            'age_years' => $record['age_years'],
            'gender' => $record['gender'],
            'address_1' => $record['address_1'],
            'address_2' => $record['address_2'],
            'address_3' => $record['address_3'],
            'town' => $record['town'],
            'county' => $record['county'],
            'postcode' => $record['postcode'],
            'sport_1' => $record['sport_1'],
            'sport_2' => $record['sport_2'],
        ];
        $recordArray['hash'] = md5(json_encode($recordArray));
        return $recordArray;
    }
    public function write($record)
    {
        $import = [];
        $import['email'] = $record['email'];
        $import['first_name'] = $record['first_name'];
        $import['last_name'] = $record['last_name'];
        $import['age_years'] = $record['age_years'];
        $import['gender'] = $record['gender'];
        $import['address_1'] = $record['address_1'];
        $import['address_2'] = $record['address_2'];
        $import['address_3'] = $record['address_3'];
        $import['town'] = $record['town'];
        $import['county'] = $record['county'];
        $import['postcode'] = $record['postcode'];
        $import['sport_1'] = $record['sport_1'];
        $import['sport_2'] = $record['sport_2'];
        $import['exists'] = $record['exists'];

        DB::table('imported')->insert($import);
        Log::debug($record['email']);

        return true;
    }
}
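For what it's worth, the change detection boils down to comparing md5(json_encode(...)) of the two field arrays. Here's a minimal standalone illustration of that technique in plain PHP (no Laravel; `rowHash` is a made-up helper, not part of my controller):

```php
<?php
// Standalone sketch of the md5(json_encode()) change-detection idea.
// Caveat: json_encode() preserves key order, so the member array and the
// CSV record array must build their keys in the same order for identical
// data to produce identical hashes.
function rowHash(array $row): string
{
    return md5(json_encode($row));
}

$memberRow = ['email' => 'a@example.com', 'town' => 'Leeds'];
$csvRow    = ['email' => 'a@example.com', 'town' => 'Leeds'];
$changed   = ['email' => 'a@example.com', 'town' => 'York'];

var_dump(rowHash($memberRow) === rowHash($csvRow));  // bool(true)  -> skip the row
var_dump(rowHash($memberRow) === rowHash($changed)); // bool(false) -> write the row
```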
But I keep getting:
Symfony\Component\Debug\Exception\FatalErrorException (E_UNKNOWN)
Allowed memory size of 134217728 bytes exhausted (tried to allocate 181321056 bytes)
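(For reference, converting the figures in the error to megabytes shows the limit is the default 128M, and the failed allocation alone is ~173M:)

```php
<?php
// The numbers from the fatal error, converted to megabytes.
echo 134217728 / 1024 / 1024, " MB limit\n";               // 128 MB
echo round(181321056 / 1024 / 1024, 1), " MB requested\n"; // 172.9 MB
```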
It works if I use far fewer rows in my CSV, but that's not an option. I was previously writing to the DB using eloquent->save(), but changed it to DB::table()->insert to improve performance.
I've already added the following for testing purposes, but it's still breaking:
set_time_limit(0);
ini_set('max_execution_time', 100000);
ini_set('memory_limit', '512M');
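(To rule out the override silently failing, I understand ini_set() returns the old value on success and false on failure, so the result can be checked like this, outside the app:)

```php
<?php
// Verify the memory_limit override actually took effect:
// ini_set() returns the previous value on success, false on failure.
$old = ini_set('memory_limit', '512M');
var_dump($old);                    // previous limit, or bool(false) on failure
var_dump(ini_get('memory_limit')); // should now report "512M"
```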
Am I missing something? Is there a memory leak somewhere? I'm guessing it's keeping each record in memory, so is there any way to make it forget after each row?
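(To clarify what I mean by "forgetting" a row, here's a standalone sketch of the only mechanisms I'm aware of: dropping the reference with unset() and forcing a collection pass with gc_collect_cycles(). I'm not sure these apply inside the League\CSV loop.)

```php
<?php
// Allocate a large buffer, then unset it and force a GC pass,
// comparing memory_get_usage() before and after.
$before = memory_get_usage();

$buffer = [];
for ($i = 0; $i < 50000; $i++) {
    $buffer[] = str_repeat('x', 32);
}
$peak = memory_get_usage();

unset($buffer);      // drop the only reference to the rows
gc_collect_cycles(); // only matters for cyclic object references

$after = memory_get_usage();
echo $peak > $before ? "grew\n" : "did not grow\n";
echo $after < $peak ? "freed\n" : "not freed\n";
```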
ALSO: Is there a way to clear this memory so that I can edit the code and retry? Even if I stop and re-run php artisan serve, it still shows the same error message.