
I have a problem with a simple UPDATE query in my console command (Symfony based).

This is my code:

private function showMemoryUsage()
{
    // Report the current memory usage in the largest unit that stays below 1024.
    $memory = memory_get_usage();
    $memory /= 1024;
    if ($memory < 1024) {
        $this->output->writeln('Memory usage: ' . round($memory, 2) . ' KB');
        return;
    }

    $memory /= 1024;
    if ($memory < 1024) {
        $this->output->writeln('Memory usage: ' . round($memory, 2) . ' MB');
        return;
    }

    $memory /= 1024;
    if ($memory < 1024) {
        $this->output->writeln('Memory usage: ' . round($memory, 2) . ' GB');
        return;
    }
}

// ...

$stmt = $conn->prepare("UPDATE product_counters SET counter = :count WHERE id = :id");
$this->showMemoryUsage();

foreach ($cursor as $k => $counterData) {
    // skip documents without an _id
    if ($counterData['_id']) {
        $stmt->execute([':count' => (int)$counterData['count'], ':id' => (int)$counterData['_id']]);
        $stmt->closeCursor();
    }
}

$this->showMemoryUsage();

I have to update 88,000 rows.

The first memory reading is only 14 MB; the second is about 400 MB. How can I decrease this usage? I've tried gc_enable() with gc_collect_cycles(), and I've tried to unset $stmt after the foreach, but with no result. Can you suggest a way to decrease the memory usage?
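For reference, this is roughly what I tried with gc_enable()/gc_collect_cycles() and unsetting the statement (a sketch, not my exact command; the $processed counter only exists here to trigger a collection every 1000 rows):

gc_enable();

$stmt = $conn->prepare("UPDATE product_counters SET counter = :count WHERE id = :id");
$this->showMemoryUsage();

$processed = 0;
foreach ($cursor as $k => $counterData) {
    if ($counterData['_id']) {
        $stmt->execute([':count' => (int)$counterData['count'], ':id' => (int)$counterData['_id']]);
        $stmt->closeCursor();
    }

    // force a collection cycle every 1000 rows
    if (++$processed % 1000 === 0) {
        gc_collect_cycles();
    }
}

// drop the statement and collect once more before the second measurement
unset($stmt);
gc_collect_cycles();

$this->showMemoryUsage();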

michail_w
    Why `$stmt=null;`? – hjpotter92 Apr 24 '16 at 13:46
  • Don't nullify your statement; that doesn't do anything for your code at all, it makes no sense. – N.B. Apr 24 '16 at 13:57
  • Sorry, it's a mistake. I've tried many possibilities. I've edited the first post. Now it's correct, but memory is still overloaded. – michail_w Apr 24 '16 at 17:17
  • Where are you getting the values to update your table? From an API, a feed, another table, user input? – HddnTHA Apr 24 '16 at 17:22
  • I'm getting them from MongoDB, just before this. It's a simple `array('id' => 'value')`. The data from MongoDB is paginated into 1000-row pages. While I'm processing a single page, I run the PDO update statement shown in the first post. I've checked the memory usage of the Mongo queries and of the data I get from Mongo; it doesn't overload my memory as much as this PDO statement does. Mongo takes about 60 MB of data, and when I unset those values, the memory is freed. I'm sure that PDO is the problem here. – michail_w Apr 24 '16 at 17:27
  • Take a look at http://stackoverflow.com/questions/6895098/pdo-mysql-memory-consumption-with-large-result-set – HddnTHA Apr 24 '16 at 17:31
  • @HddnTHA thank you, but it doesn't solve my problem. I don't fetch any data, because I'm running an update, not a select. I have tried the buffered queries option, but without a positive result. – michail_w Apr 24 '16 at 20:25
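A rough sketch of the buffered-queries attempt mentioned in the last comment (it assumes $conn is a raw PDO instance; with a Doctrine DBAL connection the option would have to be passed through the driver options instead):

// Unbuffered mode only changes how SELECT result sets are streamed from MySQL,
// so it is not expected to affect plain UPDATE statements; this is what was tried.
$conn->setAttribute(PDO::MYSQL_ATTR_USE_BUFFERED_QUERY, false);

$stmt = $conn->prepare("UPDATE product_counters SET counter = :count WHERE id = :id");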

0 Answers