
I use the following construction to read a CSV file (about 25,000 lines) in the update() method of my class:

    if (($handle = fopen($this->csv_url, 'r')) !== false) {

        $row = 0;

        while (($data = fgetcsv($handle)) !== false) {
            if ($row === 0) { $row++; continue; } // skip the header row

            // Here I update a few meta fields on the current product;
            // the product is looked up by its SKU.

            unset($data);
            $row++;
        }

        fclose($handle);
    }

Memory runs out at around 3,000 products (250MB is set in the php.ini file).

Does anyone have any ideas? Sorry for my English.

Serg S
  • First of all, you can try to use `$data = null` instead of unset for faster memory freeing (https://stackoverflow.com/questions/584960/whats-better-at-freeing-memory-with-php-unset-or-var-null). Also, what about setting higher `WP_MEMORY_LIMIT` and `WP_MAX_MEMORY_LIMIT`? – H. Bloch Feb 02 '20 at 14:00
  • Thanks for the comment. I have tried using NULL; it still does not work. Also, memory_limit is currently set to 256MB (that is the max). – Serg S Feb 07 '20 at 23:41
  • 256MB isn't the max; for a special use case I've set a certain request up to 4GB (4096MB). – H. Bloch Feb 08 '20 at 20:28
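
As a reference for the WP_MEMORY_LIMIT / WP_MAX_MEMORY_LIMIT suggestion above, a minimal sketch of how those constants can be raised in wp-config.php (the 512M values are placeholders, not a recommendation):

    // wp-config.php: these constants must be defined before wp-settings.php is loaded.
    define( 'WP_MEMORY_LIMIT', '512M' );     // limit used for normal requests
    define( 'WP_MAX_MEMORY_LIMIT', '512M' ); // limit used in the admin and other "max" contexts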

1 Answer


I resolved this issue a while ago; here is what I found:

Inside the loop, I call wp_cache_flush() periodically (I did it every 20 iterations).
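
For illustration, here is a minimal sketch of how that fits into the loop from the question; the every-20-rows threshold follows the description above, and the actual SKU lookup and meta update are left as a placeholder comment:

    if (($handle = fopen($this->csv_url, 'r')) !== false) {
        $row = 0;

        while (($data = fgetcsv($handle)) !== false) {
            if ($row === 0) { $row++; continue; } // skip the header row

            // ... find the product by SKU and update its meta here ...

            // Flush the WordPress object cache periodically so cached
            // post/meta objects do not keep accumulating in memory.
            if ($row % 20 === 0) {
                wp_cache_flush();
            }

            unset($data);
            $row++;
        }

        fclose($handle);
    }

With a non-persistent object cache, everything loaded by calls such as get_post() and get_post_meta() stays in memory for the rest of the request, so flushing it periodically keeps the footprint flat at the cost of re-fetching anything that is needed again.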

As a result, memory usage stays at about 86 MB without accumulating:

[screenshot of memory usage]

Serg S