
Suddenly, this line:

$data_to_send = @file_get_contents($source);

is giving me this error:

{"error":{"type":"Symfony\Component\Debug\Exception\FatalErrorException","message":"Allowed memory size of 536870912 bytes exhausted (tried to allocate 353912831 bytes)","file":"/home/forge/biossantibodies.com/app/commands/FileName.php","line":157}}

upgrade VM resources

I already upgraded my Linode VM to this plan, and I'm still seeing the error.


update php.ini

I checked my php.ini and updated it to:

cat php.ini | grep _max                                                                                                         
log_errors_max_len = 1024
post_max_size = 2000M
upload_max_filesize = 2000M
session.gc_maxlifetime = 1440
;       setting session.gc_maxlifetime to 1440 (1440 seconds = 24 minutes):

restart php-fpm service

As you can see, I have already increased the allowed memory to 2000M.

I also restarted php-fpm right after that:

service php5-fpm restart


phpinfo()

  • My changes seem to be reflected.

(screenshots of phpinfo() showing the updated values)


restart entire VM

I even tried to reboot the entire VM.

I still face the same issue. Did I change the wrong file?

How do I double check?
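
One way to double-check is to ask PHP itself, from the same context as the failing command, which php.ini it loaded and which limits are actually in effect (the CLI and FPM SAPIs can load different files); a minimal sketch:

// Sketch: run inside the failing command to see the effective configuration.
echo 'Loaded php.ini:      ' . php_ini_loaded_file() . PHP_EOL;
echo 'memory_limit:        ' . ini_get('memory_limit') . PHP_EOL;
echo 'post_max_size:       ' . ini_get('post_max_size') . PHP_EOL;
echo 'upload_max_filesize: ' . ini_get('upload_max_filesize') . PHP_EOL;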

  • To debug: Add an `echo` that would output the `$source` and filesize just before that line and see what it comes up with. – KIKO Software Jan 31 '18 at 01:46
  • Let me do that now, and will update the post. – code-8 Jan 31 '18 at 01:47
  • @KIKOSoftware : I got the size of file is : `345620` – code-8 Jan 31 '18 at 01:52
  • See it here : https://www.dropbox.com/s/ml2idhm33mw0x5o/%202018-01-30%20at%208.53.14%20PM.png?dl=0 – code-8 Jan 31 '18 at 01:53
  • And is `fire()` only called once? Usually when you run out of memory you're stuck in some kind of loop. – KIKO Software Jan 31 '18 at 01:54
  • The script basically loops through all the rows of a database table, exports them to a CSV file, and uploads it to an SFTP server. Does it loop through? Yes. As for where it gets stuck, I think it is during the upload, but I am not sure. – code-8 Jan 31 '18 at 01:56
  • The code has been working for 3 years, but now the file size is getting huge. Is there any PHP or Nginx configuration I should look into changing? I have a feeling the code is less suspicious at this point, since it used to work. – code-8 Jan 31 '18 at 01:58
  • I don't know exactly what your code does, but given a file size of 345620 (bytes?) the line generating the error is probably not the exact line where the problem is. I think it has something to do with the code since 536870912 bytes is quite a lot. You could use the next command to check memory usage during execution: http://php.net/manual/en/function.memory-get-usage.php again with an echo. – KIKO Software Jan 31 '18 at 02:01
  • It may be worth checking whether you have a different `php.ini` file for the command line. – fubar Jan 31 '18 at 02:31
  • What commands should I run? But I checked phpinfo() already. – code-8 Jan 31 '18 at 02:41
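
A minimal sketch of the memory_get_usage() check suggested in the comments above, assuming an export loop shaped roughly like the one described (the $rows result, CSV path, and row format are placeholders, not the actual code):

// Sketch only: $rows, the CSV path, and the row format are assumptions
// standing in for the real export code described in the comments.
$out = fopen('/tmp/export.csv', 'w');

foreach ($rows as $i => $row) {
    fputcsv($out, $row);

    if ($i % 1000 === 0) {
        // If this number climbs steadily, the memory is going to the loop
        // (e.g. the full result set), not to file_get_contents() itself.
        error_log('row ' . $i . ': ' . memory_get_usage(true) . ' bytes');
    }
}

fclose($out);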

1 Answer


"Allowed memory size of 536870912 bytes exhausted (tried to allocate 353912831 bytes)"

This means you need to also update your php.ini memory_limit directive.

Try putting this in your php.ini:

memory_limit=1024M
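
If you prefer to raise the limit only for this one command rather than globally, the same directive can also be set at runtime near the top of the script; a minimal sketch (the value is just an example, not a recommendation):

// Sketch: raise memory_limit for this script only, before the large allocation.
ini_set('memory_limit', '1024M');
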
  • Good point, I updated mine from 1024M to 4096M. Tested it with phpinfo() and it shows correctly, but on re-attempting it is still happening. – code-8 Jan 31 '18 at 11:53
  • Might be worth checking with the hosting company what your memory limit is with them. If they are limiting you to a certain amount, it wouldn't matter what you changed your php.ini to. – sdexp Feb 02 '18 at 10:34
  • See [this question](https://stackoverflow.com/questions/34864524/allowed-memory-size-of-536870912-bytes-exhausted-in-laravel). You should try restarting nginx/apache. – sdexp Feb 02 '18 at 10:52