I'm working with an API that fetches a file from a remote server. The file is roughly 110,592 bytes (about 0.11 MB).

However, I have a loop that iterates over $data, and it occasionally fails with:

PHP Fatal error:  Out of memory (allocated 97255424) (tried to allocate 87 bytes)

I'm not sure why this only happens occasionally. I have very similar code on another page, and it typically runs with zero problems.

$json = file_get_contents('https://api.com/?limit=200');
$data = json_decode($json);

$x = 0;
while($x < sizeof($data)) {
    $Id = $data[$x]->id;
    $Name = $data[$x]->name; //ERROR LINE
    $price = $data[$x]->price_usd;
    $time = $data[$x]->last_updated; //SOMETIMES ERROR LINE
    $sql = "INSERT INTO table (Id, name, price, time) 
            VALUES ('".$Id."', '".$Name."', '".$price."', '".$time."')";

    if ($conn->query($sql) === TRUE) {
        //echo "New record created successfully<br><br><br>";
    } else {
        die("Error: " . $sql . "<br>" . $conn->error."<br><br><br>");
    }

    $x = $x+1;
}

Any advice about this would be great. (I've simplified the code a bit; the real loop has more variables, and all of them are inserted into the SQL table.)

EDIT: My phpinfo() shows memory_limit is set to 1024M.

Nick Chubb
  • You can change the memory limit on the fly: ini_set('memory_limit','nM'); where n is the amount you want to increase it to. – dudeman Mar 14 '18 at 22:13
  • 97Meg of memory allocated to PHP is hardly enough for BIG DATA – RiggsFolly Mar 14 '18 at 22:13
  • Your script is wide open to [SQL Injection Attack](http://stackoverflow.com/questions/60174/how-can-i-prevent-sql-injection-in-php) Even [if you are escaping inputs, it's not safe!](http://stackoverflow.com/questions/5741187/sql-injection-that-gets-around-mysql-real-escape-string) Use [prepared parameterized statements](http://php.net/manual/en/mysqli.quickstart.prepared-statements.php) – RiggsFolly Mar 14 '18 at 22:14 (a sketch of the prepared-statement version follows these comments)
  • memory_limit 1024M <---- This is my php memory limit from php info @dudeman - Is my webhost restricting me? – Nick Chubb Mar 14 '18 at 22:24
  • @Joe, maybe. Can you page your api requests? $json = file_get_contents('https://api.com/?limit='.$pagesize.'&offset='.$pagenumber); or something like that. – dudeman Mar 14 '18 at 22:33
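
Since the comments point at prepared statements, here is a minimal sketch of what the question's insert loop could look like using mysqli prepared statements. It assumes the same $conn connection and the table/column names from the question:

$stmt = $conn->prepare("INSERT INTO `table` (Id, name, price, time) VALUES (?, ?, ?, ?)");
$id = $name = $price = $time = null;
$stmt->bind_param('ssss', $id, $name, $price, $time); // bound by reference, so reassigning below is enough
foreach ($data as $row) {
    $id    = $row->id;
    $name  = $row->name;
    $price = $row->price_usd;
    $time  = $row->last_updated;
    if ($stmt->execute() === false) {
        die("Error: " . $stmt->error);
    }
}
$stmt->close();

Besides closing the injection hole, preparing once and executing inside the loop avoids re-parsing the INSERT statement on every iteration.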

1 Answer

I'm not sure what "api.com" (the API you are talking to) allows you to do, but you should be able to reduce the amount of data you have to work on per request by paging it. Maybe something like this:

$pagesize = 100;
$pagenumber = 0;
do {
    $json = file_get_contents('https://api.com/?limit='.$pagesize.'&offset='.($pagenumber * $pagesize));
    $data = json_decode($json);
    // ...do all the stuff with $data...
    $pagenumber++;
} while (!empty($data));
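
Note the stop condition above assumes the API returns an empty response once you page past the last record; adjust the while test to whatever "api.com" actually returns (some APIs report a total count or a next-page token instead).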
dudeman
  • Thank you for the answer @dudeman. I actually found the issue. It was because of an SQL query above this code. I was working with a big table and there was no LIMIT in the query. So it would try to load the entire table. – Nick Chubb Mar 14 '18 at 22:51
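
For anyone landing here with the same symptom: as the comment above explains, the culprit was an earlier SELECT with no LIMIT that buffered a large table into PHP memory. A rough sketch of two possible fixes, using placeholder table/column names (big_table, id, name are illustrative, not from the original post):

// Option 1: only fetch the rows you actually need.
$result = $conn->query("SELECT id, name FROM big_table ORDER BY id LIMIT 500");

// Option 2: stream the result set unbuffered so mysqli does not hold
// every row in PHP memory at once. Caveat: the connection cannot issue
// another query until this result is freed.
$result = $conn->query("SELECT id, name FROM big_table", MYSQLI_USE_RESULT);
while ($row = $result->fetch_assoc()) {
    // ...process one row at a time...
}
$result->free();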