I'm working with an API that fetches a file from a remote server. The file is roughly 110,592 bytes (0.11 MB).
However, I have a loop that runs through each element of $data, and it occasionally fails with:
PHP Fatal error: Out of memory (allocated 97255424) (tried to allocate 87 bytes)
I'm not sure why this happens only occasionally. I have very similar code on another page, and it typically works with zero problems.
$json = file_get_contents('https://api.com/?limit=200');
$data = json_decode($json);
$x = 0;
while ($x < sizeof($data)) {
    $Id    = $data[$x]->id;
    $Name  = $data[$x]->name;         // ERROR LINE
    $price = $data[$x]->price_usd;
    $time  = $data[$x]->last_updated; // SOMETIMES ERROR LINE
    $sql = "INSERT INTO table (Id, name, price, time)
            VALUES ('" . $Id . "', '" . $Name . "', '" . $price . "', '" . $time . "')";
    if ($conn->query($sql) === TRUE) {
        //echo "New record created successfully<br><br><br>";
    } else {
        die("Error: " . $sql . "<br>" . $conn->error . "<br><br><br>");
    }
    $x = $x + 1;
}
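As an aside, building the INSERT by string concatenation allocates a fresh SQL string on every iteration and leaves the query open to SQL injection if any field contains a quote. The sketch below assumes $conn is a mysqli connection and $data is the decoded array from the question; it prepares the statement once and rebinds per row (the isset guard just lets the snippet run standalone without a database):

```php
<?php
// Hedged sketch: one prepared statement reused for every row, instead of
// concatenating a new INSERT string each time through the loop.
// Assumes $conn (mysqli) and $data (decoded JSON array) from the question.
$data = $data ?? [];
$sql  = "INSERT INTO table (Id, name, price, time) VALUES (?, ?, ?, ?)";

if (isset($conn)) {
    $stmt = $conn->prepare($sql);
    foreach ($data as $row) {
        // bind_param sends the raw values; no quoting/escaping needed
        $stmt->bind_param("ssss", $row->id, $row->name,
                          $row->price_usd, $row->last_updated);
        if (!$stmt->execute()) {
            die("Error: " . $stmt->error);
        }
    }
    $stmt->close();
}
```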
Any advice about this would be great. (I also simplified the code a bit; the real loop had more variables, and all of them were inserted into the SQL table.)
EDIT: In my PHP configuration, memory_limit is set to 1024M.
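One way to narrow down where the memory goes is to log PHP's real allocation on each pass of the loop. This is a minimal, self-contained sketch with a stand-in JSON string in place of the API response; memory_get_usage(true) reports the same allocation figure the "Out of memory (allocated ...)" fatal refers to, and unset($json) releases the raw string once it has been decoded:

```php
<?php
// Minimal diagnostic sketch: decode a stand-in JSON payload, free the raw
// string, then print allocated memory per iteration to spot growth.
$json = '[{"id":"1","name":"a","price_usd":"1.00","last_updated":"0"},'
      . '{"id":"2","name":"b","price_usd":"2.00","last_updated":"0"}]';
$data = json_decode($json);
unset($json); // release the raw response buffer once decoded

foreach ($data as $i => $row) {
    // memory_get_usage(true) = bytes PHP has actually allocated so far
    printf("row %d: id=%s mem=%d bytes\n", $i, $row->id, memory_get_usage(true));
}
```

If the logged number climbs steadily across iterations, something inside the loop is accumulating; if it is already near the limit before the loop starts, the decode itself is the cost.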