
For a REST API, I need to serialize big and deeply nested JSON objects.

Is there a way to echo the JSON while it is being generated?

If I echo json_encode($var), it generates the whole JSON string and then spits it out: that costs memory, and I have to wait for the whole encoding to finish before I start to see any response in my browser.

Is there any way to force json_encode to have this kind of behaviour?

Or are there libraries that can do this kind of stuff?

  • No, unless you want to roll your own encoder. A JSON-encoded data structure of any reasonable complexity is going to be a huge mess of nested `[ ... ]` and `{ ... }` pairs. Outputting on the go is possible, but you're highly likely to end up with partial/corrupt strings at the other end, especially if the encoding process causes an OOM condition or similar. – Marc B Aug 01 '14 at 16:12

2 Answers


Your only option is to iterate over the structure you are encoding and encode it item by item, writing the surrounding JSON punctuation (brackets, braces and commas) yourself; a recursive sketch for nested data follows the example below.

For example

echo "["; // or { for an associative array (don't forget to output the json_encode()d keys and their colons)
$first = true;
for ($i = 0; $i < 10000000; $i++) {
    // json does not support trailing commas like php does
    // and you can't do an implode(",", $array) here because of the memory usage
    if ($first) {
        $first = false;
    } else {
        echo ",";
    }
    echo json_encode($i);
}
echo "]"; // or }

taken from here: http://forums.phpfreaks.com/topic/288266-json-encode-exceeds-allowed-memory/

– edmondscommerce
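
Building on that approach, here is a minimal sketch of a recursive variant for nested/associative data. streamJson() is a hypothetical helper, not part of any library; it only uses json_encode() for keys and scalar values and emits the brackets, braces and commas itself. To actually see output arrive progressively in the browser you typically also have to flush PHP's output buffers.

function streamJson($value)
{
    if (is_array($value)) {
        // Sequential integer keys -> JSON array, anything else -> JSON object
        $isList = $value === [] || array_keys($value) === range(0, count($value) - 1);
        echo $isList ? '[' : '{';
        $first = true;
        foreach ($value as $key => $item) {
            if (!$first) {
                echo ',';
            }
            $first = false;
            if (!$isList) {
                echo json_encode((string) $key), ':';
            }
            streamJson($item); // recurse into nested arrays/objects
        }
        echo $isList ? ']' : '}';
    } else {
        // Scalars and nulls can be handed to json_encode() directly
        echo json_encode($value);
    }
    flush(); // with output buffering enabled you may also need ob_flush()
}

Usage would simply be streamJson($var); after sending a Content-Type: application/json header. Note that, as the comment above warns, if encoding fails halfway through, the client receives a truncated, invalid JSON document.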

You can use a Smarty template to output the JSON by simply reading a file from disk, or you can check this answer: https://stackoverflow.com/a/4820537/3900160
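
If the JSON can be pre-generated (for example by a background job) and written to disk, the request itself only has to stream the file through in chunks. A minimal sketch, assuming a pre-built file at data.json (hypothetical path):

// Stream an already-generated JSON file to the client in small chunks,
// so the script never holds the whole document in memory.
header('Content-Type: application/json');

$handle = fopen(__DIR__ . '/data.json', 'rb'); // hypothetical pre-built file
if ($handle === false) {
    http_response_code(500);
    exit('could not open cached JSON');
}
while (!feof($handle)) {
    echo fread($handle, 8192); // 8 KB at a time
    flush();                   // push the chunk to the browser
}
fclose($handle);

readfile($path) would do much the same in one call; the explicit loop just makes the chunked, low-memory behaviour visible.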
