
Which is faster? Any thoughts/benchmarks?

onassar
  • That's comparing apples and oranges. If anything, the question should be which one is better suited for use case X? And when asking for benchmarks, why not run some of your own? – Gordon Nov 26 '10 at 22:09
  • Agree with **Gordon**; we'd have to know the use case. But without any other info, I vote JSON. ;) – Jason McCreary Nov 26 '10 at 22:12
  • possible duplicate of [When to prefer JSON over XML?](http://stackoverflow.com/questions/325085/when-to-prefer-json-over-xml) – mario Nov 26 '10 at 23:12
  • Wrt "why not write your own" -- agreed, optimal results come from testing one's own use case. But on the other hand, there may be well-written benchmarks by experts who know the current best ways to handle XML and JSON; rolling your own runs the risk of novice mistakes. – StaxMan Nov 27 '10 at 02:43

1 Answer


json_decode() is faster. No discussion. The margin, however, can only be benchmarked against a specific XML document type; XML-RPC marshalling, for example, isn't that far off from JSON. But anyway, you have to decide what kind of data you want to transfer or save:

JSON is suitable for representing scalar data types, arrays, and objects.

XML is foremost a document format family. You can use it to serialize data types from any programming language, but that's not its purpose. Think of XML documents as micro databases.

So really it is an apples to books comparison.
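
To make that concrete, here is a small illustrative sketch (the record and its field names are made up): decoding equivalent data shows that json_decode() hands back native PHP values directly, while SimpleXML returns a document object whose values still need casting.

$json = '{"id": 55, "name": "text goes here", "score": 0.1}';
$xml  = '<record><id>55</id><name>text goes here</name><score>0.1</score></record>';

// JSON maps straight onto PHP types: arrays, strings, ints, floats.
$data = json_decode($json, true);   // true => associative array
var_dump($data['id']);              // int(55) -- already a native integer

// XML comes back as a document object; each value is a SimpleXMLElement
// until it is cast explicitly.
$doc = simplexml_load_string($xml);
var_dump((int) $doc->id);           // int(55) -- only after the cast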


@StaxMan: unscientific proof follows. Note how this example is already skewed in favour of JSON by using a suboptimal pseudo data structure.

$json = <<<END
   [55, "text goes here", 0.1]
END;

$xml = <<<END
<array>
   <int>55</int>
   <string>text goes here</string>
   <float>0.1</float>
</array>
END;

// Decode the JSON payload 100,000 times and print the elapsed seconds.
for ($i=0, $t=t(); $i<100000; $i++) {
   json_decode($json);
}
echo "json ", t(-$t), "\n";

// Same loop for the equivalent XML payload.
for ($i=0, $t=t(); $i<100000; $i++) {
   simplexml_load_string($xml);
}
echo "xml ", t(-$t), "\n";

// Timing helper: returns the current Unix time as a float, offset by $t1.
// Passing a negated start time therefore yields the elapsed seconds.
function t($t1=0) {
   $a = explode(" ", microtime());
   return $a[0] + $a[1] + $t1;
}

Result:

json 1.6152667999268
xml 2.9058270454407

Again, not very meaningful on its own. But it shows a theoretical advantage for JSON.
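
Since the margin depends on the document type, one quick variant (my own sketch, not a rigorous benchmark) is to rerun the XML loop with attribute-style markup, which may parse more cheaply than element-per-value markup since it produces fewer nodes; microtime(true) stands in here for the t() helper above.

// Variant sketch: same iteration count, attribute-style XML instead.
$xml = '<array int="55" string="text goes here" float="0.1"/>';

$t = microtime(true);
for ($i = 0; $i < 100000; $i++) {
   simplexml_load_string($xml);
}
echo "xml (attributes) ", microtime(true) - $t, "\n";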

mario
  • Any links to proof? I'm asking because a few earlier questions pointed to results that suggested a different outcome, not because I dispute the claim itself. – StaxMan Nov 27 '10 at 02:42
  • @StaxMan I take it as a personal offence that you are questioning my authority on making outrageous speed assumptions. But anyway, see above. I guess most test patterns would favour JSON. Even worse if you used proper XMLSchema data types. – mario Nov 27 '10 at 04:03
  • There's no way XML parsing would be faster. It has to traverse the whole DOM looking for CDATA, attributes, and different node types. – Matt Williamson Nov 27 '10 at 04:04
  • Heh, for a while I thought you were seriously hurt... I am sometimes slow to grok subtle, sophisticated humor. :) For what it's worth, JSON decoding being faster makes a lot of sense from a theoretical perspective; I just recall other questions where someone had written a test that did not show JSON being measurably faster (I think it was http://stackoverflow.com/questions/993282/php-is-json-or-xml-parser-faster). Your additions look good and support your statement. – StaxMan Dec 02 '10 at 17:28
  • Matt: all I am saying is that the differences in parser speed within a class (i.e. between the slowest and fastest XML parsers, or the slowest and fastest JSON parsers) are HUGE -- I have written XML and JSON parsers (in Java) and measured: the fastest XML parsers handily beat the slowest JSON parsers, and the same holds in the reverse direction. Best against best, yes, JSON parsers come out slightly ahead, roughly in relation to input size. – StaxMan Dec 02 '10 at 17:30