
I just installed node-msgpack and tested it against native JSON. MessagePack is much slower. Anyone know why?

Using the authors' own benchmark...

node ~/node_modules/msgpack/bench.js 
msgpack pack:   4165 ms
msgpack unpack: 1589 ms
json    pack:   1352 ms
json    unpack: 761 ms
Brian Carlson
    One of the contributors to node-msgpack has addressed this issue here: https://github.com/pgriess/node-msgpack/issues/38#issuecomment-22719635 – Andrew Newdigate Nov 25 '13 at 14:06

2 Answers


I'll assume you're talking about https://github.com/pgriess/node-msgpack.

Just looking at the source, I'm not sure how it could be any faster. For example, in src/msgpack.cc they have the following:

Buffer *bp = Buffer::New(sb._sbuf.size);
memcpy(Buffer::Data(bp), sb._sbuf.data, sb._sbuf.size);

In node terms, they are allocating and filling a new SlowBuffer for every request. You can benchmark the allocation part by doing the following:

var msgpack = require('msgpack');
var SB = require('buffer').SlowBuffer;
var tmpl = {'abcdef' : 1, 'qqq' : 13, '19' : [1, 2, 3, 4]};

console.time('SlowBuffer');
for (var i = 0; i < 1e6; i++)
    // 20 is the resulting size of their "DATA_TEMPLATE"
    new SB(20);
console.timeEnd('SlowBuffer');

console.time('msgpack.pack');
for (var i = 0; i < 1e6; i++)
    msgpack.pack(tmpl);
console.timeEnd('msgpack.pack');

console.time('stringify');
for (var i = 0; i < 1e6; i++)
    JSON.stringify(tmpl);
console.timeEnd('stringify');

// result - SlowBuffer: 915ms
// result - msgpack.pack: 5144ms
// result - stringify: 1524ms

So just allocating the memory for the message already costs about 60% of the JSON.stringify time, and that's only one of the reasons it's so much slower.

Also take into account that JSON.stringify has gotten a lot of love from Google. It's highly optimized and would be difficult to beat.
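
The decode side can be sanity-checked the same way. Here is a minimal sketch along the same lines, reusing the template object above and only the pack/unpack calls node-msgpack already exposes (no timings claimed):

var msgpack = require('msgpack');
var tmpl = {'abcdef' : 1, 'qqq' : 13, '19' : [1, 2, 3, 4]};

// encode once up front so the loops only measure decoding
var packed = msgpack.pack(tmpl);
var str = JSON.stringify(tmpl);

console.time('msgpack.unpack');
for (var i = 0; i < 1e6; i++)
    msgpack.unpack(packed);
console.timeEnd('msgpack.unpack');

console.time('JSON.parse');
for (var i = 0; i < 1e6; i++)
    JSON.parse(str);
console.timeEnd('JSON.parse');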

Trevor Norris
    **"It's highly optimized and would be difficult to beat."** +1 - No one probably wants to be optimizing C++ string marshaling between msgpack zone and V8 when there's super fast JSON already anyway. – chakrit Aug 14 '13 at 08:26
  • I agree that JSON.stringify performs best, but what about native JSON implementations like NSJSONSerialization on iOS? Would MessagePack for Objective-C beat it? – loretoparisi Oct 09 '14 at 21:39
  • @loretoparisi Feel like we might be comparing apples to oranges. Does your program use both Node and ObjectiveC? – Trevor Norris Oct 13 '14 at 21:10
  • Yep! No way to compare apples to oranges! Nope, my considerations were about using a binary serializer instead of a JSON serializer on mobile clients (like iOS). See here: https://gist.github.com/frsyuki/2908191 – loretoparisi Oct 13 '14 at 21:55

I decided to benchmark all the popular Node.js MessagePack (binary encoding) modules, along with the PSON (protocol JSON) encoding library, against JSON. The results are as follows:

  • JSON - fastest for encoding, unless the payload includes a binary array
  • msgpack - second fastest normally, and fastest when a binary array is included
  • msgpack-js - consistently second to msgpack
  • pson - consistently slower than msgpack-js
  • msgpack5 - always dog slow

I have published the benchmarking repository and detailed results at https://github.com/mattheworiordan/nodejs-encoding-benchmarks
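
For reference, the core of such a benchmark is just a generic encode/decode loop. The sketch below is illustrative rather than the exact harness from that repository; it only wires up JSON and node-msgpack (whose pack/unpack API is shown above), and the other modules would plug in the same way under their own encode/decode APIs.

// Minimal benchmark harness sketch; only JSON and node-msgpack are wired up here.
var msgpack = require('msgpack');

var codecs = {
  'json': {
    encode: function (obj) { return JSON.stringify(obj); },
    decode: function (str) { return JSON.parse(str); }
  },
  'msgpack': {
    encode: function (obj) { return msgpack.pack(obj); },
    decode: function (buf) { return msgpack.unpack(buf); }
  }
};

// Illustrative payload; swap in whatever object shapes you care about.
var payload = {'abcdef': 1, 'qqq': 13, '19': [1, 2, 3, 4]};
var iterations = 1e5;

Object.keys(codecs).forEach(function (name) {
  var codec = codecs[name];
  var encoded = codec.encode(payload);

  console.time(name + ' encode');
  for (var i = 0; i < iterations; i++)
    codec.encode(payload);
  console.timeEnd(name + ' encode');

  console.time(name + ' decode');
  for (var i = 0; i < iterations; i++)
    codec.decode(encoded);
  console.timeEnd(name + ' decode');
});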

Matthew O'Riordan
  • But were those modules installed as native addons? It wouldn't be a fair benchmark otherwise, would it (since JSON is native)? Obviously it will perform faster. – NiCk Newman Apr 22 '16 at 06:27