
I have a problem with ServiceStack.Text. The serialization time of a complex object increases exponentially with the complexity of the object. In particular, the object contains a list of simple objects, and as the number of items in that list grows, the serialization time increases dramatically. How do I make it faster?

This is my only configuration:

JsConfig.IncludeTypeInfo = true;
JsConfig.IncludePublicFields = true;
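
A minimal sketch of the kind of measurement described above (the DTO names here are hypothetical, for illustration only): serialize the same parent object with a growing list of child items and time each run.

    using System;
    using System.Collections.Generic;
    using System.Diagnostics;
    using System.Linq;
    using ServiceStack.Text;

    // Hypothetical DTOs standing in for the real ones
    public class ChildDto
    {
        public int Id { get; set; }
        public string Name { get; set; }
    }

    public class ParentDto
    {
        public string Title { get; set; }
        public List<ChildDto> Items { get; set; }
    }

    class SerializationBenchmark
    {
        static void Main()
        {
            JsConfig.IncludeTypeInfo = true;
            JsConfig.IncludePublicFields = true;

            foreach (var count in new[] { 1000, 10000, 100000 })
            {
                var dto = new ParentDto
                {
                    Title = "test",
                    Items = Enumerable.Range(0, count)
                        .Select(i => new ChildDto { Id = i, Name = "item " + i })
                        .ToList()
                };

                var sw = Stopwatch.StartNew();
                var json = JsonSerializer.SerializeToString(dto);
                sw.Stop();

                Console.WriteLine("{0} items: {1} ms", count, sw.ElapsedMilliseconds);
            }
        }
    }

If the elapsed time grows much faster than the item count, that matches the behaviour described in the question.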
  • Impossible to help without code or an example of the data, but more data *does* need more time to process. Have you tried other deserializers like Json.NET? If they exhibit similar performance, or if eg ServiceStack slows exponentially where the others slow linearly, then the question would be worth investigating. Otherwise, just benchmark different parsers and pick the fastest. – Panagiotis Kanavos Nov 02 '15 at 13:59
  • I tried Newtonsoft.Json and performance was much better; the slowdown was minimal and linear – Giuseppe Giuffre Nov 02 '15 at 14:02
  • Then there may be a bug in ServiceStack.Text indeed. What version did you use? The last freely available in NuGet is *very* old. Newer ones have trial limitations. Did you clone and try the latest from source? ServiceStack isn't much faster than Json.NET (if at all) anyway. That was true three years ago perhaps, but nowadays you'd have to run your own benchmarks (hint) – Panagiotis Kanavos Nov 02 '15 at 14:05
  • I have the latest version with license, no problem about it. – Giuseppe Giuffre Nov 02 '15 at 14:10
  • Then .... . For what it's worth, when I used the same client to repeatedly call similar Web API 2 and ServiceStack toy services, Web API 2 was ~5% faster - which may be due to test error. The reason SS benchmarks showed a huge advantage was because they used *different* clients for SS and Web API *that deserialized* the test result, essentially measuring the speed of the test harness and deserializer, not the service itself. Using the same deserializer in both cases (Json.NET or SS.Text) eliminated the difference – Panagiotis Kanavos Nov 02 '15 at 14:17
  • +1 because I suspected I'd find differences when using meaningfully complex benchmarks but haven't gotten around to it. I expected to find them in memory usage over time or scalability though. – Panagiotis Kanavos Nov 02 '15 at 14:20
  • Unfortunately this isn't anywhere near enough info to identify the issue. Can you provide a full example of a Type that has an issue? `JsConfig.IncludeTypeInfo = true` is a bad default which forces unnecessary type info for each type; you should also avoid unknown properties like `object` or interface properties. Otherwise the only place I know where SS.Text is slow is with Dates, as it auto-converts them to UTC on the wire and back into local time, which is really slow in .NET and can heavily skew the benchmarks vs Serializers that don't convert Dates. – mythz Nov 02 '15 at 16:25

1 Answer


I'd highly recommend against using:

 JsConfig.IncludeTypeInfo = true;

It forces unnecessary type information to be included, which bloats the payload. Ideally your DTOs should be well-defined and not contain unknown object or interface properties, which increase serializer-specific coupling and will fail to serialize in many standards-based serializers.
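
As a minimal sketch of what this looks like in practice (the DTO names here are hypothetical): concrete, well-defined DTOs with typed properties only, serialized with ServiceStack.Text while leaving JsConfig.IncludeTypeInfo at its default so no extra type metadata is emitted.

    using System.Collections.Generic;
    using ServiceStack.Text;

    // Hypothetical DTOs: concrete, well-defined properties only,
    // no object or interface members that would require type info.
    public class OrderItem
    {
        public int Id { get; set; }
        public string Sku { get; set; }
        public decimal Price { get; set; }
    }

    public class Order
    {
        public int OrderId { get; set; }
        public List<OrderItem> Items { get; set; }   // typed list, not List<object>
    }

    class Example
    {
        static void Main()
        {
            // Leave JsConfig.IncludeTypeInfo at its default (false);
            // with concrete DTOs the extra type metadata isn't needed.
            var order = new Order
            {
                OrderId = 1,
                Items = new List<OrderItem>
                {
                    new OrderItem { Id = 1, Sku = "ABC", Price = 9.99m }
                }
            };

            var json = JsonSerializer.SerializeToString(order);
            var copy = JsonSerializer.DeserializeFromString<Order>(json);
        }
    }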

mythz