
What I'm trying to do has already been implemented in Json.NET by setting TypeNameHandling to TypeNameHandling.Objects. That way the object's type is serialized as well, and the deserialized object will have exactly the original type.
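
That is, something like this (myObject is just a placeholder):

    var settings = new JsonSerializerSettings { TypeNameHandling = TypeNameHandling.Objects };
    var json = JsonConvert.SerializeObject(myObject, settings);       // type info is written into the JSON
    var back = JsonConvert.DeserializeObject<object>(json, settings); // comes back as the original type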

However, using TypeNameHandling exposes some security issues and requires a custom SerializationBinder to limit which types are supported, to avoid possible code injection. That's not the main reason I'm looking for another solution, though. What I've found is that with TypeNameHandling.Objects, an object is serialized to a complex JSON structure that includes not just the object data and the object type but also some other properties which look redundant to me.

I think we need just one more property containing info about the object type (such as its assembly-qualified name), so I would like to create a custom JsonConverter which serializes any object to JSON like this:

{
   "Item" : "normal JSON string of object",
   "ItemType" : "assembly qualified name of object type"
}
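
Just to illustrate the round trip I have in mind (this is not the converter itself; the property names match the structure above):

    // using System; using Newtonsoft.Json; using Newtonsoft.Json.Linq;
    var wrapper = JObject.Parse(json);
    var type = Type.GetType((string)wrapper["ItemType"], throwOnError: true);
    var item = JsonConvert.DeserializeObject((string)wrapper["Item"], type);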

Isn't that enough? As I said before, besides two properties similar to those (with different names), Json.NET includes some other properties (Signature...) which really look redundant to me.

I'm not asking how to implement the custom JsonConverter mentioned above. I just wonder whether that converter (with its simplified JSON structure) is fine, or whether I should use the standard solution provided by Json.NET with TypeNameHandling (which involves a more complex JSON structure). My main concern is possible performance issues with TypeNameHandling set to Objects, due to more data to convert/serialize/transfer.

One more concern with the standard solution is performance: I actually need to apply the custom converting logic only to objects declared with the exact type object, not to all other strongly typed objects (to which TypeNameHandling may still be applied unnecessarily?).

  • This seems like an opinion-based question, but one note... – dbc Feb 12 '19 at 07:00
  • The same security risks that can arise with `TypeNameHandling` can also arise with your custom `JsonConverter`. The risk here is that an attacker tricks some polymorphic deserialization code to instantiate an **attack gadget**. See [TypeNameHandling caution in Newtonsoft Json](https://stackoverflow.com/q/39565954/3744182) and [External json vulnerable because of Json.Net TypeNameHandling auto?](https://stackoverflow.com/q/49038055/3744182). – dbc Feb 12 '19 at 07:00
  • If an attacker crafts JSON with an attack gadget type specified in the `"ItemType"` property supported by your converter, then your converter may end up instantiating the attack gadget and effecting the attack. So you're still going to need something like a custom `SerializationBinder` to filter out naughty types. – dbc Feb 12 '19 at 07:02
  • You can reduce your attack surface by only enabling support for polymorphic deserialization for known polymorphic properties or arrays, by applying [`[JsonProperty(TypeNameHandling = TypeNameHandling.All)]`](https://www.newtonsoft.com/json/help/html/P_Newtonsoft_Json_JsonPropertyAttribute_TypeNameHandling.htm) or your custom `JsonConverter` only to the required properties -- but if the polymorphic base type of those properties just happens to be compatible with an attack gadget, you're going to be vulnerable to attack. – dbc Feb 12 '19 at 07:05
  • @dbc thank you; actually I understand the security risk, but it's not really a priority at the moment. As mentioned in my question, `Json.net` somehow serializes objects into a more complex JSON structure, whereas my custom converter would use a simplified one, and I wonder if that's fine. Maybe they use a complex JSON structure for some special case I don't know of? (As far as I know, to deserialize an object of some type, we just need the JSON string containing the data and the desired type, which can always be obtained from the assembly-qualified name.) – Hopeless Feb 12 '19 at 07:25
  • @dbc in other words, I'm more concerned about possible performance issues with `TypeNameHandling` set to `Objects` due to more data to convert/serialize/transfer. – Hopeless Feb 12 '19 at 07:30
  • Json.NET supports `TypeNameHandling` for arrays as well as objects, which your code would not seem to do. And I'm not sure I see why your JSON is more compact, since Json.NET is just adding a property `"$type"` while you are adding `"ItemType"`. So I don't see why one solution would be a priori more performant than the other. Why not just measure it? See https://ericlippert.com/2012/12/17/performance-rant/ – dbc Feb 12 '19 at 07:34
  • @dbc I'm not intending to keep the exact collection type, so my converter will just need to handle objects. However, you're right that `$type` is the only additional property; I'm not sure why I saw at least 2 more properties starting with `Signature...` while debugging (to solve another problem involving serializing `PropertyInfo`). Thank you for your comments; maybe I'll go for the standard solution with `TypeNameHandling`. If you can put all your comments in an answer, I will be happy to accept it. – Hopeless Feb 12 '19 at 08:15
  • I have a database with objects stored in it. I have the `Type.FullName` and the object serialized as Json. It's quite simple to deserialize the Json as the given object type. Is that what you're looking for? – Reinstate Monica Cellio Feb 14 '19 at 08:48
  • @Archer actually my original concern arose from a misunderstanding of the JSON structure `json.net` uses for saving type info. I thought it included some other properties (besides `$type`), so that was one good reason for me to try my own converter instead (to avoid unnecessary text being transferred over the network). I've now chosen `json.net` with `TypeNameHandling`. – Hopeless Feb 15 '19 at 02:10

1 Answer


I have a few reactions to your proposed design for a polymorphic custom JsonConverter (let's call it PolymorphicConverter<T> where T is the base type):

  1. Regarding security, you wrote,

    ... using TypeNameHandling exposes some security issues and requires a custom SerializationBinder to limit which types are supported, to avoid possible code injection.

    The same security risks that can arise with TypeNameHandling will also arise with PolymorphicConverter<T>.

    The risk here is that an attacker tricks some polymorphic deserialization code into instantiating an attack gadget. See TypeNameHandling caution in Newtonsoft Json and External json vulnerable because of Json.Net TypeNameHandling auto? for examples and discussion. If an attacker crafts JSON with an attack gadget type specified in the "ItemType" property supported by your converter, then it may end up instantiating the attack gadget and effecting the attack.

    You can reduce your attack surface by enabling polymorphic deserialization only for known polymorphic properties or array items, by applying PolymorphicConverter<T> (or [JsonProperty(TypeNameHandling = TypeNameHandling.All)], for that matter) just to those properties that are actually polymorphic in practice -- but if the polymorphic base type of those properties just happens to be compatible with an attack gadget, you're going to be vulnerable to attack.

    Thus, no matter how you encode type information in your JSON, you're still going to need something like a custom SerializationBinder to filter out naughty types. (A minimal sketch of such an allow-list binder appears after this list.)

  2. JSON file size. Json.NET encodes type information by adding a single property to the beginning of objects:

    "$type" : "assembly qualified name of object type"
    

    Your plan is to instead add:

    "ItemType" : "assembly qualified name of object type"
    

    It is unclear why there would be an advantage, unless your type names are somehow more compact.

  3. Performance. You wrote,

    My main concern is possible performance issues with TypeNameHandling set to Objects, due to more data to convert/serialize/transfer.

    Firstly, why not just measure and find out? See https://ericlippert.com/2012/12/17/performance-rant/

    Secondly, Newtonsoft has a setting, MetadataPropertyHandling, that, when set to Default, assumes the metadata property "$type" comes first in each object and is thus able to stream objects in without preloading the entire JSON into a JToken hierarchy.

    If your converter unconditionally preloads the JSON into a JToken hierarchy to fetch the value of the "ItemType" property, it may have worse performance. (A sketch of the relevant setting appears after this list.)

  4. Regarding restricting polymorphic deserialization to only required properties, you wrote:

    One more concern with the standard solution is performance: I actually need to apply the custom converting logic only to objects declared with the exact type object, not to all other strongly typed objects

    Either way, this is possible with a custom ContractResolver: override DefaultContractResolver.CreateProperty and, when JsonProperty.PropertyType == typeof(object), set TypeNameHandling or Converter as required, depending on your chosen solution. (A sketch appears after this list.)
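
Regarding point 1, here is a minimal sketch of an allow-list binder; the Person type and the lookup key are placeholders for whatever trusted types and naming scheme you actually use:

    using System;
    using System.Collections.Generic;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Serialization;

    public class Person { public string Name { get; set; } } // placeholder trusted type

    public sealed class AllowListSerializationBinder : ISerializationBinder
    {
        // Only types registered here can ever be instantiated during deserialization.
        static readonly Dictionary<string, Type> Allowed = new Dictionary<string, Type>
        {
            [typeof(Person).FullName] = typeof(Person),
        };

        public Type BindToType(string assemblyName, string typeName)
        {
            if (Allowed.TryGetValue(typeName, out var type))
                return type;
            throw new JsonSerializationException("Type is not on the allow-list: " + typeName);
        }

        public void BindToName(Type serializedType, out string assemblyName, out string typeName)
        {
            assemblyName = serializedType.Assembly.GetName().Name;
            typeName = serializedType.FullName;
        }
    }

Attach it via JsonSerializerSettings { TypeNameHandling = TypeNameHandling.Objects, SerializationBinder = new AllowListSerializationBinder() }; the same filtering logic can be reused inside a custom converter before calling Type.GetType().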
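
Regarding point 3, MetadataPropertyHandling is just another serializer setting, e.g.:

    var settings = new JsonSerializerSettings
    {
        TypeNameHandling = TypeNameHandling.Objects,
        // Default assumes "$type" comes first in each object, allowing streaming;
        // ReadAhead tolerates "$type" anywhere, at the cost of buffering each object.
        MetadataPropertyHandling = MetadataPropertyHandling.Default,
    };
    var item = JsonConvert.DeserializeObject<object>(json, settings);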
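
And regarding point 4, a minimal sketch of such a resolver, here taking the TypeNameHandling route (assigning your converter to property.Converter instead would work the same way):

    using System.Reflection;
    using Newtonsoft.Json;
    using Newtonsoft.Json.Serialization;

    public sealed class ObjectTypeNameContractResolver : DefaultContractResolver
    {
        protected override JsonProperty CreateProperty(MemberInfo member, MemberSerialization memberSerialization)
        {
            var property = base.CreateProperty(member, memberSerialization);
            // Emit and honor type information only for properties declared as plain object.
            if (property.PropertyType == typeof(object))
                property.TypeNameHandling = TypeNameHandling.Objects;
            return property;
        }
    }

Since contracts are cached per resolver instance, construct the resolver once and reuse it in your JsonSerializerSettings.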

  • thank you for the detailed answer. I would like to add another option (which may not be as powerful/customizable as yours) for issue #4. We can use `TypeNameHandling.Auto` (I always used `TypeNameHandling.Objects` before). This `Auto` behavior adds type info only for property values whose actual `Type` does not exactly match the declared `Type`. This suits my use case: all properties with an explicit declared `Type` (not `object`) are deserialized back OK without redundant type info, whereas all properties of `object` are serialized with type info. – Hopeless Feb 15 '19 at 02:16