
I'm using Dapper to read data from SQL Server. I have a SQL statement that returns a long JSON result, but that result is split across 3 rows of at most 2033 characters each, so Dapper can't parse it because each individual row is not valid JSON on its own.

How can I prevent this splitting, or make Dapper deal with it?

This is my code:

SqlMapper.ResetTypeHandlers();
SqlMapper.AddTypeHandler(new JsonTypeHandler<List<Product>>());

const string sql = @"SELECT 
                         *,
                         (SELECT * FROM Balance b
                          WHERE p.SKU = b.SKU 
                          FOR JSON PATH) AS [Balances]
                     FROM Product p
                     WHERE SKU IN @SKUs
                     FOR JSON PATH";
var connection = new SqlConnection("myconnection");
return connection.QuerySingleAsync<List<Product>>(sql, new { SKUs = new[] { "foo", "bar" } });

And the code of TypeHandler:

public class JsonTypeHandler<T> : SqlMapper.TypeHandler<T>
{
    public override T Parse(object value)
    {
        return JsonConvert.DeserializeObject<T>(value.ToString());
    }

    public override void SetValue(IDbDataParameter parameter, T value)
    {
        parameter.Value = JsonConvert.SerializeObject(value);
    }
}

And here is how I run this SQL in DataGrip (screenshot of the split result omitted).

Edit: Here is the error message:

Newtonsoft.Json.JsonSerializationException : Unexpected end when deserializing object. Path '[0].Balances[4].WarehouseId', line 1, position 2033.

  • please provide a [mcve] – Daniel A. White Mar 23 '19 at 17:52
  • Thanks @DanielA.White, I've updated my question. – kvuong Mar 23 '19 at 18:06
  • Are you sure the one-row data is splitting into 3 rows or there are three rows for three different records on the table? – Adnan Ahmed Ansari Mar 23 '19 at 18:12
  • @AdnanAhmedAnsari When I copy the value of each row and try to parse it with https://jsoneditoronline.org/ I get an invalid JSON message, but when I combine the values from all 3 rows, I get the result I need. Like I said above, rows 1 and 2 have 2033 characters each. – kvuong Mar 23 '19 at 18:18
  • The behavior is documented [here](https://learn.microsoft.com/en-us/sql/relational-databases/json/format-query-results-as-json-with-for-json-sql-server?view=sql-server-2017#output-of-the-for-json-clause). I would not know how to fix it with Dapper. Perhaps you should concat the rows yourself before returning. – Jesse de Wit Mar 23 '19 at 18:23
  • @AdnanAhmedAnsari I've included exception message in my question. – kvuong Mar 23 '19 at 18:27
  • @JessedeWit thank you, that's exactly what I'm getting now. Concatenating the rows ourselves doesn't seem like the right solution, because I don't know when the result becomes "long". – kvuong Mar 23 '19 at 18:37
  • When reading data into classes using dapper, JSON is the wrong tool to use. You're going from structured data in the db, to an unstructured format (json) and back to structured classes again. Instead of `FOR JSON PATH`, I would use multi-mapping to get the data into multiple objects. See https://stackoverflow.com/a/7478958/192221 for an example. – kristianp Mar 25 '19 at 05:36
  • Thanks @kristianp, I've converted the code to use multi-mapping and multiple result sets. I used the JSON approach to simplify the mapping part, but it looks like it's not as simple as I thought. – kvuong Mar 25 '19 at 05:56
  • @kvuong you don't have to know when the result becomes "long"; just concat the rows each time, as explained in the MSDN article https://learn.microsoft.com/en-us/sql/relational-databases/json/use-for-json-output-in-sql-server-and-in-client-apps-sql-server?view=sql-server-2017#use-for-json-output-in-a-c-client-app It will work whether you have one row or multiple. – GuidEmpty Apr 17 '19 at 15:40
  • @GuidEmpty thanks for your comment, it reminds me about my solution, I just posted it. – kvuong Apr 17 '19 at 16:00

1 Answer


My solution is to write another extension method that wraps Dapper's Query<string> method, as below:

public static T QueryJson<T>(this IDbConnection cnn, string sql, object param = null,
    IDbTransaction transaction = null, bool buffered = true, int? commandTimeout = null,
    CommandType? commandType = null) where T : class
{
    var result = cnn.Query<string>(sql, param, transaction, buffered, commandTimeout, commandType).ToList();
    if (!result.Any())
        return default(T);

    // Concatenate the JSON fragments that SQL Server returns as separate rows
    var sb = new StringBuilder();
    foreach (var jsonPart in result)
        sb.Append(jsonPart);

    var settings = new JsonSerializerSettings
    {
        // https://github.com/danielwertheim/jsonnet-contractresolvers
        // I use this contract resolver to set data on private-setter properties
        ContractResolver = new PrivateSetterContractResolver()
    };

    // Use Json.NET to deserialize the combined string
    return JsonConvert.DeserializeObject<T>(sb.ToString(), settings);
}
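
With that in place, the query from the question can be run through the new method. A usage sketch, reusing the SQL and connection string from the question:

var connection = new SqlConnection("myconnection");
// QueryJson concatenates the split FOR JSON rows before deserializing
var products = connection.QueryJson<List<Product>>(sql, new { SKUs = new[] { "foo", "bar" } });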

This solution works quite well, but it is slower than the multi-mapping method (sketched below) when querying large data sets (1,000 objects took 2.7 seconds compared to 1.3 seconds).
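
For comparison, here is a minimal sketch of that multi-mapping approach, following the pattern from the answer linked in the comments. It assumes Product exposes SKU and a writable Balances list, and that Balance has SKU and WarehouseId columns (taken from the question and the error message); Quantity is a hypothetical extra column.

const string sql = @"SELECT p.*,
                            b.SKU, b.WarehouseId, b.Quantity
                     FROM Product p
                     LEFT JOIN Balance b ON b.SKU = p.SKU
                     WHERE p.SKU IN @SKUs";

var connection = new SqlConnection("myconnection");
var bySku = new Dictionary<string, Product>();
var products = connection.Query<Product, Balance, Product>(
        sql,
        (product, balance) =>
        {
            // Fold the joined rows back into one Product per SKU
            if (!bySku.TryGetValue(product.SKU, out var existing))
            {
                existing = product;
                existing.Balances = new List<Balance>();
                bySku.Add(existing.SKU, existing);
            }
            // balance is null when the LEFT JOIN finds no matching row
            if (balance != null)
                existing.Balances.Add(balance);
            return existing;
        },
        new { SKUs = new[] { "foo", "bar" } },
        splitOn: "SKU")
    .Distinct()
    .ToList();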
