
I'm working on a project in which I need to receive JSON documents and consume them (i.e. extract information from them). The version of SQL Server I am using (and will not be changing for the next couple of years) is 2012, which, unlike 2016, does not include native JSON support.

I recall seeing a post in which someone kindly offered source code for this, but unfortunately I can't find it again.

The idea is to have something like this:

Given the following JSON:

{
    "Obj1": {
        "Obj1_S_1": [{
            "Obj1_S_1_S_1": "Blabla_1"
        }, {
            "Obj1_S_1_S_1": "Blabla_2"
        }, {
            "Obj1_S_1_S_1": "Blabla_3"
        }, {
            "Obj1_S_1_S_1": "Blabla_4"
        }, {
            "Obj1_S_1_S_1": "Blabla_5"
        }, {
            "Obj1_S_1_S_1": "Blabla_6"
        }]
    },
    "Obj2": "This is a simple string",
    "Obj3": "456.33"
}

I could use the following invocation:

SET @My_Param = GET_JSON(@Source_JSON, '*.Obj1.Obj1_S_1[3].Obj1_S_1_S_1') ;

and I would get the value 'Blabla_4' into the variable @My_Param.

This is, by the way, essentially the same path syntax used in Oracle and MySQL.
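(For reference only, since this question targets 2012: SQL Server 2016 and later ship a built-in JSON_VALUE with essentially this path syntax. A minimal sketch of the equivalent call on 2016+, using the sample JSON above:)

DECLARE @Source_JSON NVARCHAR(MAX) = N'{"Obj1":{"Obj1_S_1":[{"Obj1_S_1_S_1":"Blabla_1"},{"Obj1_S_1_S_1":"Blabla_2"},{"Obj1_S_1_S_1":"Blabla_3"},{"Obj1_S_1_S_1":"Blabla_4"},{"Obj1_S_1_S_1":"Blabla_5"},{"Obj1_S_1_S_1":"Blabla_6"}]},"Obj2":"This is a simple string","Obj3":"456.33"}';
DECLARE @My_Param NVARCHAR(100);

-- Built-in on SQL Server 2016+; JSON array indexes are zero-based, so [3] is the fourth element
SET @My_Param = JSON_VALUE(@Source_JSON, '$.Obj1.Obj1_S_1[3].Obj1_S_1_S_1');

SELECT @My_Param;  -- Blabla_4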

FDavidov
    This is the best resource I know of to implement consuming JSON pre-2016: [Consuming JSON Strings in SQL Server - Phil Factor](https://www.simple-talk.com/sql/t-sql-programming/consuming-json-strings-in-sql-server/), [Producing JSON Documents from SQL Server queries via TSQL - Phil Factor](https://www.simple-talk.com/sql/t-sql-programming/producing-json-documents-from-sql-server-queries-via-tsql/) , [SQL Server JSON to Table and Table to JSON - Phil Factor](https://www.simple-talk.com/blogs/sql-server-json-to-table-and-table-to-json/) – SqlZim Feb 07 '17 at 13:03
  • Thank you @SqlZim for your suggestion. I've seen that proposal but it is not what I'm looking for (perhaps I'm not fully understanding it). Two main reasons: (1) it walks through the whole JSON regardless of what you actually need (perhaps just the first element), and (2) it converts the JSON into a sort of TABLE, and my question is what the result would look like when the JSON has, say, 6 different levels of hierarchy (e.g. arrays within arrays within arrays....). – FDavidov Feb 07 '17 at 17:25

2 Answers


This can be done with some strategic parse/split manipulation.

Sample Data

Declare @S varchar(max) ='
{
    "Obj1": {
        "Obj1_S_1": [{
            "Obj1_S_1_S_1": "Blabla_1"
        }, {
            "Obj1_S_1_S_1": "Blabla_2"
        }, {
            "Obj1_S_1_S_1": "Blabla_3"
        }, {
            "Obj1_S_1_S_1": "Blabla_4"
        }, {
            "Obj1_S_1_S_1": "Blabla_5"
        }, {
            "Obj1_S_1_S_1": "Blabla_6"
        }]
    },
    "Obj2": "This is a simple string",
    "Obj3": "456.33"
}
'

Example

--Clean up the JSON string: strip quotes and turn line breaks, braces and brackets into a standard '|||' delimiter
--(assigning @S in a SELECT over the VALUES list applies each replacement in turn)
Select @S = Replace(@S,MapFrm,MapTo)
 From (values ('"'     ,'')
             ,(char(13),'|||')
             ,(char(10),'|||')
             ,('}'     ,'|||')
             ,('{'     ,'|||')
             ,('['     ,'|||')
             ,(']'     ,'|||')
       ) b (MapFrm,MapTo)

Option with a Parse/Split UDF

Select Item  = left(RetVal,charindex(':',RetVal+':')-1)
      ,Value = ltrim(right(RetVal,len(RetVal)-charindex(':',RetVal+':')))
 From  [dbo].[udf-Str-Parse](@S,'|||')
 Where Len(IsNull(RetVal,' '))>1
 Order By RetSeq

Option without a Parse/Split UDF

Select Item  = left(RetVal,charindex(':',RetVal+':')-1)
      ,Value = ltrim(right(RetVal,len(RetVal)-charindex(':',RetVal+':')))
 From  (
        Select RetSeq = Row_Number() over (Order By (Select null))
              ,RetVal = LTrim(RTrim(B.i.value('(./text())[1]', 'varchar(max)')))
        From  (Select x = Cast('<x>' + replace((Select replace(@S,'|||','§§Split§§') as [*] For XML Path('')),'§§Split§§','</x><x>')+'</x>' as xml).query('.')) as A 
        Cross Apply x.nodes('x') AS B(i)
       ) A
 Where Len(IsNull(RetVal,' '))>1
 Order By RetSeq

Both Return

Item            Value
Obj1    
Obj1_S_1    
Obj1_S_1_S_1    Blabla_1
Obj1_S_1_S_1    Blabla_2
Obj1_S_1_S_1    Blabla_3
Obj1_S_1_S_1    Blabla_4
Obj1_S_1_S_1    Blabla_5
Obj1_S_1_S_1    Blabla_6
Obj2            This is a simple string,
Obj3            456.33
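
From this flattened result, one way to then pull a single value (here the fourth `Obj1_S_1_S_1`, i.e. 'Blabla_4' from the question) is to number the matching rows in parse order. A minimal sketch using the same UDF; note that it relies on token order rather than on the JSON hierarchy:

;With Parsed As (
        Select RetSeq
              ,Item  = left(RetVal,charindex(':',RetVal+':')-1)
              ,Value = ltrim(right(RetVal,len(RetVal)-charindex(':',RetVal+':')))
         From  [dbo].[udf-Str-Parse](@S,'|||')
         Where Len(IsNull(RetVal,' '))>1
       )
Select Value
 From  (Select Value
              ,ItemSeq = Row_Number() over (Order By RetSeq)  --position among the matching items
         From  Parsed
         Where Item = 'Obj1_S_1_S_1'
       ) A
 Where ItemSeq = 4   --fourth occurrence => 'Blabla_4'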

The UDF if needed

CREATE FUNCTION [dbo].[udf-Str-Parse] (@String varchar(max),@Delimiter varchar(10))
Returns Table 
As
Return (  
    Select RetSeq = Row_Number() over (Order By (Select null))
          ,RetVal = LTrim(RTrim(B.i.value('(./text())[1]', 'varchar(max)')))
    From  (Select x = Cast('<x>' + replace((Select replace(@String,@Delimiter,'§§Split§§') as [*] For XML Path('')),'§§Split§§','</x><x>')+'</x>' as xml).query('.')) as A 
    Cross Apply x.nodes('x') AS B(i)
);
--Thanks Shnugo for making this XML safe
--Select * from [dbo].[udf-Str-Parse]('Dog,Cat,House,Car',',')
--Select * from [dbo].[udf-Str-Parse]('John Cappelletti was here',' ')
--Select * from [dbo].[udf-Str-Parse]('this,is,<test>,for,< & >',',')
John Cappelletti
  • Thank you John for your post, but I see a few issues with it: (1) you are proposing to change both `{` / `}` and `[` / `]` to `|||`, thereby losing the distinction between a **structure** and an **array** (potentially of structures); (2) imagine a hierarchy in which (for instance) `Obj2` is also an array much like `Obj1`, and you need to fish specific elements out of each of the two arrays. Any solution **must** honor the original structure of the JSON, and I don't see how your proposal does that. Perhaps I'm missing something... am I? – FDavidov Feb 08 '17 at 05:55

See my response here where I created a function compatible with SQL 2012 that extracts values given the JSON and the column name.

SET ANSI_NULLS ON
GO
SET QUOTED_IDENTIFIER ON
GO
-- =============================================
-- Author:      Isaac Adams
-- Create date: 7/12/2018
-- Description: Give the JSON string and the name of the column from which you want the value
-- =============================================
CREATE FUNCTION JSON_VALUE
(
    @JSON NVARCHAR(3000),
    @column NVARCHAR(3000)
)
RETURNS NVARCHAR(3000)
AS
BEGIN

DECLARE @value NVARCHAR(3000);
DECLARE @trimmedJSON NVARCHAR(3000);

DECLARE @start INT;
DECLARE @length INT;

-- Find the first character of the value (assumes compact "name":"value" formatting with no space after the colon)
SET @start = PATINDEX('%' + @column + '":"%',@JSON) + LEN(@column) + 3;
SET @trimmedJSON = SUBSTRING(@JSON, @start, LEN(@JSON));
-- The value ends just before the next '", "' (i.e. the start of the following name/value pair)
SET @length = PATINDEX('%", "%', @trimmedJSON);
SET @value = SUBSTRING(@trimmedJSON, 0, @length);

RETURN @value
END
GO
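
A quick usage sketch (the sample string below is hypothetical, not from the original post). Note the function assumes compact `"name":"value"` pairs separated by `", "`, so the last pair in a document would not be found:

DECLARE @J NVARCHAR(3000) = N'{"Obj2":"This is a simple string", "Obj3":"456.33"}';

SELECT dbo.JSON_VALUE(@J, 'Obj2');  -- This is a simple string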
Isaac Adams
  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. – Morse Jul 13 '18 at 01:38
  • @Prateek very good point. I edited the response to include the important aspect. I will keep that in mind for future responses. – Isaac Adams Jul 13 '18 at 16:51
  • Thank you @IsaacAdams for your post. Fortunately, since my original post the company decided to switch to SQL Server 2016, which already has some very powerful native support for JSON consumption. I strongly suggest you take a look at it (if relevant for you) because it performs over 200 times faster than any code I wrote. Just to give you some orders of magnitude: when fed an 80KB JSON that includes an array, it converts the whole array into a table in just 17ms. I don't think this can be achieved with hand-written T-SQL. Cheers!! – FDavidov Jul 14 '18 at 13:55
  • @FDavidov that is actually incredible! Thanks for letting me know -- this is just another good reason to get my company to upgrade. – Isaac Adams Jul 16 '18 at 14:11