
I am trying to insert a huge JSON payload into a table column. It throws no error, but when I display the data in the column, only half of the entered content is printed. How can I increase the length from nvarchar(max) to nvarchar(300,000)?

At the moment I'm using SQL Server version 15.0, so I cannot use a native JSON data type.

I would appreciate any help on this.
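Before resizing anything, it is worth confirming whether the data is actually truncated in storage or only in the display. A minimal sketch, assuming hypothetical table/column names (`dbo.YourTable` / `JsonCol`) that you would replace with your own:

```sql
-- nvarchar(max) can hold up to 2 GB (roughly 1 billion characters),
-- so 300,000 characters fit easily.
select
    len(concat(JsonCol, '.')) - 1 as chars_stored,  -- concat trick avoids LEN trimming trailing spaces
    datalength(JsonCol)           as bytes_stored   -- nvarchar stores 2 bytes per character
from dbo.YourTable;
```

If `chars_stored` matches the length of what you inserted, the column holds everything and only the client tool is truncating the output.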

Dale K
  • This isn't an issue with the size of the datatype but how SQL Server displays results https://stackoverflow.com/questions/11897950/how-do-you-view-all-text-from-an-ntext-or-nvarcharmax-in-ssms – Matt Evans Dec 07 '20 at 08:43
  • `nvarchar(max)`, as the name suggests, is the largest `nvarchar` you can have. There's no bigger type, and as Matt says, all you're experiencing is a *display* issue, not a data storage one. – Damien_The_Unbeliever Dec 07 '20 at 08:46
  • `nvarchar(max)` should be 2 gigabytes. If you need more space than that, consider breaking up that JSON. You should probably break it up anyway. – Schwern Dec 07 '20 at 08:50
  • "* it only prints half of the entered details*" - SQL Server doesn't print anything. It's the SQL client tool you use to run your SQL queries that displays this. So the question is about the tool you use, not about the data type. Which tool **do** you use to run your queries? –  Dec 07 '20 at 10:12

1 Answer


For the display issue, there used to be a handy "bug" in the SSMS results grid: swap the position of an XML column and click on it, and it will display the contents of the column to its left or right (which side depends on the initial position of the XML column in the result set).

declare @json nvarchar(max) = concat('["a"', replicate(cast(concat(',"', newid(), '"') as nvarchar(max)), 10000), ']');
select isjson(@json) as _isjson, len(@json) as _len;

select 
'<![CDATA[content start-->'+ @json + ' <-- content end]]>' as colA, 
cast('drag me to the left of colA and click me' as xml) as b;

CDATA is not actually needed if colA does not contain any characters that are illegal in XML.

declare @longchar nvarchar(max) = concat('"a"', replicate(cast(concat(',"', newid(), '"') as nvarchar(max)), 10000));

select 
@longchar as colA, 
cast('drag me to the left of cola and click me' as xml) as b;
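As pointed out in the comments, when the value contains nothing that needs XML escaping, a simpler variant is to alias the column with the internal name SSMS uses for `FOR XML` results, which makes the grid render the cell as a clickable link (no extra column to drag around):

```sql
declare @longchar nvarchar(max) = concat('"a"', replicate(cast(concat(',"', newid(), '"') as nvarchar(max)), 10000));

-- Aliasing with SSMS's internal FOR XML column name turns the cell
-- into a clickable link in the results grid.
select @longchar as [XML_F52E2B61-18A1-11d1-B105-00805F49916B];
```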
lptr
  • Seems simpler to just do this rather than having to return an additional column and drag columns around https://stackoverflow.com/a/8930141/73226 – Martin Smith Dec 07 '20 at 09:21
  • Or for the case where there is nothing to escape can just do `select @longchar as [XML_F52E2B61-18A1-11d1-B105-00805F49916B]` – Martin Smith Dec 07 '20 at 09:30
  • @MartinSmith it depends on the situation. For a query that returns x longchar columns (none with breaking XML chars), either change all x columns to XML or keep the query as is and swap a single XML column around. Anything that works is just fine. – lptr Dec 07 '20 at 09:54
  • @MartinSmith for the processing-instruction, won't this break the display? `select N'?> abc<>' as [processing-instruction(x)] FOR XML PATH` – lptr Dec 07 '20 at 10:20
  • looks like it does - but that won't appear in an input of JSON documents – Martin Smith Dec 07 '20 at 10:29
  • I am not considering the print issue right now. The main problem is the storage issue; I can't store it. Thank you for the post. – Ravi Koppula Dec 07 '20 at 11:33
  • @RaviKoppula even if it were possible to define an nvarchar(300K) column, storage-wise nvarchar(max) > nvarchar(300K). If your column is indeed nvarchar(max) then it is already the maximum; it can store far more than 300K characters, so storage should not be a concern. You could check how many characters are actually stored in your table: `select len(concat(columnname, '.'))-1 as numberofchars, datalength(columnname) as bytes from tablename where x=z` – lptr Dec 07 '20 at 12:12