
I am running a .NET 4.0 application that communicates with SQL Server 2008 R2 Standard Edition using LINQ to SQL. I have a table with a column of type nvarchar(max), and the application populates that column with strings as part of its operation.

For strings larger than ~30 MB, we find that once the insert/update completes, part of the string has been truncated and what is stored on the server is not the full string. The odd thing is that the strings get truncated at variable positions (we are still left with over 30 MB of data after the truncation), so there isn't a fixed cutoff point that would lead me in the direction of some size limitation (though that may still be the case).

I am not seeing any errors at insertion time, although at times I notice that during such long inserts/updates SQL Server closes the connection - but shouldn't the operation get rolled back in that case?

Would appreciate some ideas. Not sure how to proceed.

Erik Philips
user181218
  • You may need to store that as text rather than nvarchar(max). Depending on what other data you have in that row you might be bumping up against the total row width. Have a look at https://technet.microsoft.com/en-us/library/ms186981%28v=sql.105%29.aspx – LordBaconPants Apr 28 '15 at 22:52
  • @LordBaconPants the text and ntext datatypes are deprecated and should not be used for new implementations. MS clearly states that you should use varchar(max) and nvarchar(max). – Stavros Zotalis Apr 28 '15 at 22:58
  • Why would you store 30 MB of string data in a single DB column? It seems like you have a modeling issue here on the database side. Also, it seems to me your truncation issue may still be in your application. If you are using web services, there is likely a send and receive size limitation, as in the case of a WCF service for instance: http://stackoverflow.com/questions/884235/wcf-how-to-increase-message-size-quota – maplemale Apr 28 '15 at 22:59
  • Any reason not to check the size of your strings before you insert them if they are so long? Then you could split them before the insertion. – Rufus L Apr 28 '15 at 23:02
  • @maplemale I agree that the design here is poor, but I'm afraid it's a given that I have to deal with. I'll check the direction you suggested. @RufusL How am I supposed to split the string? You can only do an insert or an update, can't you? Even a concat operation would have to copy the whole thing. – user181218 Apr 28 '15 at 23:02
  • @user181218 Maybe I'm missing something about how your system works, but my idea was to take the `>30MB` string, chunk it into smaller strings, and then insert each of those smaller strings into a new row. This may require some schema change in SQL, where a table `myTable` that used to have a single `largeString` column may need a joining table `myTableLargeStrings`. Then `myTable` could remove the `largeString` column and clients would query something like `'SELECT largeString FROM myTableLargeStrings WHERE myTableLargeStrings.myTableId = myTable.Id'`. – Rufus L Apr 28 '15 at 23:35
  • Thanks @RufusL. Anyhow, right now this is not an option, plus I personally have a problem working around issues whose origin I do not understand. – user181218 Apr 28 '15 at 23:49
  • How did you determine that the string was truncated? Management Studio can be misleading if that is what you are using to make the determination. – Jason Boyd Apr 29 '15 at 02:03
  • Show us your code. – RBarryYoung Oct 17 '16 at 18:08
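As an aside on Rufus L's chunking suggestion above: a minimal sketch of the splitting step might look like the following (the `Chunk` helper and the chunk size are hypothetical; each piece would be inserted as its own row in a joining table such as `myTableLargeStrings`, keyed back to the parent record):

```csharp
using System;
using System.Collections.Generic;

static class StringChunker
{
    // Split a large string into fixed-size pieces. Each piece would become
    // its own row in a joining table, so no single column ever has to hold
    // the full >30 MB value. The final piece may be shorter than `size`.
    public static IEnumerable<string> Chunk(string s, int size)
    {
        for (int i = 0; i < s.Length; i += size)
            yield return s.Substring(i, Math.Min(size, s.Length - i));
    }
}
```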

2 Answers


I would suspect it's not SQL Server, or even the connection, that is truncating the data. I wonder if the data is being corrupted when the command parameter value is set, or even within the application itself. I've experimented with connection and command timeouts to induce failures while inserting, and your assumption that the canceled implicit transaction would be rolled back is correct. Here's an example:

// Use connection with timeout of 1 second to generate a timeout when inserting
using (var conn = new SqlConnection(@"Data Source=(localdb)\mssqllocaldb;Integrated Security=SSPI;Initial Catalog=tempdb;Connection Timeout=1"))
using (var cmd = conn.CreateCommand())
{
    cmd.CommandTimeout = 1;
    conn.Open();
    cmd.CommandText = @"if not exists(select * from sys.tables where name = 'LongStringTruncation')
                        begin
                            create table LongStringTruncation (Data nvarchar(max));
                        end";
    cmd.ExecuteNonQuery();

    cmd.CommandText = "insert LongStringTruncation values (@t)";
    var t = cmd.CreateParameter();
    t.DbType = DbType.String;
    t.ParameterName = "@t";
    t.Value = new String('A', 30000000); // 30,000,000 chars = 60MB
    cmd.Parameters.Add(t);
    cmd.ExecuteNonQuery();
}

When this succeeds (i.e. the query completes before the timeout), the following query shows that all 30,000,000 characters were transmitted and inserted; when it fails with a timeout, the table is empty.

select len(Data) from LongStringTruncation;

Note that I used SQL Server 2014 here, so it is possible there is a bug in SQL Server 2008 R2 that this test would uncover. My test also uses parameters; if you are instead using string concatenation to build your SQL, that could be another source of problems.

If neither of these explains the issue, then the only conclusion that makes sense is that the application itself is truncating the data in some way before inserting it. Given the random nature of the truncation, my first suspects would be the way you obtain the string data (e.g. a buffered stream that isn't flushed before the result is read), or some multithreaded race between producing the value and signaling that it is complete.
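As Jason Boyd's comment points out, Management Studio truncates long values in its results grid, so it is safer to verify the stored length server-side rather than eyeballing the returned value. Assuming the `LongStringTruncation` table from the example above, a query like this distinguishes a genuinely truncated row from a display artifact:

```sql
-- LEN counts characters; DATALENGTH counts bytes (2 per nvarchar character).
-- An intact 30,000,000-character string should report 30,000,000 chars
-- and 60,000,000 bytes; smaller numbers mean the stored value really is short.
select len(Data) as Chars, datalength(Data) as Bytes
from LongStringTruncation;
```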

codekaizen

I was having the same issue. I resolved it by changing the data type of the input parameters in the stored procedure.

Try checking a few things: the data type of the stored procedure's input parameter should be the same as the data type of the table column you are inserting the value into.

Also check the same in your C# code.

The data type and its length (for example, VARCHAR(50)) should be the same across the stored procedure, the table, and the code.
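A concrete, hedged illustration of the length mismatch described above: in ADO.NET, a string parameter is truncated to its declared size, and for an nvarchar(max) column the size should be declared as -1. The connection string below is a placeholder, and the table name reuses the `LongStringTruncation` example from the other answer:

```csharp
using System;
using System.Data;
using System.Data.SqlClient;

class ParameterSizeDemo
{
    static void Main()
    {
        // Placeholder connection string; adjust for your environment.
        using (var conn = new SqlConnection(
            @"Data Source=.;Integrated Security=SSPI;Initial Catalog=tempdb"))
        using (var cmd = conn.CreateCommand())
        {
            conn.Open();
            string largeString = new String('A', 30000000);

            cmd.CommandText = "insert LongStringTruncation values (@t)";
            // Size = -1 maps the parameter to nvarchar(max).
            // A fixed size (e.g. 4000) would silently truncate longer values,
            // which is one classic cause of the symptom described here.
            cmd.Parameters.Add("@t", SqlDbType.NVarChar, -1).Value = largeString;
            cmd.ExecuteNonQuery();
        }
    }
}
```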