
I'm using an Azure SQL database (v12) on a VM. I have two database instances - one for staging and one for production. I'm trying to grab the data off of staging and insert it into production with the click of a button. This code works "sometimes", meaning it succeeds at random; otherwise I get back this error:

BULK COPY Commit Exception Type: {0}System.Data.SqlClient.SqlException BULK COPY Message: {0}Timeout expired. The timeout period elapsed prior to completion of the operation or the server is not responding. This failure occurred while attempting to connect to the routing destination. The duration spent while attempting to connect to the original server was - [Pre-Login] initialization=1; handshake=17; [Login] initialization=0; authentication=0; [Post-Login] complete=0;

Here is the code I'm using to accomplish this task; maybe there's a flaw that I'm not seeing. By dumping the StringBuilder out I can see that the SELECT query works and the DELETE query works, but the error is thrown when I attempt to copy the data using SqlBulkCopy. Any help would be greatly appreciated. I've already gone through a bunch of the MSDN docs with no luck: longer CommandTimeouts, a longer BulkCopyTimeout, and re-configuring ports on my firewall. Still no luck.

Resources I've used:
  • https://social.msdn.microsoft.com/Forums/en-US/1467d64f-69ae-4c1f-91a2-349fc5d514ae/sqlbulkcopy-fails-with-timeout-expired-error?forum=adodotnetdataproviders
  • https://azure.microsoft.com/nb-no/documentation/articles/sql-database-develop-direct-route-ports-adonet-v12/
  • Timeout expired with SqlBulkCopy

public static object SyncData()
{
    StringBuilder sb = new StringBuilder();
    sb.AppendLine("Internal Connection...");
    string internalConnectionString = GetConnectionString("ConnectionString");
    using (SqlConnection internalConnection = new SqlConnection(internalConnectionString))
    {
        internalConnection.Open();              
        SqlCommand selectCommand = internalConnection.CreateCommand();
        selectCommand.CommandTimeout = 180;
        try
        {
            selectCommand.CommandText = "SELECT * FROM dbo.test";
            SqlDataReader reader = selectCommand.ExecuteReader();

            sb.AppendLine("External Connection...");
            string externalConnectionString = GetConnectionString("ExternalConnectionString");
            using (SqlConnection externalConnection = new SqlConnection(externalConnectionString))
            {
                externalConnection.Open();              
                SqlCommand CRUDCommand = externalConnection.CreateCommand();
                CRUDCommand.CommandTimeout = 180;
                SqlTransaction transaction = externalConnection.BeginTransaction("test");
                CRUDCommand.Connection = externalConnection;
                CRUDCommand.Transaction = transaction;
                try
                {
                    CRUDCommand.CommandText = "DELETE FROM dbo.test";
                    sb.AppendLine("DELETE: Number of rows affected = " + CRUDCommand.ExecuteNonQuery());
                    using (SqlBulkCopy bulkCopy = new SqlBulkCopy(externalConnectionString, SqlBulkCopyOptions.KeepIdentity))
                    {
                        try
                        {
                            bulkCopy.DestinationTableName = "dbo.test";
                            bulkCopy.BatchSize = 100;
                            bulkCopy.BulkCopyTimeout = 180;
                            bulkCopy.WriteToServer(reader);

                            sb.AppendLine("Table data copied successfully");

                            transaction.Commit();
                            sb.AppendLine("Transaction committed.");
                        }
                        catch (Exception ex)
                        {
                            sb.AppendLine("BULK COPY Commit Exception Type: {0}" + ex.GetType());
                            sb.AppendLine("  BULK COPY Message: {0}" + ex.Message);
                            try
                            {
                                transaction.Rollback();
                            }
                            catch (Exception ex2)
                            {
                                sb.AppendLine("Rollback Exception Type: {0}" + ex2.GetType());
                                sb.AppendLine("  Message: {0}" + ex2.Message);
                            }
                        }
                        finally
                        {
                            reader.Close();
                        }
                    }
                }
                catch (Exception ex)
                {
                    sb.AppendLine("Commit Exception Type: {0}" + ex.GetType());
                    sb.AppendLine("  Message: {0}" + ex.Message);

                    try
                    {
                        transaction.Rollback();
                    }
                    catch (Exception ex2)
                    {
                        sb.AppendLine("Rollback Exception Type: {0}" + ex2.GetType());
                        sb.AppendLine("  Message: {0}" + ex2.Message);
                    }
                }
            }
        }
        catch (Exception ex)
        {
            sb.AppendLine("Commit Exception Type: {0}" + ex.GetType());
            sb.AppendLine("  Message: {0}" + ex.Message);
        }
    }
    return sb.ToString();
}
c22joe
  • Are both servers on the same virtual network? Did you try experimenting with the timeouts and perhaps increasing the batch size so there are not as many round trips? I have a similar process, though I am using Dapper, Reactive Extensions and Simple.Data to accomplish the same thing. In my case I just had to play around with the settings until I found the best results. I also have retry logic, which helps, but there is no silver bullet IMHO. – jcwrequests Oct 07 '16 at 19:23
  • Yes, both databases are on the same Azure server. I've tried increasing my timeouts up to 30 min, but that just makes them hang for 30 min... The table I'm copying right now is only 400 rows, so altering the batch size has had no effect for me. – c22joe Oct 07 '16 at 19:48
  • Is the process running the code on the same virtual network? – jcwrequests Oct 07 '16 at 19:51

1 Answer
When creating your SqlBulkCopy instance, you're passing the connection string externalConnectionString, which opens a second connection to the server. Your DELETE ran inside a transaction on externalConnection and still holds locks on dbo.test, so the bulk copy on the new connection blocks behind those locks until it times out.

Have you tried passing your existing connection externalConnection to the SqlBulkCopy constructor instead of the connection string?
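For reference, here is a sketch of what that change would look like, reusing the open connection and enlisting the DELETE's transaction (table name and timeout values are taken from the question's code; this can't be run without a live database, so treat it as illustrative):

```csharp
// Reuse the already-open externalConnection and enlist the same transaction
// that performed the DELETE, so the bulk copy sees (rather than blocks on)
// the locks held on dbo.test.
using (SqlBulkCopy bulkCopy = new SqlBulkCopy(
    externalConnection, SqlBulkCopyOptions.KeepIdentity, transaction))
{
    bulkCopy.DestinationTableName = "dbo.test";
    bulkCopy.BulkCopyTimeout = 180;
    bulkCopy.WriteToServer(reader);
}
transaction.Commit();
```

Because the DELETE and the bulk insert now run in the same transaction, they also commit or roll back together, which is what you want for a staging-to-production sync.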

sakai
  • Good idea, I'll try that out. – c22joe Oct 07 '16 at 19:49
  • You're so right, when I instantiated SqlBulkCopy I locked up the table due to my previous DELETE query of that same table. All I had to do was change the constructor to `using (SqlBulkCopy bulkCopy = new SqlBulkCopy(externalConnection, SqlBulkCopyOptions.KeepIdentity, transaction))` and it worked perfectly. Thank you! – c22joe Oct 07 '16 at 20:21