I'm doing a very simple bulk insert:
CREATE EXTERNAL DATA SOURCE mysource WITH ( TYPE = BLOB_STORAGE, LOCATION = 'https://xxxxxxx.blob.core.windows.net/zzzz');
BULK INSERT mytable FROM 'myfile.csv'
WITH (DATA_SOURCE = 'mysource',FORMAT='CSV',CODEPAGE = 65001,FIRSTROW = 2,TABLOCK,ROWTERMINATOR = '0x0a');
Which throws:
Msg 64, Level 20, State 0, Line 0 A transport-level error has occurred when receiving results from the server. (provider: TCP Provider, error: 0 - The specified network name is no longer available.)
More information:
- I'm using Microsoft SQL Server Management Studio 17.5
- Database is in Azure
- Storage is in Azure, and the blob has public access
- mytable is large: it currently holds 3,942,767 rows
myfile.csv contains only two rows (a header plus one data row):
id,rbd,run,foo
"aaaabbbbb",4,0,5
In the past, this arrangement (storage + bulk insert) worked fine. Maybe this is happening because the table is too large?
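In case authentication turns out to matter, one variant I have not tried yet is to access the container through a SAS credential instead of relying on public blob access. This is just a sketch: the credential name, data source name, and SAS token below are placeholders, not values from my setup.

```sql
-- Untested variant: use a database-scoped SAS credential instead of
-- public blob access. The SECRET value is a placeholder; a real SAS
-- token goes there, without the leading '?'.
CREATE DATABASE SCOPED CREDENTIAL mycredential
WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sv=...';

CREATE EXTERNAL DATA SOURCE mysource_sas WITH (
    TYPE = BLOB_STORAGE,
    LOCATION = 'https://xxxxxxx.blob.core.windows.net/zzzz',
    CREDENTIAL = mycredential
);

-- Same BULK INSERT as above, pointed at the credentialed data source.
BULK INSERT mytable FROM 'myfile.csv'
WITH (DATA_SOURCE = 'mysource_sas', FORMAT = 'CSV', CODEPAGE = 65001,
      FIRSTROW = 2, TABLOCK, ROWTERMINATOR = '0x0a');
```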
Notice:
This question has been flagged as a possible duplicate of this: A transport-level error has occurred when receiving results from the server - but the two are very different. In that post, the error comes from a .NET application; here it happens when SSMS sends the command directly to SQL Server. Also, it occurs exclusively with BULK INSERT; normal inserts work fine.