
I can run the following commands without any problems on my local SQL Server machine:

exec sp_configure 'show advanced options', 1
reconfigure
go

exec sp_configure 'Ad Hoc Distributed Queries', 1
reconfigure
go

exec LoadData 'C:\MyDataFile.urg';
go

But when I try to run the SP_CONFIGURE commands on Azure SQL, I get the following error:

Statement 'CONFIG' is not supported in this version of SQL Server.

And when I execute the LoadData command, I get the following error:

Cannot bulk load because the file "C:\MyDataFile.urg" could not be opened. Operating system error code (null).

The above error makes sense, since I am trying to access a file on my local machine from the Azure cloud. Is there an equivalent process to LoadData that I can follow in Azure to dump the contents of the file?

I can place the file in an Azure blob, but then what command can I execute that will work like LoadData?

Update 1: Please keep in mind two things when answering: 1) I am using a third-party file that ends with .urg and is not a CSV file.

2) When I use exec LoadData 'C:\MyDataFile.urg'; note that I am not specifying a table name for the data to go into. The LoadData command executes against the file and dumps the data into the respective tables itself. I am assuming the .urg file gets opened and executed and has commands in it that determine what data goes where.

Update 2: So my understanding was incorrect. I found out that LoadData is a stored proc the third party uses, and it takes the path to the file like this. A file on disk works great; I need to send it an Azure storage blob path.

CREATE PROCEDURE [dbo].[LoadData]
    @DataFile NVARCHAR(MAX)
AS
-- @Err was referenced below but never declared; declare it here
DECLARE @LoadSql NVARCHAR(MAX), @Err INT

-- Build the BULK INSERT dynamically so the file path can be passed in
SET @LoadSql = '
BULK INSERT UrgLoad FROM '''  + @DataFile + '''
    WITH (
        FIRSTROW = 2,
        FIELDTERMINATOR = ''~'',
        ROWTERMINATOR = ''0x0a'',
        KEEPNULLS,
        CODEPAGE = ''ACP''
    )
'

EXEC sp_executesql @LoadSql
SELECT @Err = @@ERROR

Now I need to find a way to send an Azure storage blob path to this stored proc in such a way that it can open it. I will update if I run into issues.
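For reference, my understanding is that against blob storage the same BULK INSERT needs two changes: the file path becomes relative to the blob container, and a DATA_SOURCE option points at an external data source. A rough sketch (the data source is created in Update 3; the blob name is just a sample):

    BULK INSERT UrgLoad FROM '03_06_20_16_23.urg'  -- path relative to the container
        WITH ( DATA_SOURCE = 'MyAzureBlobStorage', -- external data source, see Update 3
            FIRSTROW = 2,
            FIELDTERMINATOR = '~',
            ROWTERMINATOR = '0x0a',
            KEEPNULLS,
            CODEPAGE = 'ACP'
        )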

Update 3: Since my blob storage account is not public, I am sure I need to add an authorization piece. I added this code to the proc:

 CREATE DATABASE SCOPED CREDENTIAL MyAzureBlobStorageCredential 
     WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
     SECRET = 'sp=r&st=2020-03-10T01:04:16Z&se=2020-03-10T09:04:16Z&spr=https&sv=2019-02-02&sr=b&sig=Udxa%2FvPrUBZt09GAH4YgWd9joTlyxYDC%2Bt7j7CmuhvQ%3D';

    -- Create external data source with the URL of the Blob storage Account and associated credential since its not public
    CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
    WITH (  TYPE = BLOB_STORAGE, 
            LOCATION = 'https://dev.blob.core.windows.net/urg', 
            CREDENTIAL= MyAzureBlobStorageCredential    
    );

When I execute the proc, it says they already exist:

Msg 15530, Level 16, State 1, Procedure LoadData, Line 14 [Batch Start Line 1]
The credential with name "MyAzureBlobStorageCredential" already exists.
Msg 46502, Level 16, State 1, Procedure LoadData, Line 27 [Batch Start Line 1]
Type with name 'MyAzureBlobStorage' already exists.
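
It turns out the two CREATE statements can be guarded with existence checks and dropped first; the proc in Update 5 below folds this into dynamic SQL, but as a standalone sketch:

    -- Drop the data source before the credential it depends on
    IF EXISTS (SELECT 1 FROM sys.external_data_sources WHERE name = 'MyAzureBlobStorage')
        DROP EXTERNAL DATA SOURCE MyAzureBlobStorage;

    IF EXISTS (SELECT 1 FROM sys.database_scoped_credentials WHERE name = 'MyAzureBlobStorageCredential')
        DROP DATABASE SCOPED CREDENTIAL MyAzureBlobStorageCredential;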

When I take that out and update the BULK INSERT code like this:

DECLARE @LoadSql NVARCHAR(MAX)
SET @LoadSql = '
BULK INSERT UrjanetLoad FROM '''  + @DataFile + '''
    WITH ( DATA_SOURCE = ''MyAzureBlobStorage'',
        FIRSTROW = 2,
        FIELDTERMINATOR = ''~'',
        ROWTERMINATOR = ''0x0a'',
        KEEPNULLS,
        CODEPAGE = ''ACP''
    )
'

But it tells me:

Cannot bulk load because the file "https://dev.blob.core.windows.net/urg/03_06_20_16_23.urg" could not be opened. Operating system error code 5(Access is denied.).

I guess the question is: what am I missing in the authorization process, and how can I make it part of the stored proc so that whenever it runs, it picks up the authorization?

Update 4: This article helped with accessing a file from blob storage using credentials, dropping the external data source and scoped credential, and getting a fresh SAS token in the stored proc (the proc in Update 5 below follows this pattern), in case it can help someone else: https://social.technet.microsoft.com/wiki/contents/articles/52061.t-sql-bulk-insert-azure-csv-blob-into-azure-sql-database.aspx

Now I am getting this error:

Cannot bulk load because the file "03_06_20_16_23.urg" could not be opened. Operating system error code 32(The process cannot access the file because it is being used by another process.).

I tried this article, but it does not address the file-being-used-by-another-process issue.

Update 5: Here is what the proc looks like:

ALTER PROCEDURE [dbo].[TestLoad]
    @DataFile NVARCHAR(MAX), @SAS_Token VARCHAR(MAX), @Location VARCHAR(MAX)
AS 

BEGIN TRAN
    -- Turn on NOCOUNT to prevent message spamming
    SET NOCOUNT ON;

    DECLARE @CrtDSSQL NVARCHAR(MAX), @DrpDSSQL NVARCHAR(MAX), @ExtlDS SYSNAME, @DBCred SYSNAME, @BulkInsSQL NVARCHAR(MAX) ;

     SELECT @ExtlDS = 'MyAzureBlobStorage'
     SELECT @DBCred = 'MyAzureBlobStorageCredential'

     SET @DrpDSSQL = N'
     IF EXISTS ( SELECT 1 FROM sys.external_data_sources WHERE Name = ''' + @ExtlDS + ''' )
     BEGIN
     DROP EXTERNAL DATA SOURCE ' + @ExtlDS + ' ;
     END;

     IF EXISTS ( SELECT 1 FROM sys.database_scoped_credentials WHERE Name = ''' + @DBCred + ''' )
     BEGIN
     DROP DATABASE SCOPED CREDENTIAL ' + @DBCred + ';
     END;
     ';

     SET @CrtDSSQL = @DrpDSSQL + N'
     CREATE DATABASE SCOPED CREDENTIAL ' + @DBCred + '
     WITH IDENTITY = ''SHARED ACCESS SIGNATURE'',
     SECRET = ''' + @SAS_Token + ''';

     CREATE EXTERNAL DATA SOURCE ' + @ExtlDS + '
     WITH (
     TYPE = BLOB_STORAGE,
     LOCATION = ''' + @Location + ''' ,
     CREDENTIAL = ' + @DBCred + '
     );
     ';

    -- PRINT @CrtDSSQL
     EXEC (@CrtDSSQL);


    -- Set up the load timestamp
    DECLARE @LoadTime DATETIME, @Err INT  -- INT (not varchar) to hold @@ERROR
    SELECT @LoadTime = GETDATE()

    -- Set the bulk load command to a string and execute with sp_executesql.  
    -- This is the only way to do parameterized bulk loads 
    DECLARE @LoadSql NVARCHAR(MAX)
    SET @LoadSql = '
    BULK INSERT TestLoadTable FROM '''  + @DataFile + '''
        WITH ( DATA_SOURCE = ''MyAzureBlobStorage'',
            FIRSTROW = 2,
            FIELDTERMINATOR = ''~'',
            ROWTERMINATOR = ''0x0a'',
            KEEPNULLS,
            CODEPAGE = ''ACP''
        )
    '
    EXEC (@LoadSql);
    --EXEC sp_executesql @LoadSql
    SELECT @Err = @@ERROR

    IF @Err <> 0 BEGIN
        PRINT 'Errors with  data file ... aborting'
        ROLLBACK
        RETURN -1
    END
SET NOCOUNT OFF;
COMMIT
GO

And this is how I am trying to call it:

 EXEC TestLoad 'TestFile.csv',
'sv=2019-02-02&ss=bfqt&srt=sco&sp=rwdlacup&se=2020-03-16T02:07:03Z&st=2020-03-10T18:07:03Z&spr=https&sig=TleUPwAyEVT6dzX17fH6rq1lQQRAhIRImDHdJRKIrKE%3D', 
'https://dev.blob.core.windows.net/urg';

and here is the error

Cannot bulk load because the file "TestFile.csv" could not be opened. Operating system error code 32(The process cannot access the file because it is being used by another process.).
Errors with  data file ... aborting
– Sarah

2 Answers


If your file is placed on a public Azure Blob Storage account, you need to define an EXTERNAL DATA SOURCE that points to that account:

CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
 WITH ( TYPE = BLOB_STORAGE, LOCATION = 'https://myazureblobstorage.blob.core.windows.net');

Once you define the external data source, you can use its name in BULK INSERT and OPENROWSET.
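
For example (the table name and blob path here are placeholders):

    BULK INSERT Product
    FROM 'data/product.dat'  -- path relative to the LOCATION of the data source
    WITH ( DATA_SOURCE = 'MyAzureBlobStorage');

    -- OPENROWSET works the same way; SINGLE_BLOB reads the whole blob as one value
    SELECT * FROM OPENROWSET(
        BULK 'data/product.dat',
        DATA_SOURCE = 'MyAzureBlobStorage',
        SINGLE_BLOB) AS DataFile;

If the storage account is not public, create a database scoped credential holding a SAS token and bind it to the data source: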

CREATE MASTER KEY ENCRYPTION BY PASSWORD = 'some strong password';
CREATE DATABASE SCOPED CREDENTIAL MyAzureBlobStorageCredential
 WITH IDENTITY = 'SHARED ACCESS SIGNATURE',
 SECRET = 'sv=2015-12-11&ss=b&srt=sco&sp=rwac&se=2017-02-01T00:55:34Z&st=2016-12-29T16:55:34Z&spr=https&sig=copyFromAzurePortal';
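 -- note: the SECRET is the SAS token without the leading '?'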
CREATE EXTERNAL DATA SOURCE MyAzureBlobStorage
 WITH ( TYPE = BLOB_STORAGE,
       LOCATION = 'https://myazureblobstorage.blob.core.windows.net',
       CREDENTIAL= MyAzureBlobStorageCredential);
– David Morin
  • Thanks David for helping here, I did look up the example mentioned here: https://learn.microsoft.com/en-us/sql/relational-databases/import-export/examples-of-bulk-access-to-data-in-azure-blob-storage?view=sql-server-ver15 Two problems I have: the file format is not .csv, and the LoadData command does not specify which table the data goes into; I call it just on the database. If I follow your example and the Microsoft example, I need a .csv file format and a specified table to dump data into. – Sarah Mar 10 '20 at 01:10

From my experience and all of the Azure SQL Database documents, I can tell you that:

  1. Azure SQL Database doesn't support loading a file from an on-premises/local computer directly.


  2. Azure SQL Database doesn't support the .urg data file. I cannot find any way to support the .urg file; even Data Factory doesn't.

Reference:

  1. Limitations: only .mdf, .ldf, and .ndf files can be stored in Azure Storage by using the SQL Server Data Files in Azure feature.
  2. Data formats for import and export

Update:

I don't know if the .urg file will load successfully, but I found some ways you could try:

  1. You could upload your .urg file to Blob storage first, then reference this tutorial: Importing data from a file in Azure blob storage.
  2. Here's another blog, Bulk insert using stored procedure and Bulk insert file path as stored procedure parameter, which can help you pass the bulk insert file path as a parameter to the stored procedure 'LoadData'.

Hope this helps.

– Leon Yue
  • Thanks much for your input. Please see my latest update. – Sarah Mar 10 '20 at 06:21
  • @Sarah I updated something in my answer, hope this could be helpful for you. – Leon Yue Mar 10 '20 at 07:26
  • Thank you so very much. I think I am pretty close to solving it; I just don't know about the BULK INSERT statement where I mention the data source of blob storage, and I get errors if I add the external data source (not sure if I need to add that in the proc). Please see Update 3. – Sarah Mar 10 '20 at 17:11
  • @Sarah That goes back to the first question: Azure SQL doesn't support importing .urg files. You can test with a csv/txt file; if that succeeds, then we can prove it. I also tried the SSMS import; all failed. – Leon Yue Mar 11 '20 at 01:47
  • Thanks! At this point it seems like accessing the file from blob storage is the issue, not the type of file, as the error suggests. I tried using a csv file and a txt file; I am uploading the code I have in Update 5. All give the following error: "Cannot bulk load because the file "Test.csv" could not be opened. Operating system error code 32(The process cannot access the file because it is being used by another process.)." – Sarah Mar 11 '20 at 15:53
  • Leon, after my Update 5 I made a bit of progress: it turns out that if I take the BEGIN TRAN and COMMIT lines out, the "process cannot access the file because it is being used by another process" error goes away and the proc works. Now I don't know why BEGIN TRAN and COMMIT won't work with a file being accessed from Azure blob storage, or why starting a transaction runs into the "another process" error. – Sarah Mar 11 '20 at 17:48
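
For readers landing here: per this last comment, the fix was to run the Update 5 proc without the surrounding transaction. A minimal sketch (same assumed table, credential, and data source names as above):

    -- Same proc as Update 5, minus BEGIN TRAN/COMMIT around the
    -- credential DDL and the BULK INSERT.
    ALTER PROCEDURE [dbo].[TestLoad]
        @DataFile NVARCHAR(MAX), @SAS_Token VARCHAR(MAX), @Location VARCHAR(MAX)
    AS
    SET NOCOUNT ON;
    -- 1) Drop/recreate the scoped credential and external data source
    --    with the fresh SAS token (same dynamic SQL as in Update 5).
    -- 2) Run the load with no enclosing transaction:
    DECLARE @LoadSql NVARCHAR(MAX) = '
    BULK INSERT TestLoadTable FROM ''' + @DataFile + '''
        WITH ( DATA_SOURCE = ''MyAzureBlobStorage'',
            FIRSTROW = 2,
            FIELDTERMINATOR = ''~'',
            ROWTERMINATOR = ''0x0a'',
            KEEPNULLS,
            CODEPAGE = ''ACP''
        )';
    EXEC sp_executesql @LoadSql;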