
I have written a function to upload a .csv file from the local file system to Azure Blob Storage. The function is used to upload four CSV files. Three of the files upload without any issue (their sizes range from 3 to 75MB); however, one file (the third of the four) raises the following exception: The operation did not complete (write) (_ssl.c:2158)

The problematic file can be uploaded manually from the local file system to the storage container without any issue, so the file itself does not appear to be malformed or corrupted.

The function is given below:

import os

from azure.storage.blob import BlobServiceClient

def export_to_blob(source_path, source_file_name):
    try:
        connect_str = os.getenv('AZURE_STORAGE_CONNECTION_STRING')

        # Create the BlobServiceClient object which will be used to create a container client
        blob_service_client = BlobServiceClient.from_connection_string(connect_str)

        container_name = 'data-warehouse-blob'

        # Define the container
        container_client = blob_service_client.get_container_client(container_name)

        # Create a file in local data directory to upload and download
        local_path = source_path
        local_file_name = source_file_name
        upload_file_path = os.path.join(local_path, local_file_name)

        # Create a blob client using the local file name as the name for the blob
        blob_client = blob_service_client.get_blob_client(container=container_name, blob=local_file_name)

        print("\nUploading to Azure Storage as blob:\n\t" + local_file_name)

        # Upload the created file
        with open(upload_file_path, "rb") as data:
            blob_client.upload_blob(data, overwrite=True)

    except Exception as ex:
        print('Exception:')
        print(ex)

Any explanation of this rather cryptic exception message would be gratefully received.

EDIT: I've enabled logging and the log details are as follows:
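The logging hookup itself is roughly this (a sketch following the Azure SDK for Python logging guide; the handler wiring shown here is my reconstruction, not the exact code used):

```python
import logging
import sys

# Send DEBUG-level records, including HTTP request/response traces, to stderr.
handler = logging.StreamHandler(stream=sys.stderr)
logger = logging.getLogger("azure.storage.blob")
logger.setLevel(logging.DEBUG)
logger.addHandler(handler)

# The client must also opt in to HTTP logging, e.g.:
# blob_service_client = BlobServiceClient.from_connection_string(
#     connect_str, logging_enable=True)
```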

Request URL: 'https://orrblobstorage001.blob.core.windows.net/data-warehouse-blob/DW_327_RELIABILITY.csv'
Request method: 'PUT'
Request headers:
    'Content-Type': 'application/octet-stream'
    'Content-Length': '48570614'
    'x-ms-version': '2019-07-07'
    'x-ms-blob-type': 'BlockBlob'
    'x-ms-date': 'Mon, 13 Jul 2020 13:33:00 GMT'
    'x-ms-client-request-id': '5ed935fa-c50d-11ea-8108-9061ae394c51'
    'User-Agent': 'azsdk-python-storage-blob/12.3.2 Python/3.6.8 (Windows-10-10.0.17134-SP0)'
    'Authorization': '*****'
Request body:
b'Base_Location,TOC,Criticality,Location,Location_Type,Natural_Frequency,Data_Type,Option_1,Option_2,Option_3,Option_4,Option_5,min_value,max_value,Date,value\r\n

[[snipped the data returned here]]

DU_Two,MDU_Criticality,Period,Service_Affecting_Failures,Track,Track failures (service affecting),x,x,x,2.0,19.0,2020-05-30,2.0\r\n'

When I reduced the amount of data being uploaded from 50MB to 35MB, the upload worked without any issue, yet other files at 75MB upload fine. The DEBUG log is as follows. It appears that the first request simply dies without any warning or error message, while the smaller file gets a response.

What is going on here?

Request URL: 'https://orrblobstorage001.blob.core.windows.net/data-warehouse-blob/DW_327_RELIABILITY.csv'
Request method: 'PUT'
Request headers:
    'Content-Type': 'application/octet-stream'
    'Content-Length': '36368360'
    'x-ms-version': '2019-07-07'
    'x-ms-blob-type': 'BlockBlob'
    'x-ms-date': 'Mon, 13 Jul 2020 13:53:50 GMT'
    'x-ms-client-request-id': '481fbd0c-c510-11ea-bf7c-9061ae394c51'
    'User-Agent': 'azsdk-python-storage-blob/12.3.2 Python/3.6.8 (Windows-10-10.0.17134-SP0)'
    'Authorization': '*****'
Request body:
b'Base_Location,TOC,Criticality,Loc
[[snipped the data returned here]]
x,x,x,2.0,17.0,2020-05-30,2.0\r\n'
Response status: 201
Response headers:
    'Content-Length': '0'
    'Content-MD5': 'AVRkDNBgBIdL5/bgiJQfgg=='
    'Last-Modified': 'Mon, 13 Jul 2020 13:54:08 GMT'
    'ETag': '"0x8D8273436B6BA9F"'
    'Server': 'Windows-Azure-Blob/1.0 Microsoft-HTTPAPI/2.0'
    'x-ms-request-id': '85868dcc-a01e-0092-031d-5926f6000000'
    'x-ms-client-request-id': '481fbd0c-c510-11ea-bf7c-9061ae394c51'
    'x-ms-version': '2019-07-07'
    'x-ms-content-crc64': 'YgACieEqdpI='
    'x-ms-request-server-encrypted': 'true'
    'Date': 'Mon, 13 Jul 2020 13:54:07 GMT'
Response content:

 
Greg Williams
  • Could you please provide the detailed error message? – Jim Xu Jul 07 '20 at 03:37
  • The only error message I can see is this exception message. Is there a way to see a more detailed error message? Sorry, but this is my first ever day with Azure... – Greg Williams Jul 07 '20 at 08:15
  • You can try turning on logging: https://learn.microsoft.com/en-us/azure/developer/python/azure-sdk-logging – Jim Xu Jul 07 '20 at 08:36
  • Apologies for the delay in responding; something got in the way at work before I could return to this. It seems to me that the log isn't much more illuminating, but it's fun to learn something new. – Greg Williams Jul 13 '20 at 15:36
  • According to your response status code, you have successfully uploaded the file: https://learn.microsoft.com/en-us/rest/api/storageservices/put-blob#status-code – Jim Xu Jul 14 '20 at 01:17
  • The second log is a successful upload, but the first log has no response at all. The second upload succeeds because I've reduced the size of the file by a third! What I'm trying to understand is why the upload fails when the file is 50MB but succeeds when it's 35MB, while other files at 75MB are uploaded without an issue. – Greg Williams Jul 14 '20 at 07:47
  • According to the situation, I suggest you upload the large CSV file in chunk. For more details, please refer to [my previous answer](https://stackoverflow.com/questions/62695837/failing-to-upload-larger-blobs-to-azure-azure-core-exceptions-servicerequesterr/62767062#62767062) or the [sample](https://github.com/Azure/azure-sdk-for-python/blob/17f2c17358aee79d2fa949e6470323d3b94929c8/sdk/storage/azure-storage-blob/tests/test_block_blob.py#L132) – Jim Xu Jul 14 '20 at 08:13
  • Do you have any update? – Jim Xu Jul 15 '20 at 08:16
  • My apologies. I'm on leave now until 27th July and i don't have access to my work laptop. I think your suggestion is sensible and I'll try to implementing it on my return. For the moment, the work around is to load a smaller file. Your help is appreciated. – Greg Williams Jul 16 '20 at 15:57
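Following the chunked-upload suggestion from the comments, the function could stage the file in fixed-size blocks and commit them at the end, so a stalled TLS write affects at most one block rather than the whole ~50MB body. This is a sketch against the azure-storage-blob v12 API (`stage_block` / `commit_block_list`); the 4 MiB block size is an assumption, and the container name is carried over from the question:

```python
import os
import uuid

CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB per staged block (assumed size)

def read_chunks(stream, chunk_size=CHUNK_SIZE):
    """Yield successive fixed-size chunks from a binary stream."""
    while True:
        data = stream.read(chunk_size)
        if not data:
            break
        yield data

def export_to_blob_chunked(source_path, source_file_name):
    # Imported here so the chunking helper above has no SDK dependency.
    from azure.storage.blob import BlobServiceClient, BlobBlock

    connect_str = os.getenv('AZURE_STORAGE_CONNECTION_STRING')
    blob_service_client = BlobServiceClient.from_connection_string(connect_str)
    blob_client = blob_service_client.get_blob_client(
        container='data-warehouse-blob', blob=source_file_name)

    block_list = []
    with open(os.path.join(source_path, source_file_name), "rb") as f:
        for chunk in read_chunks(f):
            # Block ids must be unique and of equal length within a blob.
            block_id = uuid.uuid4().hex
            blob_client.stage_block(block_id=block_id, data=chunk)
            block_list.append(BlobBlock(block_id=block_id))

    # Nothing is visible in the container until the block list is committed.
    blob_client.commit_block_list(block_list)
```

Each `stage_block` call is its own short PUT, so a transient network stall costs at most one block's worth of retransmission instead of failing the entire upload.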

0 Answers