
I'm triggering an Azure Function with a blob trigger. A container samples-workitems holds a file base.csv and receives a new file new.csv. I read new.csv from the trigger's InputStream and base.csv from a separate blob input binding on the same container.

import logging
from io import BytesIO

import azure.functions as func
import pandas as pd


def main(myblob: func.InputStream, base: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")
    logging.info(f"Base file info \n"
                 f"Name: {base.name}\n"
                 f"Blob Size: {base.length} bytes")

    df_base = pd.read_csv(BytesIO(base.read()))
    df_new = pd.read_csv(BytesIO(myblob.read()))
    print(df_new.head())
    print("printing base dataframe")
    print(df_base.head())

Output:

Python blob trigger function processed blob 
Name: samples-workitems/new.csv
Blob Size: None bytes
Base file info 
Name: samples-workitems/new.csv
Blob Size: None bytes
first 5 rows of df_new (cannot show data here)
printing base dataframe
first 5 rows of df_base (cannot show data here)

Both files print their own content correctly, but myblob.name and base.name report the same value, new.csv, which is unexpected: myblob.name should be new.csv while base.name should be base.csv.
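For what it's worth, the read-into-DataFrame pattern itself is sound: InputStream.read() returns raw bytes, and wrapping them in BytesIO lets pandas parse them. A self-contained sketch with synthetic bytes (no storage account needed; the payload is made up for illustration):

```python
import pandas as pd
from io import BytesIO

# Synthetic bytes standing in for what InputStream.read() would return.
payload = b"id,value\n1,10\n2,20\n"
df = pd.read_csv(BytesIO(payload))
print(df.head())
```

This is the same call shape as `pd.read_csv(BytesIO(base.read()))` above, so the name confusion is in the bindings, not in the CSV parsing.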

function.json

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": "my_storage"
    },
    {
      "type": "blob",
      "name": "base",
      "path": "samples-workitems/base.csv",
      "connection": "my_storage",
      "direction": "in"
    }
  ]
}
Shaida Muhammad

1 Answer


I reproduced this in my environment and the code below worked for me; it follows the code from @SwethaKandikonda's SO thread.

__init__.py:

import logging
from azure.storage.blob import BlockBlobService  # legacy azure-storage-blob 2.x SDK
import azure.functions as func


def main(myblob: func.InputStream):
    logging.info(f"Python blob trigger function processed blob \n"
                 f"Name: {myblob.name}\n"
                 f"Blob Size: {myblob.length} bytes")

    file = ""
    fileContent = ""
    blob_service = BlockBlobService(account_name="rithwikstor", account_key="EJ7xCyq2+AStqiar7Q==")
    containername = "samples-workitems"
    # List every blob in the container and fetch each one by its own name,
    # so base.csv and new.csv are read independently of the trigger blob.
    generator = blob_service.list_blobs(container_name=containername)
    for blob in generator:
        file = blob_service.get_blob_to_text(containername, blob.name)
        logging.info(blob.name)
        logging.info(file.content)
        fileContent += blob.name + '\n' + file.content + '\n\n'
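Note that BlockBlobService comes from the legacy azure-storage-blob 2.x SDK; in the current v12 SDK the equivalent surface is ContainerClient.list_blobs() plus download_blob(). A minimal sketch of the same aggregation loop, written against any client object exposing those two methods (the client construction itself is not shown and is an assumption, e.g. BlobServiceClient.from_connection_string(...).get_container_client("samples-workitems")):

```python
def collect_blob_texts(container_client):
    """Concatenate '<name>\\n<content>' for every blob, like the loop above.

    container_client is assumed to match the azure-storage-blob v12
    ContainerClient surface: list_blobs() yields items with a .name, and
    download_blob(name) returns a downloader whose readall() gives bytes.
    """
    parts = []
    for blob in container_client.list_blobs():
        data = container_client.download_blob(blob.name).readall()
        parts.append(f"{blob.name}\n{data.decode('utf-8')}")
    return "\n\n".join(parts)
```

Keeping the function parameterized on the client also makes it easy to unit-test with a stub, without touching a real storage account.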

function.json:

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "myblob",
      "type": "blobTrigger",
      "direction": "in",
      "path": "samples-workitems/{name}",
      "connection": "rithwikstor_STORAGE"
    }
  ]
}

local.settings.json:

{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "Connection String Of storage account",
    "FUNCTIONS_WORKER_RUNTIME": "python",
    "rithwikstor_STORAGE": "Connection String Of storage account"
  }
}

host.json:

{
  "version": "2.0",
  "logging": {
    "applicationInsights": {
      "samplingSettings": {
        "isEnabled": true,
        "excludedTypes": "Request"
      }
    }
  },
  "extensionBundle": {
    "id": "Microsoft.Azure.Functions.ExtensionBundle",
    "version": "[3.*, 4.0.0)"
  },
  "concurrency": {
    "dynamicConcurrencyEnabled": true,
    "snapshotPersistenceEnabled": true
  }
}

Then I added blobs to the container as below:

(screenshot of the uploaded blobs)

Output:

(screenshots of the logged blob names and contents)

Please follow the above process and code, and you will get the correct output as I did.

RithwikBojja