0

I'm trying to read multiple files of the same type recursively from a container in Azure Blob Storage, in Python with a Function App. How can that be done using the bindings in the orchestrator's function.json, as shown below? What changes should be made in local settings, given that I've already specified the connection strings and blob paths there?

{
  "scriptFile": "__init__.py",
  "bindings": [
    {
      "name": "context",
      "type": "orchestrationTrigger",
      "direction": "in"
    },
    {
      "name": "inputblob",
      "type": "blob",
      "dataType": "string",
      "path": "test/{file_name}.pdf{queueTrigger}",
      "connection": "CONTAINER_CONN_STR",
      "direction": "in"
    }
  ]
}

*test : the container directory I have.

CONTAINER_CONN_STR : the connection string I've already specified.
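(For context, the "normal method without binding" mentioned below can be sketched with the azure-storage-blob v12 SDK; the container name "test" comes from the question, while the helper names here are illustrative:)

```python
# Sketch of reading all PDFs from a container without bindings,
# using the azure-storage-blob v12 SDK. Helper names are illustrative.
def pdf_blob_names(blob_names, prefix=""):
    # Pure helper: keep only .pdf blobs under the given prefix.
    # Blob storage is flat, so a prefix filter is effectively recursive.
    return [n for n in blob_names
            if n.startswith(prefix) and n.lower().endswith(".pdf")]

def read_pdfs(conn_str, container="test", prefix=""):
    # Imported lazily so the pure helper above works without the SDK installed.
    from azure.storage.blob import ContainerClient
    client = ContainerClient.from_connection_string(
        conn_str, container_name=container)
    names = [b.name for b in client.list_blobs(name_starts_with=prefix)]
    for name in pdf_blob_names(names, prefix):
        yield name, client.download_blob(name).readall()
```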

Also, the normal method (without bindings) gives the error below while downloading the files to the local system:

Exception: PermissionError: [Errno 13] Permission denied: 'analytics_durable_activity/'

Stack: File "C:\Program Files\Microsoft\Azure Functions Core Tools\workers\python\3.8\WINDOWS\X64\azure_functions_worker\dispatcher.py", line 271, in _handle__function_load_request

func = loader.load_function(

Kartika

2 Answers

0

how can that be done using the bindings in the orchestrator's function.json as shown below? What changes should be made in local settings

The configuration that you have used looks good. For more information, you can refer to this Example.
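As a rough sketch (my own, not from the linked example), a bound blob arrives in __init__.py as a plain parameter whose name matches the binding's "name" field; for instance, with dataType "string":

```python
# Hypothetical __init__.py sketch: the parameter name `inputblob` must
# match the "name" field of the blob binding in function.json.
def main(context: str, inputblob: str) -> str:
    # With dataType "string", the runtime hands over the blob content as text.
    return f"read {len(inputblob)} characters from the bound blob"
```

One caveat worth checking: Durable orchestrator functions are documented to support only the orchestration trigger binding, so extra input bindings like this are usually consumed from an activity or plain function instead.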

Also, when doing so, in normal method without binding, gives error when downloading the files to local system as given below:

You might get this error when you try to open a file but the path you pass is a folder, or when you don't have the required permissions.

You can refer to this SO thread which discusses a similar issue.
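A common fix is to make sure you pass a file path (not a folder) and write under the temp directory, since the Functions sandbox restricts writes elsewhere. A minimal sketch with illustrative names:

```python
import os
import tempfile

def local_target(blob_name, base_dir=None):
    # Turn a blob name like "analytics_durable_activity/report.pdf" into a
    # writable local *file* path, creating its parent folders first.
    base_dir = base_dir or tempfile.gettempdir()
    path = os.path.join(base_dir, *blob_name.split("/"))
    os.makedirs(os.path.dirname(path), exist_ok=True)
    return path
```

Passing the result of `local_target(...)` to `open(..., "wb")` avoids handing `open` a bare directory such as 'analytics_durable_activity/', which is what produces errno 13 on Windows.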

REFERENCES: Set, View, Change, or Remove Permissions on Files and Folders | Microsoft Docs

SwethaKandikonda
  • I agree, I used to get this error when opening files, but here I wasn't trying to open one; it was a path, which is why I asked. – Kartika Sep 23 '21 at 05:36
0

You can keep the state of the trigger in a durable entity and check it every time the function is triggered. The function then processes a file only when the state matches, i.e. the previous file has been received but not yet processed.

Please refer to https://learn.microsoft.com/en-us/azure/azure-functions/durable/durable-functions-overview?tabs=csharp, Pattern #6: Aggregator (stateful entities).
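A hedged sketch of that aggregator idea: keep counters in an entity and only process when a file has been received but not yet processed. The operation names and state shape below are my own, not from the docs:

```python
def should_process(state):
    # Pure decision: a file was received but not yet processed.
    return state.get("received", 0) > state.get("processed", 0)

def entity_function(context):
    # `context` would be an azure.durable_functions.DurableEntityContext
    # once this is registered with df.Entity.create(entity_function).
    state = context.get_state(lambda: {"received": 0, "processed": 0})
    if context.operation_name == "received":
        state["received"] += 1
    elif context.operation_name == "processed":
        state["processed"] += 1
    context.set_state(state)
    context.set_result(should_process(state))
```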