
I have an Azure Function that triggers from a directory (namespace) nested within an ADLS Gen 2 storage container.

Example: ADLS_AccountName/topLevelContainer/Directory1/Directory2/{name}

Unfortunately, the Function requires the Connection string of the entire DataLake as an input binding (stored as an app setting in local.settings.json).

This is far too much permission for a Function to have.

How do I generate a SAS token for a given nested namespace and use that as the app setting for the Function?
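For context, the trigger's connection setting lives in local.settings.json, roughly like the sketch below. The setting name DataLakeConnection and all values are placeholders, not my actual configuration:

```json
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "UseDevelopmentStorage=true",
    "FUNCTIONS_WORKER_RUNTIME": "dotnet",
    "DataLakeConnection": "DefaultEndpointsProtocol=https;AccountName=ADLS_AccountName;AccountKey=...;EndpointSuffix=core.windows.net"
  }
}
```

I want to replace that full account connection string with something scoped to Directory2 only.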

sschmeck
ericOnline
  • Please see my comment below your proposed answer. Unfortunately, there is no way an ACL setting can be used as an Azure Function input binding. Also, granting ACL perms on a *nested* directory grants permissions on all *parent* directories of the nest. This doesn't work for my use case. I imagine there are a lot of people who need this functionality (they just don't know it yet). Could be very powerful if implemented. – ericOnline Aug 28 '20 at 17:19

3 Answers


The answer marked as correct is no longer accurate. Please see https://learn.microsoft.com/en-us/dotnet/api/azure.storage.sas.datalakesasbuilder?view=azure-dotnet — starting with service version 2020-02-10, it is possible to generate a SAS token for a directory in an ADLS Gen2 account.
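To make the shape of such a token concrete, here is a minimal sketch of the query parameters a directory-scoped SAS carries. The SDKs (e.g. DataLakeSasBuilder in .NET, or generate_directory_sas in Python) compute the HMAC-SHA256 signature for you; the signature argument here is a placeholder, and the helper itself is illustrative, not part of any Azure SDK:

```python
from urllib.parse import urlencode

def directory_sas_query(permissions: str, expiry: str, depth: int, signature: str) -> str:
    """Assemble the query string of a directory-scoped SAS token.

    A real token's ``sig`` is an HMAC-SHA256 over the string-to-sign,
    computed by the SDK from the account key; here it is a placeholder.
    """
    params = {
        "sv": "2020-02-10",  # first service version supporting directory SAS
        "sr": "d",           # signed resource: directory (not container or blob)
        "sdd": str(depth),   # signed directory depth: segments below the container
        "sp": permissions,   # e.g. "r" for read
        "se": expiry,        # expiry time, ISO 8601
        "sig": signature,    # placeholder for the computed signature
    }
    return urlencode(params)

# Directory1/Directory2 sits two levels below the container, so sdd=2.
print(directory_sas_query("r", "2021-01-01T00:00:00Z", 2, "<signature>"))
```

The resulting query string (with the placeholder signature) can then be stored as the Function's app setting in place of the account connection string.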

Ashish
  • Do you have a reference for the Python SDK equivalent? – ericOnline Apr 28 '21 at 14:09
  • @ericOnline, please check [generate_directory_sas](https://learn.microsoft.com/en-us/python/api/azure-storage-file-datalake/azure.storage.filedatalake?view=azure-python#azure-storage-filedatalake-generate-directory-sas). – sschmeck Mar 29 '22 at 09:05

Unfortunately, it is not possible to create a SAS token for a specific folder in an ADLS Gen2 storage account. But you can leverage Access Control Lists (ACLs) to grant permission to a specific file or directory.

You can associate a security principal with an access level on your directories and files from your application. (Note: ACLs apply only to security principals in the same tenant.)

If you are granting permissions by using only ACLs (no RBAC), then to grant a security principal read or write access to a folder, you'll need to give the security principal Execute permission on the container and on each folder in the hierarchy leading to the target folder or file.
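A minimal illustration of that rule, using the example path from the question (the helper is illustrative only, not an Azure SDK call): every ancestor needs at least Execute (--x) so the principal can traverse it, while only the target directory needs Read + Execute (r-x).

```python
def required_acls(path: str) -> dict:
    """For ACL-only access (no RBAC), list the minimum POSIX permission
    each path segment needs so a principal can read the leaf directory."""
    segments = path.strip("/").split("/")
    acls = {}
    prefix = ""
    for i, seg in enumerate(segments):
        prefix = f"{prefix}/{seg}" if prefix else seg
        # Ancestors only need traversal; the leaf needs read + traversal.
        acls[prefix] = "r-x" if i == len(segments) - 1 else "--x"
    return acls

print(required_acls("topLevelContainer/Directory1/Directory2"))
# {'topLevelContainer': '--x',
#  'topLevelContainer/Directory1': '--x',
#  'topLevelContainer/Directory1/Directory2': 'r-x'}
```

Note this is also why ACLs alone cannot confine access to a single nested directory: the traversal permissions on the parents are unavoidable.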

Here is an example that gets and sets the ACL of a directory named my-directory. The string user::rwx,group::r-x,other::rw- gives the owning user read, write, and execute permissions; gives the owning group only read and execute permissions; and gives all others read and write permissions.

public async Task ManageDirectoryACLs(DataLakeFileSystemClient fileSystemClient)
{
    DataLakeDirectoryClient directoryClient =
        fileSystemClient.GetDirectoryClient("my-directory");

    // Read the directory's current ACL and print each entry.
    PathAccessControl directoryAccessControl =
        await directoryClient.GetAccessControlAsync();

    foreach (var item in directoryAccessControl.AccessControlList)
    {
        Console.WriteLine(item.ToString());
    }

    // Parse a POSIX-style ACL string and apply it to the directory.
    IList<PathAccessControlItem> accessControlList =
        PathAccessControlExtensions.ParseAccessControlList(
            "user::rwx,group::r-x,other::rw-");

    await directoryClient.SetAccessControlListAsync(accessControlList);
}

For more details, you could refer to this article.

Joey Cai
  • There are many moving parts! RBAC on the DataLake, ConnectionString/AccountKeys for the DataLake, ACL's for the directories, SAS tokens for top-level directory of the DataLake ("Container") and individual blobs. Unfortunately, I can't find a way to dial these in so that 1.) the Function is ONLY *monitoring* a single directory within the DataLake ("trigger binding") AND 2.) only has access to this single directory. The ACL's you mention *almost* solve #2, but because RWX on a parent directory gives access to ALL child directories, this doesn't work in my use case. Need a single directory scope. – ericOnline Aug 22 '20 at 00:21
  • Ended up using a separate, dedicated ADLS Gen2 Storage Account and keeping SAS tokens scoped to the Container level. – ericOnline Oct 09 '20 at 03:14

Directory-scoped SAS tokens are supported since service version 2020-02-10; see Directory scoped shared access signatures (SAS) generally available.

The Python package azure-storage-file-datalake provides the function generate_directory_sas.

For TypeScript/JavaScript, check the package @azure/storage-file-datalake and its function generateDataLakeSASQueryParameters, setting { isDirectory: true } in DataLakeSASSignatureValues.

generateDataLakeSASQueryParameters(
  {
    pathName: 'my/folder',
    permissions: DirectorySASPermissions.parse('r'),
    isDirectory: true,
    ...,
    version: '2020-02-10'
  }, ...);
sschmeck