
Having generated a file in Data Factory (using a Copy activity where the source dataset is a SQL stored procedure and the sink is Azure Blob Storage), I am trying to follow the answer to this question to copy the file into an AWS S3 bucket. I have the Azure Function working fine, but it requires the URI-with-SAS-token of the file to transfer.

How do I get the SAS Token in the ADF pipeline, please? (Alternatively, if you can suggest a better way of copying a file from Blob Storage to S3, please do.)

GreatApe
1 Answer


You can store the SAS token in a key vault and retrieve its value in the pipeline via a Web activity. The GitHub repository below has the pipeline JSON for this: https://github.com/NandanHegde15/Azure-DataFactory-Generic-Pipelines/blob/main/Get%20Secret%20From%20KeyVault/Pipeline/GetSecretFromKeyVault.json
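As a rough sketch of what that looks like (the vault and secret names below are placeholders, not taken from the linked repository), a Web activity can call the Key Vault secrets REST API directly, authenticating with the Data Factory's managed identity:

```json
{
    "name": "Get SAS Token",
    "type": "WebActivity",
    "typeProperties": {
        "url": "https://<your-key-vault-name>.vault.azure.net/secrets/<your-secret-name>?api-version=7.0",
        "method": "GET",
        "authentication": {
            "type": "MSI",
            "resource": "https://vault.azure.net"
        }
    }
}
```

For this to work, the factory's managed identity needs permission to read secrets from the vault (an access policy with Get on secrets, or the Key Vault Secrets User RBAC role). Downstream activities can then reference the secret as `@activity('Get SAS Token').output.value` and, for example, concatenate it onto the blob URI before passing it to the Azure Function. Consider setting `secureOutput` on the Web activity's policy so the token is not written to the activity output in monitoring logs.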

Nandan
  • Thank you - you've made me realise I'm making a conceptual error. The URI I can construct from the components I already have, but the SAS token isn't a property of the file as such, hence the need to get it from Key Vault. – GreatApe Nov 09 '21 at 16:40