I have an Azure Data Factory pipeline that runs on a Blob Created event trigger. I want it to grab the last blob added and copy that blob to the desired location.

How do I dynamically generate the file path for this outcome?

ratchet

1 Answer


"@triggerBody().folderPath" and "@triggerBody().fileName" captures the last created blob file path in event trigger. You need to map your pipeline parameter to these two trigger properties. Please follow this link to do the parameter passing and reference. Thanks.

Wang Zhang
  • Activity Applicant Blob to Applicant Table failed: 'The template language expression 'triggerBody().fileName' cannot be evaluated because property 'fileName' doesn't exist, available properties are 'DataFactory, Pipeline, RunId, RunToken, TriggerType, TriggerId, TriggerName, TriggerTime, TriggerCallbackUri, RunAttempt, parameters'. – ratchet Sep 17 '18 at 17:20
  • ^ I have the Trigger watching the correct Blob path, path starting with and ending with appropriate "data/{@triggerBody().fileName}.csv" < not actually what the trigger setup looks like but is representative... and @triggerBody().fileName is not available? – ratchet Sep 17 '18 at 17:21
  • Hi, please don't reference "@triggerBody().fileName" or "@triggerBody().folderPath" directly in the dataset. You should: 1. Define dataset parameters and reference them in the dataset's fileName/folderPath; 2. Define pipeline parameters and pass them to the dataset parameters; 3. Pass "@triggerBody().folderPath" and "@triggerBody().fileName" to the pipeline parameters when the trigger starts the pipeline run. This link: https://learn.microsoft.com/en-us/azure/data-factory/how-to-create-event-trigger#map-trigger-properties-to-pipeline-parameters gives you more details. – Wang Zhang Sep 18 '18 at 02:16