
Is it possible to use an Azure Synapse pipeline to execute a set of files as SQL scripts?

I have a set of files, each of which contains the SQL to create, say, a stored procedure in my Azure Synapse dedicated SQL pool. I have saved these SQL files in an Azure Data Lake container.

How can I execute each of these SQL files?

I created an Azure Synapse pipeline:

a Get Metadata activity, which returns each filename in the Data Lake container

the output from above is sent into a ForEach, which gains access to each of the filenames

To execute each of these files, the ForEach calls a Script activity; however, the Azure Synapse pipeline web UI does not make it apparent how, or even whether, it's possible to use this process flow to execute files.

When I run the pipeline, the ForEach has Settings -> Items -> @activity('GetMetadata01').output.childItems

The Script activity's linked service points to my Synapse pool; Script -> NonQuery -> @item()

I get this error message per invocation of the Script activity:

"message": "Failed to convert the value in 'text' property to 'System.String' type. Please make sure the payload structure and value are correct.",
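The error is consistent with what Get Metadata's childItems actually emits: a list of objects (each with name and type fields), not strings. A rough illustration in Python of why binding @item() to the Script activity's 'text' property fails (the object shapes below are a sketch of the standard childItems output, not captured pipeline output):

```python
# Approximate shape of @activity('GetMetadata01').output.childItems:
# a list of objects, each describing one file in the container.
child_items = [
    {"name": "create_proc_a.sql", "type": "File"},
    {"name": "create_proc_b.sql", "type": "File"},
]

# Inside the ForEach, @item() corresponds to one whole object, not a string,
# which is why converting it to the 'text' (System.String) property fails.
item = child_items[0]
print(type(item).__name__)   # dict, not str

# @item().name would at least be a string -- but it is only the filename,
# not the SQL inside the file, so the Script activity still has nothing to run.
print(item["name"])          # create_proc_a.sql
```

So even after switching to @item().name, you would be passing the filename to the Script activity rather than the file's contents.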

Any advice?

Scott Stensland

1 Answer


I don't think it can be done with plain text files in a Synapse pipeline. However, I thought of two workarounds:

  1. Use JSON files to host your SQL objects, and read these JSONs as the source of the Script activity.
  2. Use the DMV sys.sql_modules (or another DMV, per your needs) to get the code you want to run into a Lookup activity, and then use it as the source for the Script activity.
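For the first workaround, a minimal sketch of the pre-processing step: wrap each .sql file as a small JSON document so a Lookup activity can read it and a dynamic expression can feed the text into the Script activity. The wrapper field name "query" is my own choice here, not something the Lookup or Script activity requires:

```python
import json
from pathlib import Path

def wrap_sql_as_json(src_dir: str, dst_dir: str, field: str = "query") -> list:
    """Wrap each .sql file in src_dir as a one-field JSON document in dst_dir.

    A Lookup activity pointed at one of these JSON files could then supply
    the SQL text via an expression along the lines of
    @activity('Lookup1').output.firstRow.query (expression is illustrative).
    """
    dst = Path(dst_dir)
    dst.mkdir(parents=True, exist_ok=True)
    written = []
    for sql_file in sorted(Path(src_dir).glob("*.sql")):
        doc = {field: sql_file.read_text()}          # whole script as one string
        target = dst / (sql_file.stem + ".json")
        target.write_text(json.dumps(doc))
        written.append(target)
    return written
```

You would run something like this once (locally or in a notebook) against the scripts before uploading the JSON files to the lake; the pipeline's ForEach then iterates the JSON files instead of raw .sql files.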
Chen Hirsh