I wanted to know how we can run a stored procedure, which I created in a dedicated SQL pool, from a Spark pool (Azure Synapse). Also, can we run SQL queries from a notebook to access data in the dedicated SQL pool?
It is possible to do this (e.g. using an ODBC connection as described here), but you would be better off just using a Synapse Pipeline to do the orchestration:
- run a Stored procedure activity which places the data you want to work with in a relevant table or storage account
- call a Notebook activity and read that data with the `spark.read.synapsesql` method, as described in detail here.
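For the second step, a minimal sketch of what the notebook side could look like. This assumes the notebook runs on a Synapse Spark pool, where the built-in dedicated SQL pool connector exposes `spark.read.synapsesql`; the database, schema, and table names used here are placeholders, not anything from the question.

```python
# Sketch: read a dedicated SQL pool table from a Synapse Spark notebook.
# spark.read.synapsesql expects a three-part <database>.<schema>.<table> name.

def three_part_name(database: str, schema: str, table: str) -> str:
    """Build the three-part name the synapsesql reader expects."""
    for part in (database, schema, table):
        if not part or "." in part:
            raise ValueError(f"invalid identifier: {part!r}")
    return f"{database}.{schema}.{table}"

def read_pool_table(spark, database: str, schema: str, table: str):
    # Only works inside a Synapse Spark pool, where the connector is built in.
    return spark.read.synapsesql(three_part_name(database, schema, table))

# Inside the notebook (hypothetical names):
# df = read_pool_table(spark, "SQLPool01", "dbo", "StagedResults")
# df.show()
```

The stored procedure activity runs first and lands the data in a table; the notebook then reads that table rather than calling the procedure itself.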
The pattern: [diagram: Stored procedure activity chained to a Notebook activity in a Synapse Pipeline]
Is there a particular reason you are copying existing data from the SQL pool into Spark? I use a very similar pattern, but reserve it for things I can't already do in SQL, such as sophisticated transforms, RegEx, hard maths, complex string manipulation, etc.
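To illustrate the kind of work the answer suggests reserving for Spark, here is a small plain-Python sketch of string parsing that is clumsy in T-SQL; the pattern and sample labels are made up for illustration.

```python
import re

# Pulling a semantic version out of free-text labels: easy with a regex,
# awkward with T-SQL's PATINDEX/SUBSTRING.
VERSION_RE = re.compile(r"\bv(\d+)\.(\d+)\.(\d+)\b")

def extract_version(label: str):
    """Return (major, minor, patch) from a label like 'widget v2.10.3 beta'."""
    m = VERSION_RE.search(label)
    return tuple(int(g) for g in m.groups()) if m else None
```

At scale inside a Synapse notebook, the same logic would typically be expressed with `pyspark.sql.functions.regexp_extract` over a DataFrame column.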

wBob
- I am not copying data into Spark again. I am trying to run my stored procedure, created in the dedicated SQL pool, from a Spark notebook. I didn't get point 2 which you mentioned. – darkstar Jun 16 '22 at 13:03
- Don't use Spark to run stored procedures; use the Stored Proc activity in a Synapse Pipeline. Chain it together with a Notebook activity if you need to. – wBob Jun 16 '22 at 13:36
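An abridged sketch of what that chained pipeline could look like in Synapse pipeline JSON, assuming a `SqlPoolStoredProcedure` activity followed by a `SynapseNotebook` activity; every name here (pipeline, pool, procedure, notebook) is a placeholder.

```json
{
  "name": "PL_StageThenTransform",
  "properties": {
    "activities": [
      {
        "name": "StageData",
        "type": "SqlPoolStoredProcedure",
        "sqlPool": { "referenceName": "SQLPool01", "type": "SqlPoolReference" },
        "typeProperties": { "storedProcedureName": "dbo.usp_StageResults" }
      },
      {
        "name": "TransformInSpark",
        "type": "SynapseNotebook",
        "dependsOn": [
          { "activity": "StageData", "dependencyConditions": [ "Succeeded" ] }
        ],
        "typeProperties": {
          "notebook": { "referenceName": "NB_Transform", "type": "NotebookReference" }
        }
      }
    ]
  }
}
```

The `dependsOn` entry makes the notebook run only after the stored procedure succeeds, which is the chaining described in the comment above.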