
I wanted to know how we can run a stored procedure, which I have created in a dedicated SQL pool, from a Spark pool (Azure Synapse). Also, can we run SQL queries from a notebook to access data in the dedicated SQL pool?

darkstar

1 Answer


It is possible to do this (e.g. using an ODBC connection as described here; a rough sketch of that approach appears after the list below), but you would be better off just using a Synapse Pipeline to do the orchestration:

  1. run a Stored procedure activity which places the data you want to work with in a relevant table or storage account
  2. call a Notebook activity and read that data using the spark.read.synapsesql method, as described in detail here (a minimal sketch follows this list).
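A minimal sketch of the notebook side of step 2, assuming a Synapse Spark 3.x runtime where the built-in dedicated SQL pool connector exposes synapsesql in PySpark; the database, schema and table names below are hypothetical placeholders:

```python
# PySpark cell in a Synapse notebook; `spark` is the session the notebook provides.
# Read the table the stored procedure populated in the dedicated SQL pool.
# "MyDedicatedPool.dbo.StagedResults" is a placeholder three-part name.
df = spark.read.synapsesql("MyDedicatedPool.dbo.StagedResults")

# Work with the data in Spark, e.g. register it for Spark SQL and inspect it.
df.createOrReplaceTempView("staged_results")
df.show(10)
```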
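For completeness, if you really do want to call the stored procedure directly from the notebook, the ODBC route mentioned above looks roughly like this. This is a sketch only: it assumes pyodbc and the Microsoft ODBC Driver 17 for SQL Server are available on the Spark pool, and the server, database, credentials and procedure name are all hypothetical placeholders.

```python
# Execute a stored procedure that lives in the dedicated SQL pool
# directly from a Synapse notebook, over ODBC.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=myworkspace.sql.azuresynapse.net;"   # dedicated SQL pool endpoint (placeholder)
    "DATABASE=MyDedicatedPool;"
    "UID=sqladminuser;"
    "PWD=<password>;"                            # better: pull this from Key Vault
)

with pyodbc.connect(conn_str, autocommit=True) as conn:
    with conn.cursor() as cursor:
        # Run the stored procedure created in the dedicated SQL pool.
        cursor.execute("EXEC dbo.usp_StageResults @RunDate = ?", "2022-06-16")
```

Note this runs as plain Python on the driver node rather than as a distributed Spark job, which is another reason the pipeline pattern above is usually preferable.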

The pattern:

[Screenshot: a Synapse pipeline with a Stored procedure activity chained to a Notebook activity]

Is there a particular reason you are copying existing data from the SQL pool into Spark? I do a very similar pattern but reserve it for things I can't already do in SQL, such as sophisticated transforms, RegEx, hard maths, complex string manipulation, etc.

wBob
  • I am not copying data into Spark again. I am trying to run my stored procedure, created in the dedicated SQL pool, from a Spark notebook. I didn't get point 2 which you mentioned. – darkstar Jun 16 '22 at 13:03
  • Don’t use Spark to run stored procedures; use the Stored Proc activity in a Synapse Pipeline. Chain it together with a Notebook activity if you need to. – wBob Jun 16 '22 at 13:36