
I use the sp_execute_external_script procedure to read data and insert it into SQL Server, and everything works, but I don't know how to use it to insert the data in chunks.

The script that loads the data:

declare @scr nvarchar(max) = 
N'
import pandas as pd
OutputDataSet = pd.read_csv(r"C:\panda\test3.zip", names = ["PersonID", "FullName", "PreferredName", "SearchName", "IsPermittedToLogon", "Age"], header = 0, compression = "zip")
'
INSERT INTO [PANDA].[dbo].[tbl_sample_csv]
EXEC sp_execute_external_script
@language = N'Python',
@script = @scr

What I would like to do is load the printed data into [PANDA].[dbo].[tbl_sample_csv] in chunks, to reduce RAM consumption on the server:

declare @scr nvarchar(max) = 
N'
import pandas as pd

for chunks in pd.read_csv(r"C:\panda\test3.zip", names = ["PersonID", "FullName", "PreferredName", "SearchName", "IsPermittedToLogon", "Age"], header = 0, compression = "zip", chunksize=500000):
    print(chunks)
'

EXEC sp_execute_external_script
@language = N'Python',
@script = @scr

Has anyone had a similar problem and can give me some tips? I tried with @input_data_1, but I don't know how to pass a SQL object to Python.
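For context, here is how the chunksize pattern behaves on its own, outside sp_execute_external_script. This is a minimal sketch that swaps the zip file for a small in-memory CSV with the same columns (the sample rows are made up for illustration); each iteration yields an ordinary DataFrame holding at most `chunksize` rows:

```python
import io
import pandas as pd

# Hypothetical stand-in for C:\panda\test3.zip: a small in-memory CSV
# with the same columns the question loads.
csv_data = io.StringIO(
    "PersonID,FullName,PreferredName,SearchName,IsPermittedToLogon,Age\n"
    "1,Alice Smith,Alice,ALICE,1,30\n"
    "2,Bob Jones,Bob,BOB,0,41\n"
    "3,Carol White,Carol,CAROL,1,25\n"
)

# With chunksize set, read_csv returns an iterator of DataFrames
# instead of one big DataFrame.
total_rows = 0
for chunk in pd.read_csv(csv_data, chunksize=2):
    total_rows += len(chunk)  # process each piece, e.g. insert it

print(total_rows)
```

Note that, as far as I can tell, print() inside the external script goes to the SQL Server messages stream, not the result set; only what is assigned to OutputDataSet comes back as rows that the surrounding INSERT can consume.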

  • Does this answer your question? [Trying to insert pandas dataframe to temporary table](https://stackoverflow.com/questions/57615642/trying-to-insert-pandas-dataframe-to-temporary-table) – nbk Nov 25 '22 at 23:00
  • not really, the script is called by the database server, so I wouldn't want to create a db engine – LordDevPath Nov 26 '22 at 09:48
  • I don't understand you; the python script creates a JSON and sends it to the database, and that is more efficient than inserting it row by row, so read the answer carefully again. I can't see where you have to create an engine – nbk Nov 26 '22 at 11:58

0 Answers