
Is it possible to execute a BULK INSERT statement in SQL Server using an in-memory buffer from Python (e.g. df.to_csv(buffer), where df is a pandas DataFrame and buffer is an in-memory file object such as io.StringIO) instead of a file path:

bulk insert #temp
from 'file_path'        -- instead of a file path, use a buffer?
with (
        fieldterminator = ',',
        rowterminator = '\n'
     );

The only way I can think of doing this is to have Python write a CSV file locally and then have SQL Server execute a BULK INSERT against that file path. Is there a way to do this without storing the file locally, i.e. passing the data directly between Python and SQL Server?
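One commonly suggested alternative (a sketch, not an exact equivalent of BULK INSERT): serialize the DataFrame into an in-memory io.StringIO buffer, read the rows back with the csv module, and pass them to the driver with executemany. The table name and columns (#temp with columns a and b) and the use of pyodbc with fast_executemany are assumptions for illustration:

```python
import csv
import io

import pandas as pd


def dataframe_to_rows(df: pd.DataFrame) -> list[tuple]:
    """Round-trip the frame through an in-memory CSV buffer and return
    tuples suitable for cursor.executemany() -- no file on disk."""
    buffer = io.StringIO()
    df.to_csv(buffer, index=False, header=False)
    buffer.seek(0)
    return [tuple(row) for row in csv.reader(buffer)]


def insert_rows(cursor, rows):
    # Hypothetical target table #temp(a, b). pyodbc's fast_executemany
    # batches the parameter sets, approximating a bulk load.
    cursor.fast_executemany = True
    cursor.executemany("INSERT INTO #temp (a, b) VALUES (?, ?)", rows)


df = pd.DataFrame({"a": [1, 2], "b": ["x", "y"]})
rows = dataframe_to_rows(df)
```

Note that csv.reader yields strings, so values arrive as text and rely on SQL Server's implicit conversion (or an explicit cast) at the target table.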

  • No, it has to be a file: https://learn.microsoft.com/en-us/sql/t-sql/statements/bulk-insert-transact-sql?view=sql-server-ver16 – nbk Jun 22 '23 at 21:11
  • Send it as JSON, e.g. https://stackoverflow.com/questions/60745932/update-sql-server-database-using-stored-procedure-with-table-as-paramater-using/60746532#60746532 – David Browne - Microsoft Jun 22 '23 at 21:14
  • @DavidBrowne-Microsoft I think we will get the data in faster by saving the CSV and running a BULK INSERT – nbk Jun 22 '23 at 22:00
  • At some size, probably. But there's not a simple pure-python way to do that. BULK INSERT requires the file to be visible to the SQL Server. So you would need to use BCP or similar. – David Browne - Microsoft Jun 22 '23 at 23:39
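The comments converge on the constraint the asker already suspected: BULK INSERT requires a file path visible to the SQL Server process. A minimal sketch of that fallback, assuming a temporary file the server can reach and a hypothetical #temp target table (the actual execution against a live connection is omitted):

```python
import os
import tempfile

import pandas as pd


def write_temp_csv(df: pd.DataFrame) -> str:
    """Write the frame to a temporary CSV file and return its path.
    The path must be reachable from the SQL Server process, not just
    the client machine."""
    fd, path = tempfile.mkstemp(suffix=".csv")
    with os.fdopen(fd, "w", newline="") as f:
        df.to_csv(f, index=False, header=False)
    return path


def bulk_insert_sql(path: str) -> str:
    # Build the BULK INSERT statement from the question, pointed at the
    # temporary file instead of a hard-coded 'file_path'.
    return (
        f"BULK INSERT #temp FROM '{path}' "
        "WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n');"
    )


df = pd.DataFrame({"a": [1], "b": ["x"]})
path = write_temp_csv(df)
sql = bulk_insert_sql(path)
# cursor.execute(sql) would then run on a connection where the server
# can see `path`; remember to os.remove(path) afterwards.
```

If the client and server are different machines, the file would need to live on a share the server account can read, which is why the comments point to BCP or a table-valued/JSON parameter as the practical alternatives.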

0 Answers