
I have a table with 17 columns, and I have to insert rows from a particular file into this table.

The number of rows in that file is ~4,000.

I have tried the executemany() method to insert multiple rows, and I have the final column data as:

rows = [('abc','x',....),(....),..]  # 4k tuples in this list

Now I have a problem passing all of these tuples at once into the final query.

Can anybody help me pass these values to the final query?

sql = "INSERT INTO [dbo].["+tablename+"] VALUES {}".format(rows)
cursor.executemany(sql)

Update: SQL Server 2008 throws an error when inserting more than 1,000 rows at a time.

So how can I resolve this issue another way?

  • Why not query the file directly rather than trying to build an array with dynamic SQL? – Jacob H Jul 18 '17 at 14:52
  • Note that SQL Server will accept a maximum of 1,000 rows in a VALUES list; above that it will throw an error. – Pரதீப் Jul 18 '17 at 14:56
  • @JacobH Bulk insert is not my approach, as I can get random files and I need to generate the columns and table based on the file names, so I used this method. –  Jul 18 '17 at 14:57
  • @Pரதீப் Yes, that's my issue. So how can I resolve it? –  Jul 18 '17 at 14:57
  • Either do it in batches of 1,000 rows (hopefully you have a unique id in the file), or else use a method that isn't so clunky :/ (see the sketch after these comments) – Jacob H Jul 18 '17 at 14:58
  • You can use a DataTable, fill it from the file, and then use SqlBulkCopy.WriteToServer to write it to the database in one go – Mono Jul 18 '17 at 15:04
  • You can use BULK INSERT; please refer to this post: https://stackoverflow.com/questions/15242757/import-csv-file-into-sql-server – Amr Jul 18 '17 at 15:11
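Following the batching and parameter-marker suggestions in the comments above, here is a minimal sketch of one way to do this with pyodbc. The connection string and table name are placeholders, and the sample rows list stands in for the ~4,000 17-tuples parsed from the file. Because each parameter set is executed as its own single-row INSERT, the 1,000-row limit on a multi-row VALUES clause never applies; the batching only controls how often you commit.

    # Sketch only, not tested against the asker's schema: parameterized INSERT
    # plus executemany(), as suggested in the comments. Connection string and
    # table name are placeholders.
    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()

    # Stand-in for the ~4,000 17-tuples read from the file in the question.
    rows = [tuple("val_%d_%d" % (r, c) for c in range(17)) for r in range(4000)]

    tablename = "MyTable"  # placeholder; validate/whitelist real table names
    placeholders = ", ".join(["?"] * 17)  # one marker per column
    sql = "INSERT INTO [dbo].[" + tablename + "] VALUES (" + placeholders + ")"

    # executemany() runs the single-row INSERT once per tuple, so the
    # 1,000-row VALUES limit is never hit; committing per batch keeps
    # transactions a manageable size.
    batch_size = 1000
    for i in range(0, len(rows), batch_size):
        cursor.executemany(sql, rows[i:i + batch_size])
        conn.commit()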

1 Answer


You should use the XML data type in MS SQL Server 2008 to send all the data at the same time; the XML nodes can carry the values as attributes or as node values, as you need.

Gilberto
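As a rough illustration of the XML idea in this answer (a sketch under assumptions, not the answerer's code): build one XML document from the tuples on the Python side, pass it as a single parameter, and shred it on the server with nodes()/value(). The table, column names, and types below are placeholders; a real version would list all 17 columns.

    # Sketch of the XML approach, assuming pyodbc. [dbo].[MyTable] and its
    # three columns are placeholders standing in for the real 17-column table.
    import pyodbc
    import xml.etree.ElementTree as ET

    def rows_to_xml(rows):
        # Build <rows><r c1="..." c2="..." c3="..."/></rows> from the tuples.
        root = ET.Element("rows")
        for r in rows:
            ET.SubElement(root, "r", {"c1": str(r[0]), "c2": str(r[1]), "c3": str(r[2])})
        return ET.tostring(root, encoding="unicode")

    sql = """
    INSERT INTO [dbo].[MyTable] (col1, col2, col3)
    SELECT n.value('@c1', 'nvarchar(100)'),
           n.value('@c2', 'nvarchar(100)'),
           n.value('@c3', 'nvarchar(100)')
    FROM (SELECT CAST(? AS xml) AS doc) AS src
    CROSS APPLY src.doc.nodes('/rows/r') AS t(n);
    """

    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes;"
    )
    cursor = conn.cursor()
    sample_rows = [("abc", "x", "1"), ("def", "y", "2")]  # sample data
    cursor.execute(sql, rows_to_xml(sample_rows))         # one round trip for all rows
    conn.commit()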