
I'm trying to load roughly 10,000 entries into a SQL Server table, but have found that a single INSERT statement doesn't allow more than 1,000 row values. The error is shown below:

The number of row value expressions in the INSERT statement exceeds the maximum allowed number of 1000 row values.

Currently I'm trying to run a simple SQL script inside Data Studio.

INSERT INTO [<table>] (<COLUMNS>)
VALUES (<row 1>), (<row 2>), ... (<row 10000>)  -- fails once the list exceeds 1000 rows

I was wondering whether there is any way to get around this limit within SQL Server, or whether I need a separate script that loads the data onto the server.

asked by Ryan Garvey
  • It depends on how you are inserting the records. Are you using a utility, a SQL script, Entity Framework, ADO.NET? A bit more context about what you are trying to achieve will help people answer the question. – Dawood Awan Jan 21 '22 at 10:26
  • 1000s of rows is very little data if you use the proper bulk import tools, like BULK INSERT or `bcp`. In a .NET client application you can use SqlBulkCopy. This will allow you to load millions of rows in a stream (see the BULK INSERT sketch below). – Panagiotis Kanavos Jan 21 '22 at 10:33
  • @marc_s You actually don't need to split the `INSERT` statement; you can use the following form with more than 1000 rows: `INSERT .... SELECT ... FROM (VALUES (....), (...), .....) AS v(.....)` (sketched below). But you are asking two different questions: how to get around this restriction, and how to do this fast (probably with `BULK INSERT`, bcp, or SqlBulkCopy instead). – Charlieface Jan 21 '22 at 10:49
  • Does this answer your question? [SQL speed up performance of insert?](https://stackoverflow.com/questions/7090243/sql-speed-up-performance-of-insert) – Charlieface Jan 22 '22 at 22:09
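For reference, a minimal sketch of the derived-table form Charlieface mentions: the 1,000-row cap applies only when the table value constructor feeds INSERT ... VALUES directly; used as a derived table in an INSERT ... SELECT, the same list can carry more rows. The table and column names here (dbo.MyTable, Id, Name) are placeholders, not from the question:

-- Sketch only: dbo.MyTable, Id and Name are hypothetical placeholders.
-- The same VALUES list that fails in INSERT ... VALUES works here,
-- because the table value constructor is used as a derived table.
INSERT INTO dbo.MyTable (Id, Name)
SELECT v.Id, v.Name
FROM (VALUES
         (1, N'first'),
         (2, N'second'),
         (3, N'third')   -- ...extend the list past 1,000 rows as needed
     ) AS v (Id, Name);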
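Similarly, a minimal sketch of the BULK INSERT route Panagiotis Kanavos suggests, assuming the 10,000 entries can first be exported to a delimited file; the table name, file path, and CSV layout are all assumptions:

-- Sketch only: table name, file path and CSV layout are assumptions.
BULK INSERT dbo.MyTable
FROM 'C:\data\entries.csv'
WITH (
    FIELDTERMINATOR = ',',   -- column delimiter in the file
    ROWTERMINATOR   = '\n',  -- row delimiter
    FIRSTROW        = 2,     -- skip the header row, if the file has one
    TABLOCK                  -- table lock, allows a faster minimally logged load
);

For 10,000 rows either approach works; bcp and SqlBulkCopy become more attractive as volumes grow into the millions.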

0 Answers