Questions tagged [bulkinsert]

The act of inserting multiple rows into a database in a single operation.

The purpose is to speed up loading of large amounts of data into a database. Depending on the database or database drivers used, the data is generally transferred and committed to the database in large groups of rows instead of one row at a time.

Also include the appropriate database tag and, where relevant, a tag for the access method or driver being used.

2493 questions
350
votes
12 answers

What's the fastest way to do a bulk insert into Postgres?

I need to programmatically insert tens of millions of records into a Postgres database. Presently, I'm executing thousands of insert statements in a single query. Is there a better way to do this, some bulk insert statement I do not know about?
Ash
  • 24,276
  • 34
  • 107
  • 152
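
A common answer here is PostgreSQL's COPY, which streams all rows in one command instead of parsing thousands of INSERT statements. A minimal psycopg2 sketch, assuming a hypothetical items(id, name) table and connection string:

    import io
    import psycopg2

    conn = psycopg2.connect("dbname=test")  # connection string is an assumption

    records = [(1, "alpha"), (2, "beta"), (3, "gamma")]

    # Build a tab-separated stream in memory and hand it to COPY.
    buf = io.StringIO()
    for rec_id, name in records:
        buf.write(f"{rec_id}\t{name}\n")
    buf.seek(0)

    with conn, conn.cursor() as cur:
        cur.copy_expert("COPY items (id, name) FROM STDIN", buf)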
285
votes
7 answers

How to speed up insertion performance in PostgreSQL

I am testing Postgres insertion performance. I have a table with a single column of numeric type, with an index on it as well. I filled the database up using this query: insert into aNumber (id) values (564),(43536),(34560) ... I…
Luke101
  • 63,072
  • 85
  • 231
  • 359
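
The usual levers for a load like this are dropping the index for the duration of the load, folding many rows into each statement, and committing once. A sketch with psycopg2's execute_values against the aNumber table from the question (the index name is an assumption):

    import psycopg2
    from psycopg2.extras import execute_values

    conn = psycopg2.connect("dbname=test")  # connection string is an assumption
    rows = [(n,) for n in range(1_000_000)]

    with conn, conn.cursor() as cur:
        cur.execute("DROP INDEX IF EXISTS anumber_id_idx")  # hypothetical index name
        # execute_values folds thousands of rows into each INSERT statement.
        execute_values(cur, "INSERT INTO aNumber (id) VALUES %s", rows, page_size=10_000)
        # Rebuilding the index once is far cheaper than maintaining it per row.
        cur.execute("CREATE INDEX anumber_id_idx ON aNumber (id)")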
238
votes
14 answers

Import CSV file into SQL Server

I am looking for help importing a .csv file into SQL Server using BULK INSERT, and I have a few basic questions. Issues: the CSV data may contain commas inside field values (e.g., in a description field), so how can I make the import handle that data? If the client…
Prabhat
  • 4,164
  • 4
  • 17
  • 12
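
A basic BULK INSERT only splits on the field terminator, so commas embedded in a description field will break it; handling those needs SQL Server 2017's FORMAT = 'CSV' option or a format file (see the partially-quoted-CSV question further down). A minimal sketch via pyodbc, with the DSN, table, and file path all assumptions:

    import pyodbc

    conn = pyodbc.connect("DSN=mssql;Trusted_Connection=yes")  # DSN is an assumption

    sql = """
    BULK INSERT dbo.ImportTarget       -- hypothetical table
    FROM 'C:\\data\\input.csv'         -- hypothetical server-visible path
    WITH (
        FIRSTROW = 2,                  -- skip the header row
        FIELDTERMINATOR = ',',
        ROWTERMINATOR = '\\n'
    );
    """
    with conn:  # pyodbc commits on clean exit from the block
        conn.execute(sql)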
212
votes
7 answers

How do I temporarily disable triggers in PostgreSQL?

I'm bulk loading data and can re-calculate all trigger modifications much more cheaply after the fact than on a row-by-row basis. How can I temporarily disable all triggers in PostgreSQL?
David Schmitt
  • 58,259
  • 26
  • 121
  • 165
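
Two standard approaches: disable triggers per table around the load, or flip session_replication_role for the whole session (superuser-only on older PostgreSQL releases). A psycopg2 sketch:

    import psycopg2

    conn = psycopg2.connect("dbname=test")  # connection string is an assumption

    with conn, conn.cursor() as cur:
        # Session-wide: skips every trigger not marked ENABLE REPLICA / ALWAYS.
        cur.execute("SET session_replication_role = replica")
        # ... run the bulk load here ...
        cur.execute("SET session_replication_role = DEFAULT")

    # Per-table alternative (run as the table owner):
    #   ALTER TABLE my_table DISABLE TRIGGER USER;
    #   ... load ...
    #   ALTER TABLE my_table ENABLE TRIGGER USER;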
194
votes
10 answers

mongodb: insert if not exists

Every day, I receive a stock of documents (an update). What I want to do is insert each item that does not already exist. I also want to keep track of the first time I inserted them, and the last time I saw them in an update. I don't want to have…
LeMiz
  • 5,554
  • 5
  • 28
  • 23
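
This maps directly onto an upsert: $setOnInsert writes first_seen only when the document is created, while $set refreshes last_seen on every update. A pymongo sketch with hypothetical database, collection, and field names:

    from datetime import datetime, timezone
    from pymongo import MongoClient, UpdateOne

    coll = MongoClient()["stockdb"]["documents"]  # names are assumptions
    now = datetime.now(timezone.utc)
    batch = [{"_id": "doc-1", "price": 10.5}, {"_id": "doc-2", "price": 7.25}]

    ops = []
    for doc in batch:
        payload = {k: v for k, v in doc.items() if k != "_id"}
        ops.append(UpdateOne(
            {"_id": doc["_id"]},
            {"$set": {**payload, "last_seen": now},    # refreshed every time
             "$setOnInsert": {"first_seen": now}},     # written only on insert
            upsert=True,
        ))
    coll.bulk_write(ops)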
137
votes
14 answers

How to insert multiple rows from array using CodeIgniter framework?

I'm passing a large dataset into a MySQL table via PHP using insert commands and I'm wondering if it's possible to insert approximately 1000 rows at a time via a query other than appending each value on the end of a mile-long string and then…
toofarsideways
  • 3,956
  • 2
  • 31
  • 51
93
votes
9 answers

How can I insert many rows into a MySQL table and return the new IDs?

Normally I can insert a row into a MySQL table and get the last_insert_id back. Now, though, I want to bulk insert many rows into the table and get back an array of IDs. Does anyone know how I can do this? There are some similar questions, but they…
Peacemoon
  • 3,198
  • 4
  • 32
  • 56
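
MySQL has no multi-row RETURNING, but a single multi-row INSERT is allocated a consecutive block of AUTO_INCREMENT values when innodb_autoinc_lock_mode is 0 or 1 (the default before MySQL 8.0), and LAST_INSERT_ID() reports the first of them. A hedged sketch with mysql-connector-python:

    import mysql.connector

    conn = mysql.connector.connect(user="app", database="test")  # credentials are assumptions
    rows = [("alice",), ("bob",), ("carol",)]

    cur = conn.cursor()
    # executemany rewrites this into one multi-row INSERT statement.
    cur.executemany("INSERT INTO people (name) VALUES (%s)", rows)
    conn.commit()

    # Only valid because the single statement received a consecutive ID block
    # (innodb_autoinc_lock_mode = 0 or 1).
    first = cur.lastrowid
    new_ids = list(range(first, first + cur.rowcount))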
85
votes
4 answers

Performance of bcp/BULK INSERT vs. Table-Valued Parameters

I'm about to have to rewrite some rather old code using SQL Server's BULK INSERT command because the schema has changed, and it occurred to me that maybe I should think about switching to a stored procedure with a TVP instead, but I'm wondering what…
Aaronaught
  • 120,909
  • 25
  • 266
  • 342
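
For anyone weighing the two, the TVP side consists of a table type, a procedure doing one set-based INSERT ... SELECT, and the client sending all rows as a single structured parameter. A sketch using pyodbc, which accepts a list of tuples for a TVP parameter (4.0.25+); every name below is hypothetical:

    import pyodbc

    conn = pyodbc.connect("DSN=mssql")  # DSN is an assumption
    cur = conn.cursor()

    cur.execute("CREATE TYPE dbo.EmployeeTVP AS TABLE "
                "(Name varchar(50), Address varchar(50));")
    cur.execute("CREATE PROCEDURE dbo.usp_InsertEmployees "
                "@rows dbo.EmployeeTVP READONLY AS "
                "INSERT INTO dbo.Employee (Name, Address) "
                "SELECT Name, Address FROM @rows;")
    conn.commit()

    rows = [("Ann", "1 Main St"), ("Bob", "2 Oak Ave")]
    # The whole list travels as one table-valued parameter.
    cur.execute("{CALL dbo.usp_InsertEmployees (?)}", (rows,))
    conn.commit()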
82
votes
9 answers

BULK INSERT with identity (auto-increment) column

I am trying to bulk load data into the database from a CSV file. The Employee table has an auto-incremented ID column (PK). CREATE TABLE [dbo].[Employee]( [id] [int] IDENTITY(1,1) NOT NULL, [Name] [varchar](50) NULL, [Address] [varchar](50) NULL ) ON…
Abhi
  • 1,963
  • 7
  • 27
  • 33
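
Two common routes when the file has no ID column: BULK INSERT into a view that omits the identity column, or a staging table plus INSERT ... SELECT. A sketch of the view route (file path and view name are assumptions):

    import pyodbc

    conn = pyodbc.connect("DSN=mssql")  # DSN is an assumption

    ddl = """
    CREATE VIEW dbo.EmployeeLoad AS
        SELECT Name, Address FROM dbo.Employee;  -- identity column [id] omitted
    """
    load = """
    BULK INSERT dbo.EmployeeLoad
    FROM 'C:\\data\\employees.csv'               -- hypothetical path
    WITH (FIELDTERMINATOR = ',', ROWTERMINATOR = '\\n', FIRSTROW = 2);
    """
    with conn:
        conn.execute(ddl)
        conn.execute(load)
    # IDENTITY(1,1) generates [id] for every loaded row.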
73
votes
12 answers

Bulk Insert Partially Quoted CSV File in SQL Server

I'm trying to import a correctly quoted CSV file, meaning data is only quoted if it contains a comma, e.g.: 41, Terminator, Black 42, "Monsters, Inc.", Blue I observe that the first row imports correctly, but the second row errors in a manner that…
Eric J.
  • 147,927
  • 63
  • 340
  • 553
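
This is a classic pain point: pre-2017 BULK INSERT has no notion of quoting, so "Monsters, Inc." splits on its embedded comma. On SQL Server 2017+ the FORMAT and FIELDQUOTE options parse it correctly; older versions need a format file or pre-processing. A sketch (table and path are assumptions):

    import pyodbc

    conn = pyodbc.connect("DSN=mssql")  # DSN is an assumption

    sql = """
    BULK INSERT dbo.Films              -- hypothetical table
    FROM 'C:\\data\\films.csv'         -- hypothetical path
    WITH (
        FORMAT = 'CSV',      -- RFC 4180 parsing, SQL Server 2017+
        FIELDQUOTE = '"',    -- fields are quoted only when they contain commas
        ROWTERMINATOR = '\\n'
    );
    """
    with conn:
        conn.execute(sql)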
55
votes
8 answers

Writing large number of records (bulk insert) to Access in .NET/C#

What is the best way to perform bulk inserts into an MS Access database from .NET? Using ADO.NET, it is taking way over an hour to write out a large dataset. Note that my original post, before I "refactored" it, had both the question and answer in…
Marc Meketon
  • 2,463
  • 1
  • 24
  • 21
52
votes
7 answers

Accelerate bulk insert using Django's ORM?

I'm planning to upload a billion records taken from ~750 files (each ~250 MB) to a database using Django's ORM. Currently each file takes ~20 min to process, and I was wondering if there's any way to accelerate this process. I've taken the following…
Jonathan Livni
  • 101,334
  • 104
  • 266
  • 359
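
The single biggest win is usually bulk_create, which turns a million individual saves into a handful of multi-row INSERTs; wrapping each file in one transaction also helps. A sketch assuming a hypothetical Record model with a value field:

    from django.db import transaction
    from myapp.models import Record  # hypothetical app and model

    def load_file(path, chunk=10_000):
        with open(path) as fh, transaction.atomic():
            batch = []
            for line in fh:
                batch.append(Record(value=line.strip()))
                if len(batch) >= chunk:
                    Record.objects.bulk_create(batch, batch_size=chunk)
                    batch = []
            if batch:
                Record.objects.bulk_create(batch, batch_size=chunk)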
50
votes
5 answers

Bulk Insertion on Android device

I want to bulk insert about 700 records into the Android database on my next upgrade. What's the most efficient way to do this? From various posts, I know that if I use Insert statements, I should wrap them in a transaction. There's also a post…
Ron Romero
  • 9,211
  • 8
  • 43
  • 64
47
votes
6 answers

How to speed up bulk insert to MS SQL Server using pyodbc

Below is my code that I'd like some help with. I have to run it over 1,300,000 rows, which means it takes up to 40 minutes to insert ~300,000 rows. I figure bulk insert is the route to go to speed it up? Or is it because I'm iterating over the…
TangoAlee
  • 1,260
  • 2
  • 13
  • 34
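
Per-row execute() round trips are usually the bottleneck here; pyodbc's fast_executemany flag (4.0.19+, with Microsoft's ODBC driver) binds the whole parameter array in one trip. A sketch with connection details and table name as assumptions:

    import pyodbc

    conn = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes"
    )  # connection details are assumptions

    rows = [(i, f"name-{i}") for i in range(300_000)]

    cur = conn.cursor()
    cur.fast_executemany = True  # array binding instead of per-row round trips
    cur.executemany("INSERT INTO dbo.Target (id, name) VALUES (?, ?)", rows)
    conn.commit()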
40
votes
2 answers

Use binary COPY table FROM with psycopg2

I have tens of millions of rows to transfer from multidimensional array files into a PostgreSQL database. My tools are Python and psycopg2. The most efficient way to bulk insert data is using copy_from. However, my data are mostly 32-bit floating…
Mike T
  • 41,085
  • 18
  • 152
  • 203
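
Binary COPY skips the float-to-text round trip entirely. The stream is a fixed 19-byte header, then one tuple per row (an int16 field count, each field as an int32 byte length plus big-endian payload), then an int16 -1 trailer. A sketch for a hypothetical table with two real (float4) columns:

    import io
    import struct
    import psycopg2

    conn = psycopg2.connect("dbname=test")  # connection string is an assumption
    rows = [(1.5, 2.5), (3.25, 4.75)]

    buf = io.BytesIO()
    buf.write(b"PGCOPY\n\xff\r\n\x00")   # 11-byte signature
    buf.write(struct.pack("!ii", 0, 0))  # flags, header-extension length

    for x, y in rows:
        # 2 fields; each field: 4-byte length, then a big-endian float4
        buf.write(struct.pack("!hifif", 2, 4, x, 4, y))

    buf.write(struct.pack("!h", -1))     # end-of-data marker
    buf.seek(0)

    with conn, conn.cursor() as cur:
        cur.copy_expert("COPY num_data (x, y) FROM STDIN WITH BINARY", buf)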