Is there a way to optimize the insertion of a lot of data into an empty CockroachDB table?
To optimize inserting data into CockroachDB tables, there are a few pieces of guidance:
- Create the table without any secondary indexes, insert your data, and then add any secondary indexes you want.
- Insert 500 rows per INSERT statement (see the sketch after this list). That number might vary a bit depending on the size of your rows, but it's a good guideline for optimizing the speed at which you can write data.
- Use the IMPORT statement to bulk import CSV files into a single table. This is the fastest way to get data into CockroachDB.
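A minimal sketch of the points above, assuming a hypothetical users table (the column names, index, and CSV location are made up, and a real load would batch roughly 500 rows per INSERT rather than the three shown):

```sql
-- Create the table with only its primary key; hold off on secondary indexes.
CREATE TABLE users (
    id    UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    email STRING NOT NULL,
    name  STRING
);

-- Load data in multi-row INSERTs, around 500 rows per statement
-- (only three rows shown here to keep the example short).
INSERT INTO users (email, name) VALUES
    ('a@example.com', 'Alice'),
    ('b@example.com', 'Bob'),
    ('c@example.com', 'Carol');

-- Or bulk-load a CSV file with IMPORT (the fastest path); the file
-- location here is a placeholder for wherever your cluster can read it.
IMPORT INTO users (email, name) CSV DATA ('nodelocal://1/users.csv');

-- Only after the bulk load is done, add the secondary indexes you need.
CREATE INDEX users_email_idx ON users (email);
```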
If you're moving from PostgreSQL to CockroachDB, you can also use pg_dump to create a COPY statement, which CockroachDB is optimized to ingest. It's a slightly more involved process, but you can find the details about how to do it in CockroachDB's import documentation.
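A rough sketch of that workflow, assuming a hypothetical users table; the file names, storage location, and IMPORT variant reflect the CockroachDB versions current when this answer was written, so check the import docs for the exact syntax your version supports:

```sql
-- 1. On the PostgreSQL side, dump the table as COPY-format statements
--    (hypothetical database/table names):
--      pg_dump --table=users mydb > users.sql
--
-- 2. Put users.sql somewhere the cluster can read it, e.g. a node's
--    extern directory for nodelocal, or cloud storage.
--
-- 3. Ingest it. IMPORT ... PGDUMP was the documented path at the time;
--    newer versions may steer you toward other tools.
IMPORT PGDUMP 'nodelocal://1/users.sql'
    WITH ignore_unsupported_statements;
```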

Alex Robinson
Using batches of 1000 rows, I'm still seeing about 1-2 transactions per second. I do have secondary indexes, but is this the expected speed? (3 nodes) – remram Nov 02 '22 at 19:06