
I'm trying to insert a big array of data into a MySQL database with node.js. My code works correctly with a small array of about 300 elements, but when I insert an array with 1M elements I get the following error:

Error: read ECONNRESET
    at TCP.onStreamRead (node:internal/stream_base_commons:211:20)
    at Socket.emit (node:events:379:20)
    at addChunk (node:internal/streams/readable:313:12) {
  errno: -4077,
  code: 'ECONNRESET',
  syscall: 'read',
  fatal: true
}

My query:

let query = "INSERT INTO transactions (customer_id, tr_date, tr_time, mcc_code, tr_type, amount, term_id) VALUES ?";
connection.query(query, [csvData], (error, response) => {
  console.log(error || response);
});
grishabobr

2 Answers


See here for more about the error itself. It's simply a connection error.

The connection most likely fails because that insert is huge and takes a long time to complete. On top of that, the amount of data being sent over the network is considerable, which further increases how long the whole operation takes.

The solution to your problem is, IMO, to split the data into multiple smaller inserts (try batches of 500, 1k, or 5k, depending on the size of each row).
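A minimal sketch of that batching approach, assuming the callback-style API of the `mysql` package and the `connection` object from the question (`chunk` and `insertInBatches` are hypothetical helper names, not part of any library):

```javascript
// Split an array of rows into batches of at most `size` rows.
function chunk(rows, size) {
  const batches = [];
  for (let i = 0; i < rows.length; i += size) {
    batches.push(rows.slice(i, i + size));
  }
  return batches;
}

// Insert the rows one batch at a time, waiting for each batch to
// finish before sending the next, so no single query is huge.
async function insertInBatches(connection, rows, batchSize = 5000) {
  const query =
    "INSERT INTO transactions (customer_id, tr_date, tr_time, mcc_code, tr_type, amount, term_id) VALUES ?";
  for (const batch of chunk(rows, batchSize)) {
    // Wrap the callback-style query in a promise so we can await it.
    await new Promise((resolve, reject) => {
      connection.query(query, [batch], (error, response) =>
        error ? reject(error) : resolve(response)
      );
    });
  }
}
```

You could then call `insertInBatches(connection, csvData)` instead of the single `connection.query` call, and tune `batchSize` to whatever your server handles comfortably.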

Diogo Simões

Try downgrading to Node v14. It may solve the problem.

    Downgrading Node to an older version can hardly be recommended as a good solution, especially with Node 14's maintenance mode ending just over a year from now (2022-02) – Simas Joneliunas Feb 20 '22 at 03:09