
I need to migrate ~2.5M rows from MySQL to Firebase and I'm looking for an effective way to do this using Node.js.

In the MySQL database some of the columns are of BLOB type, so I need to convert them to text. I am using the CONVERT function, so a sample query would be something like this:

SELECT 
    FLOOR((id >> 8) / 1000) AS 'createdAt',
    CONVERT(log_1 USING utf8mb4) AS 'log_1',
    CONVERT(log_2 USING utf8mb4) AS 'log_2',
    direction AS 'direction'
FROM
    table;

And here is my first question: is there a way to get the data in batches over a few iterations instead of taking all 2.5M rows in one query? Something like Skip/Take in MSSQL?

Having all the data from MySQL, I need to transform it into some structure, group it by some property, and eventually store it in Firebase. What is the most effective way to do that? I guess a plain for loop saving doc by doc with a set call won't be efficient:

await fb.collection('test').doc().set(chatCollection[i])
camel

1 Answer


I'd just run the SQL query you need and export all the data into a file (say, CSV). That way you'll have fewer things to care about if they break.

Then read the file in batches and do a batched insert into Firestore.
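A minimal sketch of the batched-write side, assuming the Firebase Admin SDK and an already-parsed array of row objects; the chunkRows helper and the 'test' collection name are illustrative, while 500 is Firestore's documented per-batch write limit:

```javascript
// Split an array of rows into fixed-size chunks (500 is Firestore's
// maximum number of writes per WriteBatch).
function chunkRows(rows, size = 500) {
  const chunks = [];
  for (let i = 0; i < rows.length; i += size) {
    chunks.push(rows.slice(i, i + size));
  }
  return chunks;
}

// Hypothetical usage with the Firebase Admin SDK:
// const admin = require('firebase-admin');
// admin.initializeApp();
// const fb = admin.firestore();
async function importRows(fb, rows) {
  for (const chunk of chunkRows(rows)) {
    const batch = fb.batch();
    for (const row of chunk) {
      batch.set(fb.collection('test').doc(), row); // auto-generated doc ID
    }
    await batch.commit(); // one round trip per up to 500 documents
  }
}
```

Compared to calling set() per document, this cuts the number of round trips to Firestore by up to 500x.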

https://firebase.google.com/docs/firestore/manage-data/transactions

If you have files in SQL, store them in blob storage and keep a link to them from your Firebase documents.
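A sketch of that file-offloading idea, assuming @google-cloud/storage for the blob side; the bucket, path, and column names are illustrative, and docWithFileLink is a hypothetical helper that swaps the raw blob for its URL:

```javascript
// Build the Firestore document payload: the raw BLOB column is dropped
// and only a URL pointing at the uploaded file is kept on the document.
function docWithFileLink(row, fileUrl) {
  const { log_1, ...rest } = row; // drop the raw blob column
  return { ...rest, logUrl: fileUrl };
}

// Hypothetical upload with @google-cloud/storage:
// const { Storage } = require('@google-cloud/storage');
// const bucket = new Storage().bucket('my-migration-bucket');
async function uploadAndLink(bucket, fb, row, id) {
  const file = bucket.file(`logs/${id}.txt`);
  await file.save(row.log_1); // write the blob contents to storage
  const url = `https://storage.googleapis.com/${bucket.name}/logs/${id}.txt`;
  await fb.collection('test').doc(String(id)).set(docWithFileLink(row, url));
}
```

Keeping large blobs out of Firestore documents also keeps you clear of the 1 MiB per-document size limit.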

2.5 million records is nothing huge, so this should be pretty fast to do :). Good luck!

Dan Dinu