9

I am trying to import data from a MySQL dump (.sql) file into MongoDB, but I could not find any built-in mechanism for RDBMS-to-NoSQL data migration.
I have tried converting the data into JSON and CSV, but neither gives me the desired output in MongoDB.
I also considered Apache Sqoop, but it is mostly for moving data between SQL/NoSQL stores and Hadoop.
I could not work out how data can be migrated from MySQL to MongoDB.
Is there any approach apart from what I have tried so far?
I am hoping for a better and faster solution for this type of migration.

Jaffer Wilson

2 Answers

7

I suggest you dump the MySQL data to a CSV file. You can try other file formats, but make sure the format is one you can import into MongoDB easily; both MongoDB and MySQL support CSV very well.

You can use mysqldump or the SELECT ... INTO OUTFILE statement to export the MySQL data. Using mysqldump may take a long time, so have a look at How can I optimize a mysqldump of a large database?.
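For example, a minimal sketch of both export options (the database, table, user, and file names here are placeholders, not from the question):

```shell
# Option 1: dump the whole database as SQL (can be slow for large datasets)
mysqldump -u root -p mydb > mydb.sql

# Option 2: export a single table directly to CSV on the MySQL server host
mysql -u root -p mydb -e "
  SELECT * FROM mytable
  INTO OUTFILE '/tmp/mytable.csv'
  FIELDS TERMINATED BY ',' ENCLOSED BY '\"'
  LINES TERMINATED BY '\n';"
```

Note that INTO OUTFILE writes on the server's filesystem and requires the FILE privilege.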

Then use the mongoimport tool to import the data.
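A minimal import of the CSV produced above might look like this (database, collection, and file names are placeholders):

```shell
# --headerline takes the field names from the first row of the CSV
mongoimport --db mydb --collection mytable \
  --type csv --headerline --file /tmp/mytable.csv
```

If the CSV has no header row, pass the field names explicitly with --fields instead of --headerline.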

As far as I know, there are three ways to speed up this import:

  • `mongoimport --numInsertionWorkers N` starts several insertion workers; N can be the number of CPU cores.

  • `mongod --nojournal` Most of the continuous disk usage comes from the journal, so temporarily disabling journaling during the bulk load might be a good optimization (at the cost of durability if the server crashes).

  • Split your input file and run parallel import jobs.
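The third option can be sketched like this (file, database, and field names are placeholders; `split` is a standard Unix tool):

```shell
# Split a large CSV into 4 roughly equal pieces by line count
total=$(wc -l < /tmp/mytable.csv)
lines=$(( (total + 3) / 4 ))
split -l "$lines" /tmp/mytable.csv /tmp/part_

# Launch one mongoimport per piece in parallel, then wait for all of them
for f in /tmp/part_*; do
  mongoimport --db mydb --collection mytable --type csv \
    --fields "col1,col2,col3" --file "$f" &
done
wait
```

Here --fields is used instead of --headerline because only the first piece would contain the header row.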

Actually, in my opinion, importing and exporting the data is not the difficult part. Your dataset seems to be large, so if you do not design your document structure, queries will still be slow. A blind, automatic migration from a relational database to MongoDB is not recommended; the resulting database performance might not be good.

So it is worth designing your data structure; you can check out Data models.

Hope this helps.

McGrady
  • Not a workable solution. My data is complex and cannot be exported to CSV. I tried it many times. – Jaffer Wilson Feb 08 '17 at 08:33
  • 1
    @JafferWilson What does `cannot be exported to csv` mean? Please elaborate. – McGrady Feb 08 '17 at 09:30
  • Actually the data is something like this `data:[app:[{key1:value,key2:value}]]`, so now you might understand the problem with CSV: it does not handle nested data like mine. So JSON was better, but MongoDB did not accept it. – Jaffer Wilson Feb 08 '17 at 09:34
    @JafferWilson It seems that you should dump the database data to JSON; MongoDB accepts JSON. Have a look at [jsonArray](https://docs.mongodb.com/manual/reference/program/mongoimport/#cmdoption--jsonArray), and make sure the size of each document in the collection is less than 16MB. – McGrady Feb 08 '17 at 10:16
  • Thank you for your advice, but I had tried everything before writing this question. Never mind; it seems there is no solution apart from these things. – Jaffer Wilson Feb 08 '17 at 10:19
2

You can use Mongify, which helps you move/migrate data from SQL-based systems to MongoDB. It supports MySQL, PostgreSQL, SQLite, Oracle, SQL Server, and DB2.

It requires Ruby and RubyGems as prerequisites. Refer to this documentation to install and configure Mongify.
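Roughly, the Mongify workflow looks like the sketch below (the `database.config` contents are an illustrative assumption; check the Mongify documentation for the exact options your versions support):

```shell
gem install mongify

# database.config describes the source (MySQL) and target (MongoDB), e.g.:
#   sql_connection do
#     adapter  "mysql"
#     host     "localhost"
#     username "root"
#     password "secret"
#     database "mydb"
#   end
#   mongodb_connection do
#     host     "localhost"
#     database "mydb"
#   end

mongify check database.config                          # verify both connections
mongify translation database.config > translation.rb  # generate a translation file
mongify process database.config translation.rb        # run the migration
```

The generated `translation.rb` is where you rename columns, embed child tables into parent documents, and so on, before running `process`.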

franklinsijo
  • When it comes to `ruby`, I think it will be damn slow in execution. Can you tell me how long it takes to import GBs of data? – Jaffer Wilson Feb 08 '17 at 06:07