
I am importing a large application to Docker. Part of the application is a database. I have dumped the database into a .sql file and now I am trying to import it into a Docker container running the official mysql image by mounting a directory from the host machine and issuing the command

mysql -u myUsername -p myDB < /mountdir/databasedump.sql
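For context, this is roughly the setup being described (the container name, host path, and password below are placeholders, not the real values):

# start the official mysql image with the dump directory bind-mounted from the host
docker run --name mydb -e MYSQL_ROOT_PASSWORD=secret \
  -v /host/dumps:/mountdir -d mysql

# run the import from inside the container
docker exec -it mydb sh -c 'mysql -u myUsername -p myDB < /mountdir/databasedump.sql'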

The database dump is very large, more than 10 GB. Everything goes well for an hour, but then it fails with the error

loop: Write error at byte offset

I have a feeling that the container is running out of space.

Is there a smarter way of dockerizing the database? If not, how can I import this enormous database into the container?

2 Answers


Don't use a devicemapper loop file for anything serious! Docker has warnings about this.

For a database that size, you may want to try mounting a host directory as a volume or creating a local volume, to avoid Docker's copy-on-write filesystem overhead. The two are essentially the same thing underneath.
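For example (the host path, volume name, and password here are illustrative):

# option 1: bind-mount a host directory over MySQL's data directory
docker run -v /srv/mysql-data:/var/lib/mysql -e MYSQL_ROOT_PASSWORD=secret -d mysql

# option 2: create a named local volume and mount that instead
docker volume create mydb-data
docker run -v mydb-data:/var/lib/mysql -e MYSQL_ROOT_PASSWORD=secret -d mysql

Either way the data lives on the host filesystem rather than in the container's copy-on-write layer, so the devicemapper size limit no longer applies to the database files.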

If you are on RHEL, you can use an adequately sized LVM thin pool directly. There's a process in this answer for changing storage drivers.
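A sketch of what the direct-lvm configuration might look like in /etc/docker/daemon.json; the thin pool device name here is an assumption and must match a pool you have already created with LVM:

{
  "storage-driver": "devicemapper",
  "storage-opts": [
    "dm.thinpooldev=/dev/mapper/docker-thinpool",
    "dm.use_deferred_removal=true"
  ]
}

Keep in mind that switching storage drivers effectively resets /var/lib/docker, so existing images and containers have to be re-pulled or re-created.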

– Matt

The problem was that the container couldn't store the whole imported database, which was 26 GB; the error occurred because the container ran out of disk space.

I solved the problem by mounting a directory from the host as an external volume using the -v switch and editing the MySQL config to store its databases there.
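Roughly what that looks like; with the official image you can also just mount the host directory over the default datadir, /var/lib/mysql, instead of editing the config (the host paths here are examples):

# simplest form: mount the host directory over MySQL's default data directory
docker run --name mydb \
  -v /srv/mysql-data:/var/lib/mysql \
  -e MYSQL_ROOT_PASSWORD=secret -d mysql

# alternative with an edited config: point datadir at the mount
# /srv/conf/datadir.cnf on the host contains:
#   [mysqld]
#   datadir = /data
docker run --name mydb \
  -v /srv/mysql-data:/data \
  -v /srv/conf/datadir.cnf:/etc/mysql/conf.d/datadir.cnf \
  -e MYSQL_ROOT_PASSWORD=secret -d mysql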

This solution of course 'un-virtualizes' the database files, which might be a security risk; the database server itself still runs fine in a container. In my situation the slightly weakened security wasn't an issue.

  • if you want the whole database inside the container, you just have to modify the Docker daemon startup parameters and add the following: `--storage-opt dm.basesize=G`; the default is 10 GB – Dimitrie Mititelu Jun 26 '16 at 11:03
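For what it's worth, that option is passed to the daemon, not to docker run; a sketch with an example size (25G is an assumption, pick something larger than your database):

# either on the daemon command line...
dockerd --storage-opt dm.basesize=25G

# ...or in /etc/docker/daemon.json
{
  "storage-opts": ["dm.basesize=25G"]
}

Note that a larger base size only applies to images and containers created after the daemon is restarted.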