This question is similar to:
Setting up MySQL and importing dump within Dockerfile
However, the answer to that question does not address my use case.
I have a MySQL database with 5TB of data in production, but for development I only need about 500MB of it. The integration tests that run as part of my application's build require access to a MySQL database. Currently, that database is created on Jenkins and the data is injected into it by the build process, which is very slow.
I would like to replace this part of the process with Docker. My idea is to have a Docker image that runs MySQL and already has my 500MB of data baked in, rather than relying on the official MySQL image's standard behaviour of running the import only when the container launches. In my tests so far, that standard process takes 4 to 5 minutes, whereas I would like to get it down to seconds.
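For reference, this is roughly what I have in mind (an untested sketch against the official mysql:8.0 image; dev-data.sql and /var/lib/mysql-baked are placeholder names):

```dockerfile
FROM mysql:8.0

# Move the data directory off /var/lib/mysql: that path is declared as a VOLUME in the
# base image, so data written to it during the build may not survive into the image.
RUN printf '[mysqld]\ndatadir=/var/lib/mysql-baked\n' > /etc/mysql/conf.d/baked.cnf \
 && mkdir -p /var/lib/mysql-baked \
 && chown -R mysql:mysql /var/lib/mysql-baked

# dev-data.sql stands in for my ~500MB development dump
COPY dev-data.sql /tmp/dev-data.sql

# Initialise the data directory and load the dump at *build* time, then shut the server
# down cleanly, so the finished image already contains the data.
RUN set -ex; \
    mysqld --initialize-insecure --user=mysql; \
    mysqld --user=mysql & \
    until mysqladmin ping --silent; do sleep 1; done; \
    # assumes the dump contains its own CREATE DATABASE / USE statements
    mysql -uroot < /tmp/dev-data.sql; \
    mysqladmin -uroot shutdown; \
    wait; \
    rm /tmp/dev-data.sql
```

After `docker build -t test-db .`, a plain `docker run -d test-db` should then serve the baked data without re-running the import. Note that --initialize-insecure creates only root@localhost with an empty password, so this would be for dev/test images only, and any users my tests connect as over the network would have to come from the dump itself.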
I would have thought this was a common use case, but pre-baking data into a MySQL Docker image seems to be frowned upon, and there isn't much guidance on how to do it.
Does anyone have experience with this approach? Is there a good reason why data should not be pre-baked into a MySQL Docker image?