
I'm new to Google Cloud and Docker, and I can't for the life of me figure out how to copy directories from a Docker container (pushed to the Container Registry) to a Google Compute Engine instance. I think I need to mount a volume, but I don't really know how. Inside the Docker container, the main directory is /app, which holds my files. Basically, I want to do this so I can see the Docker container's files in Google Cloud.

I assumed that if I ran `docker pull [HOSTNAME]/[PROJECT-ID]/[IMAGE]:[TAG]` inside the Cloud Shell, the files would show up somewhere in the Cloud Shell, e.g. in /var/lib/docker, but when I cd into /var/lib/docker and run `ls` I get

ls: cannot open directory '.': Permission denied

Just to add: I've tried following the "Connecting to Cloud Storage buckets" tutorial (https://cloud.google.com/compute/docs/disks/gcs-buckets), but realised that it covers single files. Is it possible to copy over the whole root directory of the Docker image using gsutil? Do I need to use something else instead, like persistent disks?
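(For what it's worth, `gsutil cp` does handle whole directories when given the `-r` flag — a minimal sketch, assuming a hypothetical bucket named `my-bucket` and a local directory `./app`:)

```shell
# Recursively copy a directory tree into a bucket; -m parallelizes
# the transfer, which helps when the directory contains many files.
# Bucket and path names here are placeholders.
gsutil -m cp -r ./app gs://my-bucket/app
```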

kenshima
  • `docker cp CONTAINER:SRC_PATH DEST_PATH` https://docs.docker.com/engine/reference/commandline/cp/ – John Hanley Sep 30 '20 at 01:59
  • Perfect answer from @JohnHanley!! You need to have Docker installed on your Compute Engine instance. Use a COS boot image on your instance; it's easier! – guillaume blaquiere Sep 30 '20 at 08:08
  • Thank you @John Hanley will try that. Also do you by any chance know where to store the keyfile to be used to authenticate Docker? It seems a bit strange to store it in the Docker container itself but I don't know how else to do it. – kenshima Oct 01 '20 at 23:38
  • @guillaume blaquiere thank you. I am using a COS image right now. – kenshima Oct 01 '20 at 23:40
  • What do you mean by `authenticate Docker`? Do you mean authenticate to Google Container Registry or something else? If you mean to provide a service account to the container, typically you would use Docker Volumes https://docs.docker.com/storage/volumes/ For Compute Engine COS look at `--container-mount-host-path` https://cloud.google.com/sdk/gcloud/reference/compute/instances/create-with-container#--container-mount-host-path AND set the environment variable `GOOGLE_APPLICATION_CREDENTIALS` when launching the instance using the `--container-env=` option. – John Hanley Oct 01 '20 at 23:59
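The setup described in the comment above can be sketched roughly as follows; the project, instance, image, and paths are all placeholders, and the keyfile lives on the host rather than being baked into the image:

```shell
# Launch a COS instance running the container, mount a host directory
# holding the service-account keyfile into the container read-only,
# and point GOOGLE_APPLICATION_CREDENTIALS at the mounted file.
gcloud compute instances create-with-container my-instance \
  --container-image=gcr.io/my-project/my-image:latest \
  --container-mount-host-path=mount-path=/secrets,host-path=/var/keys,mode=ro \
  --container-env=GOOGLE_APPLICATION_CREDENTIALS=/secrets/key.json
```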

1 Answer


You need to have Docker installed in order to run your images and, of course, to be able to copy anything from inside an image to your host filesystem.

Use `docker cp CONTAINER:SRC_PATH DEST_PATH` to copy files.
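For the /app directory from the question, a minimal sketch (image path and container name are placeholders; note that `docker cp` works on containers, not images, so create a stopped container first):

```shell
# Pull the image from Container Registry (assumes you've already run
# `gcloud auth configure-docker` so Docker can authenticate).
docker pull gcr.io/my-project/my-image:latest

# Create a stopped container so there's a filesystem to copy from.
docker create --name temp gcr.io/my-project/my-image:latest

# Copy the /app directory out of the container to the host, then clean up.
docker cp temp:/app ./app
docker rm temp
```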

Have a look at the official Docker documentation on how to use this command. A similar topic was also discussed on Stack Overflow and has a very good answer.

Wojtek_B
  • Thank you for this. I'm just wondering how I'd copy over the whole Docker filesystem? Is that something that could even be done on a GCE instance? The reason I want to do that is so the Docker directories are accessible from the Cloud Shell. Also, when setting up authentication, where would I store the keyfile to be used in Google Compute Engine? I think it's a bit counterintuitive to store it in the Docker container itself. – kenshima Oct 01 '20 at 23:41
  • You can copy almost everything but it depends on what you're running in a container. When I ran `docker cp container:/ .` I got an error but was able to copy most files with directory structure. Only those that are in use may not get copied or will be outdated/corrupted. – Wojtek_B Oct 02 '20 at 08:20
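The partial-copy issue described in the last comment can be sidestepped with `docker export`, which streams the container's entire root filesystem as a single tar archive (the container name below is a placeholder):

```shell
# Export the container's whole filesystem to a tarball in one pass,
# then unpack it into a local directory for inspection.
docker export my-container -o rootfs.tar
mkdir -p rootfs
tar -xf rootfs.tar -C rootfs
```

Unlike `docker cp container:/ .`, this captures the full tree atomically, though it still reflects whatever state the files were in when the export ran.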