
My company is using Bitbucket Pipelines alongside Docker to build an image of the application before deploying it.

When reading the Dockerfile, I can see these lines:

(a number of lines omitted)
RUN yarn install
RUN yarn run my-script
COPY . .

and in the pipeline, I see this command:

docker build .

My question is: is there any reason why I should put RUN yarn install and RUN yarn run my-script in the Dockerfile rather than in the pipeline?

My reasoning for putting them in the pipeline is that you can utilise Bitbucket Pipelines' cache, and also reduce the size of the Docker image (I understand that fewer layers mean a smaller image).
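To illustrate the pipeline-side approach I have in mind, here is a minimal sketch of a bitbucket-pipelines.yml. The node:18 image, step name, and my-script are illustrative assumptions, not taken from our actual setup:

```yaml
# Hypothetical bitbucket-pipelines.yml: run yarn in the pipeline,
# then build the image from the already-built artifacts.
image: node:18

pipelines:
  default:
    - step:
        name: Build and package
        caches:
          - node            # Bitbucket's predefined cache for node_modules
        services:
          - docker          # needed to run docker build in the step
        script:
          - yarn install
          - yarn run my-script
          - docker build .
```

With this layout, yarn install benefits from the pipeline's node cache, and the Dockerfile only has to copy in the finished output.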

Tree Nguyen
  • It depends. Is that very Docker image deployed to a Docker orchestrator, where it is what serves the application? Or is it just used to build a distribution of a site, which is then published to some kind of CDN in your deployment workflow while the Docker image is discarded? – N1ngu Jul 14 '22 at 12:03
  • If your concern is taking advantage of Bitbucket Pipelines caches, know that using external volumes during the build phase is a complex and evolving topic. See https://stackoverflow.com/q/26050899/11715259 – N1ngu Jul 14 '22 at 12:07
  • Actually, I think this might be a duplicate for https://stackoverflow.com/q/71564600/11715259 depending on the answers to my questions. – N1ngu Jul 14 '22 at 12:12
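On the caching point raised in the comments, one middle ground worth noting: BuildKit cache mounts can persist Yarn's download cache between docker build runs without pipeline-level volumes. A sketch, assuming BuildKit is available on the build agent and Yarn v1's default cache path (both are assumptions about the environment):

```dockerfile
# syntax=docker/dockerfile:1
FROM node:18
WORKDIR /app

# Copy manifests first so the install layer is cached by content.
COPY package.json yarn.lock ./

# Persist Yarn's download cache across builds (BuildKit only).
RUN --mount=type=cache,target=/usr/local/share/.cache/yarn \
    yarn install

COPY . .
RUN yarn run my-script
```

The cache mount exists only at build time, so it speeds up rebuilds without ending up in the final image.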

1 Answer


I have seen both setups in various projects. Running yarn install within the Docker environment would still be my preferred choice because of the separation of concerns it provides. Imagine having a team of many people: with yarn install set up inside Docker, developers don't have to worry about the Node/npm environment on their own machines. Ideally, if you are utilising Docker, the local machine should not need anything but Docker. The same goes for the pipeline: if you ever decide to move away from Bitbucket Pipelines, having everything contained within the Docker environment means you can just pack up and go.
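You can also keep the install inside Docker while still addressing the image-size concern from the question, by using a multi-stage build. A sketch (the node:18 base images, dist output directory, and my-script are illustrative assumptions):

```dockerfile
# ---- build stage: dependencies and build run inside Docker ----
FROM node:18 AS build
WORKDIR /app

# Copy manifests first so Docker's layer cache skips yarn install
# when only application code (not dependencies) has changed.
COPY package.json yarn.lock ./
RUN yarn install

COPY . .
RUN yarn run my-script

# ---- runtime stage: only the build output is kept ----
FROM node:18-slim
WORKDIR /app
COPY --from=build /app/dist ./dist
CMD ["node", "dist/index.js"]
```

The final image contains only the runtime stage, so the yarn install layers (and node_modules, if your build output doesn't need them) never reach production.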

William Tran