
I need help caching packages in my Bitbucket pipeline that were installed via apt-get.

For packages installed by other means, you can usually find the install path documented online. However, I'm not sure which directory (or directories) to cache for apt-get installed packages.

For example I have the following command in my pipeline script:

 apt-get update && apt-get install -y curl unzip git

I defined a cache directory in definitions like so:

caches:
  apt-cache: /var/cache/apt

However, it's only caching 164 bytes, so I don't think it's capturing the packages that are actually installed.
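
(Note: Debian-based images such as php:8.2-fpm ship /etc/apt/apt.conf.d/docker-clean, which deletes downloaded .deb archives as soon as they are installed, which would explain why /var/cache/apt holds almost nothing. A quick check, assuming that image:)

 ls -lh /var/cache/apt/archives          # nearly empty when docker-clean is active
 cat /etc/apt/apt.conf.d/docker-clean    # the config that discards downloaded archives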

Is there a way to find where these packages are installed so I can cache them?
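
One way to see every path a package's files land in (a sketch, assuming a Debian-based image where dpkg is available):

 dpkg -L git    # lists every file the git package installed, e.g. /usr/bin/git, /usr/lib/git-core/...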

Here is my full pipeline script:

image: php:8.2-fpm

definitions:
  # set the paths for where the packages are installed that we are caching
  # these paths are used to download the packages from the cache to speed up deploys
  caches:
    install-php-extensions: /usr/local/bin/
    phpunit: web-app/vendor/bin/
    composer: /usr/local/bin
    # directory where apt package cache is
    apt-cache: /var/cache/apt
    php-extensions: /usr/lib/php/
    sonar: ~/.sonar
  steps:
    - step: &testing
        name: Test
        caches:
          - install-php-extensions
          - phpunit
          - composer
          - apt-cache
          - php-extensions
        services:
          - docker
        script:
          # Install apt packages
          - apt-get update && apt-get install -y curl unzip git

          # xdebug is needed to run the code coverage later on and to generate the code coverage report
          - pecl install xdebug-3.2.0 && echo "zend_extension=$(find /usr/local/lib/php/extensions/ -name xdebug.so)" > /usr/local/etc/php/conf.d/xdebug.ini

          # Install php extensions, set permissions to execute, required for snowflake, pdo, etc
          # The PDO installation is required later so the composer install doesn't fail with an undefined constant
          - curl -sSLf  -o /usr/local/bin/install-php-extensions https://github.com/mlocati/docker-php-extension-installer/releases/download/1.5.49/install-php-extensions
          - chmod +x /usr/local/bin/install-php-extensions
          - install-php-extensions bcmath odbc pdo_odbc soap

          # Install phpunit dependencies and run the phpunit tests with code coverage
          - curl -sS https://getcomposer.org/installer | php -- --install-dir=/usr/local/bin --filename=composer
          - cd web-app
          - composer require phpunit/phpunit --dev
          - XDEBUG_MODE=coverage vendor/bin/phpunit --testdox -d memory_limit=-1 --log-junit test-results/test-execution-results.xml --cache-result  --coverage-cache=./coverage/cache --coverage-clover=phpunit-coverage.xml tests/Unit
        artifacts:
          - test-results/test-execution-results.xml
    - step: &sonarqube
        name: Sonarqube coverage report
        caches:
          - sonar
        script:
          - cd web-app
          - pipe: sonarsource/sonarqube-scan:1.0.0
            variables:
              SONAR_HOST_URL: ${SONAR_HOST_URL}   # Get the value from the repository/workspace variable.
              SONAR_TOKEN: ${SONAR_TOKEN}
              DEBUG: "true"
    - step: &deploy
        name: Deploy
        caches:
          - docker
          - apt-cache
        services:
          - docker
        script:
          # Install apt packages
          - apt-get update && apt-get install -y unzip awscli
          - TAG=${BITBUCKET_COMMIT}

          # Set aws credentials
          - aws configure set aws_access_key_id "${AWS_ACCESS_KEY}"
          - aws configure set aws_secret_access_key "${AWS_SECRET_KEY}"
          - aws configure set region "${AWS_REGION}"

          # Get credentials for laravel from secrets manager and
          # Write to .env file
          - aws secretsmanager get-secret-value --secret-id ${ENV_SECRET_ID} --query SecretString --output text >> .env

          # Write odbc snowflake definition for connecting to database
          - aws secretsmanager get-secret-value --secret-id ${SNOWFLAKE_SECRET_ID} --query SecretString --output text > ./docker/php/snowflake/odbc.ini

          # Authenticate bitbucket-deployment user
          - aws ecr get-login-password --region us-west-2 | docker login -u AWS --password-stdin ${AWS_ACCOUNT_ID}.dkr.ecr.us-west-2.amazonaws.com

          # Build/deploy nginx image
          - NGINX_IMAGE="${AWS_ACCOUNT_ID}.dkr.ecr.us-west-2.amazonaws.com/nova/nginx"
          - docker build -f Dockerfile-nginx -t $NGINX_IMAGE .

          # Push the :latest image
          - docker push $NGINX_IMAGE:latest

          # Tag and push the image with bitbucket commit
          - docker tag $NGINX_IMAGE $NGINX_IMAGE:${BITBUCKET_COMMIT}
          - docker push $NGINX_IMAGE:${BITBUCKET_COMMIT}

          # Build/deploy php image
          - PHP_IMAGE="${AWS_ACCOUNT_ID}.dkr.ecr.us-west-2.amazonaws.com/nova/php-app"
          - docker build -f Dockerfile-php -t $PHP_IMAGE .

          # Push the :latest image
          - docker push $PHP_IMAGE:latest

          # Tag and push the image with bitbucket commit
          - docker tag $PHP_IMAGE $PHP_IMAGE:${BITBUCKET_COMMIT}
          - docker push $PHP_IMAGE:${BITBUCKET_COMMIT}

          # Start ecs migration task
          - aws ecs run-task --cluster nova-api-cluster --launch-type FARGATE --network-configuration "awsvpcConfiguration={subnets=['${PUBLIC_SUBNET_A}','${PUBLIC_SUBNET_B}'],securityGroups=['${SECURITY_GROUP}'],assignPublicIp=ENABLED}" --task-definition nova-api-migration-task

          # Force new ecs task deployment
          - aws ecs update-service --cluster ${CLUSTER_NAME} --service ${SERVICE_NAME} --region ${AWS_REGION} --force-new-deployment
    - step: &auto_merge_down
        name: Auto Merge Down
        image: atlassian/default-image:3
        script:
          - ./autoMerge.sh stage || true
          - ./autoMerge.sh dev || true

pipelines:
  branches:
    dev:
      - step:
          <<: *testing
      - step:
          <<: *deploy
          deployment: Dev
    stage:
      - step:
          <<: *testing
      - step:
          <<: *deploy
          deployment: Staging
    prod:
      - step:
          <<: *testing
      - step:
          <<: *sonarqube
      - step:
          <<: *deploy
          deployment: Production
      - step:
          <<: *auto_merge_down

1 Answer


Found another answer on the Atlassian community here: https://community.atlassian.com/t5/Bitbucket-questions/Any-way-to-cache-apt-get-install-y-zip-in-bitbucket-pipelines/qaq-p/622876 (thanks @Chase Han).

Basically, you run the following command in your pipeline script, or in a local Docker image matching the one used in your pipeline:

which <package-name-here>

e.g.

which git

It will output the path where that binary lives.

e.g. 

/usr/bin/git

Then you just need to add a cache definition for the directory that contains that binary.

e.g.

caches:
  # /usr/bin located packages like git, curl, etc.
  usr-bin: /usr/bin

And then you can reference that cache definition in your steps.
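
A minimal sketch of wiring it into a step, trimmed down from the testing step in the question (the usr-bin cache entry is the only addition):

 steps:
   - step: &testing
       name: Test
       caches:
         - usr-bin
       script:
         - apt-get update && apt-get install -y curl unzip git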

  • Sorry, I'll quote myself: _Trying to cache final installed files is useless and close to nonsense. Generally, the binary placed by a package will rely on a bunch of libraries and files of the same and other packages that will have been spread on the OS folder tree. Also, those files being present will not speed-up update nor install instructions in any way._ See https://stackoverflow.com/a/72959639/11715259 – N1ngu Jan 16 '23 at 11:52
  • I have seen performance benefits from caching, so I disagree with this – Drew Gallagher Jan 16 '23 at 16:21
  • False implies true: beware not to fool yourself with sloppy measurements. By what mechanism would the presence of `/usr/bin/git` speed up an `apt install git` instruction? What happens to the vast number of files laid under `/usr/lib/git-core/*` by the git package? What happens to the 39 apt dependencies that must be pre-installed? – N1ngu Jan 17 '23 at 17:29
  • If git is already installed, the cache holds the install files so it doesn't have to fetch and store them over the network again – Drew Gallagher Jan 17 '23 at 18:33
  • `/usr/bin/git` being present alone does not mean "git is installed". The installed files can be listed with `dpkg --listfiles git`; you surely understand most of them are not cached by your answer. `apt-get update` and `apt-get install git` will still hit the network and download the .deb artifacts irrespective of those files being present beforehand. All your premises are false; I don't want to be harsh, but you should review whatever you think you know about computers. – N1ngu Jan 18 '23 at 10:01
  • Have you read this documentation: https://support.atlassian.com/bitbucket-cloud/docs/cache-dependencies/ ? This is what I am following, and it says caching speeds up the builds – Drew Gallagher Jan 18 '23 at 14:28
  • I am very aware of that documentation. If you do it right, it speeds up the builds, yes. You are doing it wrong; read the answer I linked. – N1ngu Jan 19 '23 at 09:22