
I am seeing the Bitbucket Pipelines error "container 'docker' exceeded memory limit" while running a pipeline. I tried all of the service memory limits described in the documentation below, but the issue was not resolved.

Databases and service containers - Service memory limits

Can you help resolve the issue?
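
For reference, this is the kind of configuration I tried, following the linked documentation (a sketch; I experimented with several memory values):

definitions:
  services:
    docker:
      memory: 2048  # example value; I tried several limits per the docs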

ElasticSearchUser
  • Having the same issue, but randomly, which is so annoying. Seriously, what is Atlassian thinking? We're using a cloud build server to eliminate the limitations of hardware. – Küzdi Máté Apr 25 '20 at 18:53
  • Hi, I optimised the Dockerfile used by the pipeline; that resolved the memory issue. – ElasticSearchUser May 05 '20 at 22:21

4 Answers


I contacted Bitbucket, and they provided a solution:

  • at the beginning of the pipeline (before pipelines:)

options:
  docker: true
  size: 2x

  • at every large step:

- step:
    name: XXXX
    image: google/cloud-sdk:latest
    services:
      - docker
    size: 2x

  • at the end of the pipeline:

definitions:
  services:
    docker:
      memory: 4096
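
Putting the three fragments together, a complete bitbucket-pipelines.yml looks roughly like this (a sketch; the step name, image, and script command are placeholders):

options:
  docker: true
  size: 2x                  # 2x steps get 8192 MB of memory instead of 4096 MB

pipelines:
  default:
    - step:
        name: XXXX                      # placeholder step name
        image: google/cloud-sdk:latest
        size: 2x
        services:
          - docker
        script:
          - docker build .              # placeholder build command

definitions:
  services:
    docker:
      memory: 4096          # raise the docker service above its 1024 MB default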

Küzdi Máté
    Why would I need to add options|docker as well as services|docker? I thought options|docker was just a shorthand to provide a docker service for each step? – Markus Rohlof Dec 07 '21 at 12:55

It is because your build takes more memory than allocated. To resolve this, you need to add the following to your bitbucket-pipelines.yml:

image: .....
options:          <= Add this
  docker: true    <= Add this
  size: 2x        <= Add this
pipelines:
  branches:
    master:
      - step:
          caches:
            - ....
          services:    <= Add this
            - docker   <= Add this
definitions:       <= Add this
  services:        <= Add this
    docker:        <= Add this
      memory: 4096 <= Add this
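
If the step still dies after raising the limits, it helps to log what is actually using the memory. A rough debugging sketch (the cgroup path is an assumption; runners using cgroup v2 expose different files):

- step:
    name: Debug memory usage
    services:
      - docker
    script:
      # print approximate container memory usage every 30 seconds, in the background
      # (path is an assumption and may differ on your runner)
      - while true; do echo "MB used:" $(( $(cat /sys/fs/cgroup/memory/memory.usage_in_bytes) / 1048576 )); sleep 30; done &
      - docker build .   # placeholder for the real build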
Akshay Sharma

As said previously, you can use size: 2x on a step to increase the memory limit for that step, or set it in options, which enables 2x size for all steps automatically.

However, it is worth noting that a 2x step consumes twice the number of build minutes of a regular step, effectively costing twice as much, as described here.
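
If only one step is memory-hungry, you can keep the billing down by applying size: 2x to that step alone instead of globally (a minimal sketch; step names and commands are placeholders):

pipelines:
  default:
    - step:
        name: Test                # regular 1x step, billed normally
        script:
          - npm test
    - step:
        name: Build image         # only this step gets 2x memory and 2x billing
        size: 2x
        services:
          - docker
        script:
          - docker build .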

Aman Sanghvi

I updated the pipeline YAML to the following, and it worked:

For every large step:

  - step:
      name: 'Bitbucket pipeline test'
      services:
        - docker
      size: 2x

At the end of the pipeline:

definitions:
  services:
    docker:
      memory: 4096   # as per your requirement
alvahab