130

I'm trying to understand what eu.artifacts.%PROJECT NAME%.appspot.com is. It's currently taking up 800 MB of storage from my daily 5 GB limit. It contains only application/octet-stream files. This bucket was created automatically, and the file path is eu.artifacts....appspot.com/containers/images. The two heaviest files there weigh 200 MB and 130 MB. I tried deleting the bucket, but it was automatically created again. Users can upload pictures on my website, but the bucket holding all the user images currently only takes about 10 MB.

So my question is: what is this bucket for, and why does it weigh so much?

Frank van Puffelen
glaze
  • Having the same issue and it's starting to affect my billing. Any ideas, anyone? – Alex Tran Aug 31 '20 at 21:08
  • As Frank already pointed out, this is a recent change in how your functions are stored. Here is a link for more info: https://firebase.google.com/support/faq#expandable-10 – glaze Sep 01 '20 at 13:26
  • The most useful explanation I found: https://cloud.google.com/functions/pricing?authuser=0#deployment_costs – Horațiu Udrea Dec 16 '20 at 18:13

11 Answers

90

firebaser here

If you are using Cloud Functions, the files you're seeing are related to a recent change in how the runtime (for Node 10 and up) is built.

Cloud Functions now uses Cloud Build to create the runtime for your functions. Cloud Build in turn uses Container Registry to store those runtimes, and Container Registry stores them in a new Cloud Storage bucket under your project.

For more on this, also see the entry in the Firebase pricing FAQ, "Why will I need a billing account to use Node.js 10 or later for Cloud Functions for Firebase?"

Also see this thread on the firebase-talk mailing list about these artifacts.
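
To see these artifacts and how much space they use, you can inspect the bucket with gsutil (a quick sketch; MY_PROJECT_ID and the eu. region prefix are placeholders that depend on your project):

# list the buckets in your project
gsutil ls

# show the total size of the artifacts bucket
gsutil du -sh gs://eu.artifacts.MY_PROJECT_ID.appspot.com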


Update: some other answers suggest deleting artifacts from the Storage buckets, and even setting up lifecycle management on them to do so automatically. This leads to dangling references to those artifacts in the Container Registry, which breaks future builds.

To safely get rid of the artifacts, delete the container from the Container Registry console (it's under the gcf folder) or with a script. That will then in turn also delete the artifacts from your Storage bucket.
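
For example, a minimal sketch of the scripted route using the gcloud CLI (the registry host, project ID, region, function name, and digest are placeholders; yours will differ):

# list the function images stored under the gcf folder
gcloud container images list --repository=eu.gcr.io/MY_PROJECT_ID/gcf/europe-west1

# delete an image by digest, along with its tags; this also removes
# the matching artifacts from the Storage bucket
gcloud container images delete eu.gcr.io/MY_PROJECT_ID/gcf/europe-west1/myFunction@sha256:DIGEST --quiet --force-delete-tags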

Since version 9.14 of the CLI, the firebase deploy process automatically cleans up its container images after a deploy. So if you upgrade to the latest version, you should no longer get additional artifacts in your storage buckets.

Frank van Puffelen
  • I am using Cloud Functions. So if I understood correctly, the bucket is my Cloud Functions stored in Firebase Storage. – glaze Aug 25 '20 at 14:48
  • It is a regular Cloud Storage bucket; Firebase just allows you to interact with it through its SDKs and shows it in its console. – Frank van Puffelen Aug 25 '20 at 15:57
  • It is strange that Firebase does not give us any control over this and simply increases the use of Storage, almost forcing us to pay without knowing it. – luke cross Sep 01 '20 at 11:06
  • The remaining question is: how to remove those obsolete artifacts? No Firebase doc covers them. I have a project with 18 GB of storage used, because my team worked on Cloud Functions lately. Not a good developer experience in my opinion. – Johan Chouquet Sep 21 '20 at 10:24
  • Good question. How do we remove the unused ones? – Razvan Cristian Lung Sep 21 '20 at 12:59
  • This seems promising: https://stackoverflow.com/questions/63884429/how-to-delete-outdated-firebase-cloud-function-containers-from-gc-storage – Frank van Puffelen Sep 21 '20 at 14:35
  • My project's artifacts files are using 500 MB/day and I'm being billed $0.01 when the free tier is up to 5 GB. Can someone explain why this happens? I'm not using Cloud Storage for anything other than these auto-generated files. – Enmanuel Ramírez Sep 26 '20 at 18:30
  • From what I understand (since writing this answer), the free tier only applies to your default bucket. Since the containers are stored in other buckets, they don't fall under the free tier. Also see Doug's answers here: https://stackoverflow.com/questions/63893413/firebase-storage-uses-490mb-but-i-have-no-buckets/63893563#63893563 and here: https://stackoverflow.com/questions/63884429/how-to-delete-outdated-firebase-cloud-function-containers-from-gc-storage/63888296#63888296 – Frank van Puffelen Sep 26 '20 at 18:34
  • @FrankvanPuffelen I am also facing the issue. What would happen if I deleted all these us.artifacts buckets? For now, I'm only using Cloud Functions to store average ratings in documents. As I understand it, if someone rates an item, the whole collection gets duplicated in us.artifacts, or what is really happening here? If there is an article about this, or a video done by a Firebase expert, please share. – Isuru Bandara Sep 30 '20 at 18:25
  • @FrankvanPuffelen My storage usage is 100 MB, and my artifacts usage is 14 **GB**. Node 12 Firebase functions. I am guessing a lifecycle rule was supposed to be put on the artifacts bucket but this didn't happen? I see the "staging" bucket has one set to delete after 15 days with no update. Shouldn't the same policy be on artifacts? – xaphod Nov 19 '20 at 20:30
  • @FrankvanPuffelen "you should no longer get additional artifacts", but it seems that you have to remove the old ones yourself (or at least my deploys are not removing the old ones). – Lotan Sep 03 '21 at 12:00
  • @Lotan The script that I also linked in my answer allows you to clean up old artifacts yourself once, and then the CLI will ensure no new ones are created anymore. – Frank van Puffelen Sep 03 '21 at 13:59
  • Artifacts don't get deleted automatically when deploying the Stripe extension. – hieudev develo Oct 22 '21 at 23:19
  • +1 for me too, the artefacts do not get deleted with subsequent deploys of functions. I am using Firebase CLI 9.23.0 and have a handful of functions (~10), but when checked in the GCP console (not Firebase) I can see there are 228 artefacts of varying sizes. – pjoshi Feb 05 '22 at 12:44
  • *firebaser here* From what I understand, Extensions might not be cleaning up artifacts yet. If that's what you see, I recommend filing (or upvoting) an issue on the Extensions repo. – Frank van Puffelen Feb 06 '22 at 05:12
  • I am using version 10+ and it still collects artefacts when deploying functions. – showtime Mar 26 '22 at 22:50
  • @FrankvanPuffelen I've recently discovered that the free tier only applies to buckets located in US-Central1, US-West1, and US-East1: https://cloud.google.com/storage/pricing#cloud-storage-always-free – Adam Bridges Aug 25 '22 at 00:47
35

I've consulted GCP support, and here are a few things:

  • Cloud Functions caused the surge in storage usage
  • Since these artifacts are not stored in the default bucket, they'll charge you even if your total bytes stored are not reaching the free tier limit
  • Remove the artifact bucket at https://console.cloud.google.com/storage/browser. According to the support staff:

Regarding the artifacts bucket, you can actually get rid of them, as they are storing previous versions of the function. However, I do not recommend deleting the "gcf-sources..." bucket(s) , as it contains the current image, so deleting this bucket would mess up your function.

I tried removing the bucket entirely, and so far it is not causing trouble. I'll update if it breaks things later.


Edit 2020-11-18: see the comments below; you may need to keep the bucket itself while removing all the content in it.
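
A minimal sketch of emptying the bucket while keeping the bucket itself (the bucket name is a placeholder; adjust the region prefix to match yours):

# delete every object in the artifacts bucket, but keep the bucket
gsutil -m rm "gs://us.artifacts.MY_PROJECT_ID.appspot.com/**"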

yo1995
  • In my case, gcf-sources takes about 99.3 KB; the main problem is the other "us.artifacts..." bucket, which is using about 500 MB so far. So is this generated on every functions deployment? @yo1995 – Mr. DMX Oct 06 '20 at 01:17
  • @Mr.DMX I'm not sure, but I assume so. Also, after cleaning up the artifacts bucket it took 3 days for the Firebase dashboard to refresh... But eventually it displayed fairly low usage. – yo1995 Oct 11 '20 at 20:51
  • Can the artifacts be cleaned up automatically? – Weijing Jay Lin Oct 14 '20 at 21:19
  • @WeijingJayLin They should be, but it seems the Firebase developers are still working on it. Per support staff: "Our engineering team is working hard on the automatic deletion, I'd suggest keeping an eye on the release notes or our blog for new features and improvements to the platform." – yo1995 Oct 15 '20 at 02:53
  • I deleted the artifacts and now I can no longer deploy new functions. I am getting: "Deployment error. Build failed: Build error details not available." Additionally, the logs tell me there is a 404 on the artifacts. Any solutions? – Thomas Nov 09 '20 at 10:56
  • See the answer from @decko. – giorgio79 Dec 04 '20 at 05:52
  • Confirming that you should NOT DO THIS. You'll get errors now or later when deploying functions, like this: `ERROR: build step 3 "us.gcr.io/fn-img/buildpacks/nodejs12/builder:nodejs12_20201201_20_RC00" failed: step exited with non-zero status: 46` – xaphod Dec 10 '20 at 23:25
  • @Thomas Did you find a solution for this? A Google billing support representative told me to delete the bucket and now I'm stuck, unable to deploy. – notquiteamonad Dec 31 '20 at 18:53
  • @samueldple Waiting solved the issue for me. But I contacted support and this was the response: "One option to work around the issue is by deploying the functions individually. Then after the function image is set, you can deploy all of them again. Deleting the images is optional, the one day object lifetime is fine, you can work around the issue by deploying the functions individually. Just keep in mind that since the function image is not found the deployment may sometimes have some issues like this one." – Thomas Jan 01 '21 at 19:25
  • Probably do not delete the whole bucket, just its contents... and possibly try keeping the latest date's contents there... – pjoshi Feb 05 '22 at 13:32
18

Adding to @yo1995's answer:
I consulted with Firebase Support and they confirmed that the artifacts bucket should not be deleted. Basically, the artifacts are used to help build the final image to be stored in the "gcf-sources" bucket.

To quote them directly:
"you are free to delete the contents in "XX.artifacts", but please leave the bucket untouched, it will be used in the following deployment cycles."

There might be some unintended behaviour if you delete the artifacts bucket entirely.
Also: "The team is working to clean up this bucket automatically, but there are some restrictions that they need to solve before publishing the solution."

For the time being, I set the bucket to auto-delete files older than 1 day.
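
For reference, a sketch of setting that same one-day rule from the command line (the bucket name is a placeholder; the JSON is the standard gsutil lifecycle format, the same shape shown in a later answer):

# apply a one-day delete rule to the artifacts bucket
echo '{"lifecycle":{"rule":[{"action":{"type":"Delete"},"condition":{"age":1}}]}}' > lifecycle.json
gsutil lifecycle set lifecycle.json gs://us.artifacts.MY_PROJECT_ID.appspot.com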

decko
  • You should NOT delete these. I had a 7-day delete lifecycle window and I had errors deploying, like these: `ERROR: build step 3 "us.gcr.io/fn-img/buildpacks/nodejs12/builder:nodejs12_20201201_20_RC00" failed: step exited with non-zero status: 46` – xaphod Dec 10 '20 at 23:26
  • @xaphod That's really weird. I have mine set to a 1-day delete lifecycle and my functions are deploying fine (aus region, multi-region USA, multi-region Asia). I even purposely tested with a freshly wiped artifacts bucket and the deployment remains unaffected. (The cloud functions also operate fine.) I think the cause of your error might be something else. – decko Dec 15 '20 at 04:19
  • Are you using Node 12 functions? – xaphod Dec 15 '20 at 14:15
  • @xaphod Yes, my functions are Node 12. – decko Dec 15 '20 at 22:28
  • There are others with the same error as me; see Tech's comment in the answer above. I would advise NOT deleting anything in artifacts, which makes it annoying that it can grow really fast. I had 14 GB of artifacts I was paying for... – xaphod Dec 20 '20 at 22:17
  • I've found that if you delete everything from the bucket, it's rebuilt properly on the next deployment. Lifecycle deletion may only partially delete files, confusing the Firebase deployment tool and causing those errors. – Marcus Cemes Dec 31 '20 at 14:30
16

EDIT early 2022: This whole answer is now moot. It may have worked in the past, but the actual root cause of the problem is now fixed in the Firebase CLI.

How to reduce storage

There is a great answer above explaining the issue, but how to actually fix it requires some further digging.

To help future developers cut right to the chase, here is the result you should see after adding the following rules to your project in GCP:

(Screenshot: usage graph showing the artifact storage completely cleaned up.)

The orange line is the us.artifacts.<your-project>.appspot.com bucket.

Steps to fix the issue

  1. Navigate to https://console.cloud.google.com/
  2. Open the GCP project that corresponds to the Firebase project
  3. In the navigation menu, choose Storage -> Browser
  4. Click on the offending us.artifacts.<your-project>.appspot.com bucket
  5. Go to the 'Lifecycle' tab and add a rule with a life span of 3 days:
  • Add a rule
  • Delete object
  • Age: 3 days

NB: Results will not appear on the usage graph until about 24 hours later.

Caveat

Firebase uses containers that back-reference previous containers, so if you set a period of 3 days and your firebase deploy of functions starts failing, you will need to update the local name of your function to include versioning, and either specify a build flag to delete old versions, remove them from your firebase.json, or manually delete obsolete functions.

Using versioned API type functions

In your entry point, assuming index.ts, and assuming you have initialised Firebase with:

import * as functions from 'firebase-functions'
import * as admin from 'firebase-admin'

admin.initializeApp(functions.config().firebase)

// define the app as a cloud function called APIv1 build xxxxxx
export const APIv1b20201202 = functions.https.onRequest(main)

where main is the name of your app

and in your firebase.json

...
"hosting": {
    "public": "dist",
    "ignore": ["firebase.json", "**/.*", "**/node_modules/**", "**/tests/**"],
    "rewrites": [
      {
        "source": "/api/v1/**",
        "function": "APIv1b2021202"
      }
    ]
  },
...

Or, to Manually Update

# Deploy new function called APIv11
$ firebase deploy --only functions:APIv11

# Wait until deployment is done; now both APIv11 and APIv10 are running

# Delete APIv10
$ firebase functions:delete APIv10
Beerswiller
  • The caveat part is interesting. In what cases would firebase deploy functions start failing? I would like to avoid this versioning magic. – giorgio79 Dec 04 '20 at 05:57
  • The build containers use layered files to efficiently cache your function execution environment. Some of those caches seem to have a validity of several days or possibly weeks, so the function deploy will look for the cached version if it is still valid. If you deleted it (and you can't tell Firebase you deleted it), the build fails. Versioning simply forces a complete rebuild of the execution environment. – Beerswiller Dec 08 '20 at 00:25
  • As you said, the actual cause is now fixed in the Firebase CLI, but I am facing the exact same problem. I am using the latest version of the Firebase CLI, so should I go on and delete the us.artifacts bucket? – Gaurav Apr 16 '22 at 06:26
  • Version your function(s) and apply the lifecycle rules if it is still happening. – Beerswiller Apr 20 '22 at 12:10
15

Adding to @yo1995's response, you can delete the artifacts in the bucket without needing to go into GCP. Staying in Firebase, go to Storage, then "Add a Bucket". From there, you will see the option to import the GCP and artifact buckets. Next, you can delete the artifacts in those buckets accordingly.

Per some comments received, it's important not to delete the bucket itself. Rather, delete only the artifacts in it!

(Screenshot: artifact bucket location.)

Mike Altonji
  • Thanks for this. I can't understand why they would hide it like that. I'm glad I looked at my usage while debugging. My active file storage is maybe 5 MB, but the artifact storage was well over 700 MB. – elersong Oct 12 '20 at 22:21
  • @elersong Same here. I'm a week away from deploying; I saw 1.7 GB of usage in storage and was shocked, only to find out it's artifacts. – King Of The Jungle Nov 27 '20 at 12:15
  • Deleted successfully and don't see any errors so far. – Stanislau Buzunko Dec 22 '20 at 13:26
  • I tried first deleting some old ones and keeping some others; that caused the build to fail. But if you remove all the files then you won't have issues. – Leandro Zubrezki Feb 19 '21 at 13:04
4

Firebase said they have released a fix (as of June 2021):

https://github.com/firebase/firebase-tools/issues/3404#issuecomment-865270081

Fix is in the next version of firebase-tools, which should be coming today.

To fix:

  1. Run npm i -g firebase-tools.

  2. Browse your containers in Cloud Storage https://console.cloud.google.com/storage/browser/ (look for a bucket named gcf-sources-*****-us-central1)

  3. Functions deleted via firebase deploy --only functions seem to have their artifacts removed automatically, but if you delete functions through the UI, their artifacts remain.

d-_-b
1

After some research and emailing with the Firebase team, this is what was suggested to me:

We are aware that Cloud Build is not automatically deleting old artifacts, so its size keeps increasing. As a workaround, I'd recommend deleting the files inside the bucket in order to reduce any possible charges.

You can delete the files in the mentioned buckets by going to the GCP console (use the same credentials as the Firebase console) -> select the correct project -> from the upper-left menu select Storage -> Browser. You will see all the buckets that belong to your project; click on the bucket you prefer, and you can delete the files from there.

One other option you may try is managing the bucket's object lifecycles. There is an option to delete objects when they meet all the conditions specified in a lifecycle rule; here is a link with one example of this option. In this way, the bucket objects will be deleted automatically.

1

I created a configuration file named storage_artifacts_lifecycle.json with the contents:

{
  "lifecycle": {
    "rule": [
      {
        "action": { "type": "Delete" },
        "condition": {
          "age": 21
        }
      }
    ]
  }
}

I configure my storage lifecycle with the command:

gsutil lifecycle set ./firebase/storage_artifacts_lifecycle.json gs://us.artifacts.${MY_PROJECT_ID}.appspot.com

and I validate the result after running with:

gsutil lifecycle get gs://us.artifacts.${MY_PROJECT_ID}.appspot.com

Hope this helps some!

Greg Fenton
1

Adding to @d-_-b

As of 7th July 2022

It is now announced on the Firebase support page as well:

If you are seeing unexpected amounts of Cloud Storage used, this is likely caused by a known issue with the cleanup of artifacts created in the function deployment process.

This issue is now resolved; if you are still seeing unexpected usage, update the Firebase CLI and re-deploy your Cloud Functions.
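
If you want to apply that fix from the command line, the two steps the FAQ describes amount to (a sketch; assumes npm and the Firebase CLI are already set up):

# update the Firebase CLI, then re-deploy so the new cleanup logic runs
npm install -g firebase-tools
firebase deploy --only functions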

Pekehata
0

I did a bit of research on the topic and found the optimal solution for me - a script that I run before each deploy of my Firebase functions. The script scans my container images and:

  • Keeps the ones with the latest tag.
  • Deletes all the images except the last two.

This approach is semi-automated. The storage only grows when I deploy anyway, so it works really well for me.

The script is written in JavaScript for an environment with Node and the gcloud CLI available.

const spawn = require("child_process").spawn;

const KEEP_AT_LEAST = 2;
const CONTAINER_REGISTRIES = [
  "gcr.io/<your project name>",
  "eu.gcr.io/<your project name>/gcf/europe-west3"
];

async function go(registry) {
  console.log(`> ${registry}`);
  const images = await command(`gcloud`, [
    "container",
    "images",
    "list",
    `--repository=${registry}`,
    "--format=json",
  ]);
  for (let i = 0; i < images.length; i++) {
    console.log(`    ${images[i].name}`);
    const image = images[i].name;
    let tags = await command(`gcloud`, [
      "container",
      "images",
      "list-tags",
      image,
      "--format=json",
    ]);
    const totalImages = tags.length;
    // do not touch `latest`
    tags = tags.filter(({ tags }) => !tags.find((tag) => tag === "latest"));
    // sorting by date
    tags.sort((a, b) => {
      const d1 = new Date(a.timestamp.datetime);
      const d2 = new Date(b.timestamp.datetime);
      return d2.getTime() - d1.getTime();
    });
    // keeping at least X number of images
    tags = tags.filter((_, i) => i >= KEEP_AT_LEAST);

    console.log(`      For removal: ${tags.length}/${totalImages}`);
    for (let j = 0; j < tags.length; j++) {
      console.log(
        `      Deleting: ${formatImageTimestamp(tags[j])} | ${tags[j].digest}`
      );
      await command("gcloud", [
        "container",
        "images",
        "delete",
        `${image}@${tags[j].digest}`,
        "--format=json",
        "--quiet",
        "--force-delete-tags",
      ]);
    }
  }
}

function command(cmd, args) {
  return new Promise((done, reject) => {
    const ps = spawn(cmd, args);
    let result = "";

    ps.stdout.on("data", (data) => {
      result += data;
    });

    ps.stderr.on("data", (data) => {
      result += data;
    });

    ps.on("close", (code) => {
      if (code !== 0) {
        console.log(`process exited with code ${code}`);
      }
      try {
        done(JSON.parse(result));
      } catch (err) {
        done(result);
      }
    });
  });
}

function formatImageTimestamp(image) {
  const { year, month, day, hour, minute } = image.timestamp;
  return `${year}-${month}-${day} ${hour}:${minute}`;
}

(async function () {
  for (let i = 0; i < CONTAINER_REGISTRIES.length; i++) {
    await go(CONTAINER_REGISTRIES[i]);
  }
})();

It runs the following commands:

# finding images
gcloud container images list --repository=<your repository>

# getting metadata
gcloud container images list-tags <image name>

# deleting images
gcloud container images delete <image name>@<digest> --quiet --force-delete-tags

A blog post describing my findings is available here: https://krasimirtsonev.com/blog/article/firebase-gcp-saving-money

Krasimir
-1

As an alternative, you can create a lifecycle rule to delete the objects inside the folder. Set the age to 1 day, so it will delete all objects in the folder that are more than 1 day old.

(Screenshots: lifecycle rule and Set Condition dialog.)

Ankur
  • This definitely breaks things later. You'll get errors on functions deploying, like this: "ERROR: build step 3 "us.gcr.io/fn-img/buildpacks/nodejs12/builder:nodejs12_20201201_20_RC00" failed: step exited with non-zero status: 46" – xaphod Dec 10 '20 at 23:25
  • I had added a 1-day lifecycle rule and got the error @xaphod mentioned. I would not recommend doing this, as I had to delete all my functions and redeploy them one by one - lots of downtime :( – Tech Dec 20 '20 at 18:05