0

Two days after manually deleting all the objects in a multi-region Cloud Storage bucket (e.g. us.artifacts.XXX.com) without Object Versioning, I noticed that the bucket size hadn't decreased at all. Only when trying to delete the bucket did I discover that it still contained the objects I had presumably deleted.

Why aren't those objects displayed in the bucket list view, even when Show deleted data is enabled?
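One way to check from the command line whether noncurrent (versioned) copies remain is to list every object generation with `gsutil ls -a`. This is a minimal sketch; the bucket name is the placeholder from above:

```
# Lists all object generations, including noncurrent (deleted) versions.
# Replace the placeholder with your actual bucket name.
gsutil ls -a gs://us.artifacts.XXX.com
```

If the listing shows entries with generation suffixes (e.g. `#1641234567890`), versioned copies still exist even though the live objects were deleted.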


Carles Capellas
  • You likely enabled [Object Versioning](https://cloud.google.com/storage/docs/object-versioning) – DazWilkin Jan 13 '22 at 17:31
  • Object versioning is disabled – Carles Capellas Jan 13 '22 at 18:23
  • 1
    If Objects were created after it was enabled and before it was disabled, they would have been versioned. So, a more accurate comment would have been "Did you have Object Versioning enabled at any time Objects were created?" – DazWilkin Jan 13 '22 at 18:29
  • Does this answer your question? [Google Cloud Storage - Bucket objects disappeared](https://stackoverflow.com/questions/69245007/google-cloud-storage-bucket-objects-disappeared) – Osvaldo Jan 13 '22 at 22:19
  • @DazWilkin Fair point! Object Versioning is currently disabled and I haven't enabled it myself at any point in time. However, this bucket was created automatically by GCP when enabling Cloud Functions in a Firebase project. If Firebase deployments can enable or disable Object Versioning, your suggestion would be a valid hypothesis. – Carles Capellas Jan 14 '22 at 10:23
  • @OsvaldoLópezAcuña That's a different issue, but thanks for the suggestion – Carles Capellas Jan 14 '22 at 10:24

2 Answers

1

When deploying a Function for the first time, two buckets are created automatically:

  1. gcf-sources-XXXXXX-us-central1

  2. us.artifacts.project-ID.appspot.com

You can observe these two buckets from the GCP Console by clicking Cloud Storage in the left panel.

The files you're seeing in the bucket us.artifacts.project-ID.appspot.com are related to a recent change in how the runtime (for Node 10 and up) is built, as this post explains.

I also found that this bucket has no object versioning, retention policy, or lifecycle rules. Even if you delete this bucket, it will be recreated the next time you deploy the related function. So, if you are seeing unexpected amounts of Cloud Storage used, this is likely caused by a known issue with the cleanup of artifacts created during the function deployment process, as indicated here.
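If you want to verify those settings yourself, gsutil can report each one. A minimal sketch, assuming the placeholder bucket name from above:

```
# Each command prints the corresponding configuration for the bucket.
gsutil versioning get gs://us.artifacts.PROJECT_ID.appspot.com
gsutil retention get gs://us.artifacts.PROJECT_ID.appspot.com
gsutil lifecycle get gs://us.artifacts.PROJECT_ID.appspot.com
```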

Until the issue is resolved, you can avoid hitting storage limits by creating an auto-deletion rule in the Cloud Console:

  1. In the Cloud Console, select your project > Storage > Browser to open the storage browser.
  2. Select the "artifacts" bucket from the list.
  3. Under the Lifecycle tab, add a rule to auto-delete old images. Choose a deletion interval that works within your normal rate of deployments (a command-line equivalent is sketched below).
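The same rule can be applied with gsutil. This is a minimal sketch, assuming a 7-day deletion interval and the placeholder bucket name; adjust both to your project:

```
# Write a lifecycle policy that deletes objects older than 7 days (assumed interval).
cat > lifecycle.json <<'EOF'
{
  "rule": [
    {"action": {"type": "Delete"}, "condition": {"age": 7}}
  ]
}
EOF

# Apply the policy to the artifacts bucket (placeholder name).
gsutil lifecycle set lifecycle.json gs://us.artifacts.PROJECT_ID.appspot.com
```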

If possible, try to reproduce this scenario with a new function. In the meantime, take into account that if you delete many objects at once, you can track deletion progress by clicking the Notifications icon in the Cloud Console.
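If you do need to clear the bucket manually, a parallel delete from the command line is usually faster than the Console. A minimal sketch, again with the placeholder bucket name:

```
# Deletes every object in the bucket in parallel (-m); this is irreversible.
gsutil -m rm "gs://us.artifacts.PROJECT_ID.appspot.com/**"
```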

In addition, the Google Cloud Status Dashboard provides information about regional or global incidents affecting Google Cloud services such as Cloud Storage.

Osvaldo
0

Never mind! Eventually (at some point between 2 and 7 days after the deletion) the bucket size decreased and the objects were no longer displayed in the "Delete bucket" dialog.


Carles Capellas
  • Hi, @Carles Capellas. Please describe, or share a link explaining, how your bucket was created automatically when enabling Cloud Functions in a Firebase project, to help understand why it takes some time from the moment you delete objects until they are actually deleted. There are some object [lifecycle configurations](https://cloud.google.com/storage/docs/lifecycle) and [retention policies](https://cloud.google.com/storage/docs/bucket-lock#retention-policy) that might have delayed this action. – Osvaldo Jan 21 '22 at 17:22
  • Hi @OsvaldoLópezAcuña. All I did was enable Cloud Functions from the Firebase console. I didn't even know about the existence of this bucket until I started noticing the corresponding billing charges due to the bucket size exceeding the free storage quota. – Carles Capellas Jan 24 '22 at 11:25
  • I’ve tried to replicate what you describe by creating a new Firebase project linked to a GCP project. When I click the _Get Started_ button to activate Functions from the Firebase Console, I can’t find any new bucket in my GCP Cloud Storage explorer. Did you do this, or did you follow another path? Did you run `firebase init` in the CLI and, from there, did you do anything else like `Functions: Configure a Cloud Functions directory and its files`? – Osvaldo Jan 24 '22 at 22:44
  • I understand that the bucket will be created the first time you deploy your functions. I did so with `firebase init` and `firebase deploy` from a Firebase project containing at least one function (a minimal command sketch follows these comments). It should not make a difference to this matter, but my functions were written in TypeScript. – Carles Capellas Jan 26 '22 at 09:50
  • FYI, the number of objects in the modal window comes from the metrics (check the dev tools to see the request). Not sure if it has any relevance for this question, but if there's a delay in the metrics, it could be confusing. – Jofre Jan 27 '22 at 17:47
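For reference, a minimal sketch of the deployment flow described in the comments above; per the answer, it is the first deployment that creates the gcf-sources-* and us.artifacts.* buckets:

```
# Scaffold a Cloud Functions directory in a Firebase project (interactive).
firebase init functions

# Deploy only the functions; the first deployment creates the
# gcf-sources-* and us.artifacts.* buckets automatically.
firebase deploy --only functions
```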