
In my Node.js server I have a POST request that makes 2 uploads to Google Cloud Storage, each into a different bucket.

When I tested the functionality, I was able to successfully upload 2 files to 2 different buckets, but upon the next test, 1 of the 2 uploads fails and throws an error: `Error: Could not load the default credentials`.

Why would it fail on the second test, and on only 1 of the 2 uploads?
Why would it say the credentials can't be loaded if it's a fully public bucket (all users have full object access to read/write/admin)?

app.post("/upload", upload.single("image"), (req, res) => {
    //this takes an image file and uploads it
    async function uploadFile() {
        await storage.bucket(bucketName).upload(imagePath, {destination: imagePath})
     }

    uploadFile().catch(console.error);

    //this resizes the image 
    Jimp.read(imagePath, (err, img) => {
        if (err) throw err
        img.resize(Jimp.AUTO, 300).write(imgTitle + "-thumb.jpg")
    })

    //this uploads the resized image
    async function uploadThumb() {
      await storage.bucket(thumbs).upload(imgTitle + "-thumb.jpg", {destination: cat + "/" + subCat + "/" + imgTitle + "-thumb.jpg"})
    }
    setTimeout(() => {    //this timeout waits 2 seconds for JIMP to finish processing the image
        uploadThumb().catch(console.error);
    }, 2000)
});
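
For reference, here is the same flow written sequentially, without the fixed 2-second wait. This is a sketch, not the code I'm actually running; it assumes Jimp's promise API (`Jimp.read` without a callback returns a promise, and `writeAsync` returns one too):

app.post("/upload", upload.single("image"), async (req, res) => {
    try {
        //upload the original file
        await storage.bucket(bucketName).upload(imagePath, {destination: imagePath})

        //resize, then wait for the thumbnail to be written before uploading it
        const img = await Jimp.read(imagePath)
        await img.resize(Jimp.AUTO, 300).writeAsync(imgTitle + "-thumb.jpg")
        await storage.bucket(thumbs).upload(imgTitle + "-thumb.jpg", {destination: cat + "/" + subCat + "/" + imgTitle + "-thumb.jpg"})

        res.sendStatus(200)
    } catch (err) {
        console.error(err)
        res.sendStatus(500)
    }
});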

I'm hoping someone can explain why this stopped working after the first test. The function that uploads the resized image works in both tests, but the function that uploads the original file fails on the second test, throwing the error: `Error: Could not load the default credentials`.

UPDATE

After many tests, I have possibly deduced that this is a file size issue. The thumbnail upload works every time, while the full-size image fails when its size reaches ~2–3 MB. The GCS docs say the maximum size for a single-file upload is 5 TB, so I don't know why there is an issue with a few MB. I do not want to lower the image size/resolution, as these are artworks that will need to be viewed at full size (that's exactly why I'm creating the thumbnails in the first place).

PsiKai
  • Hi @PsiKai, have you confirmed that your second bucket is indeed configured with public access? Given that the function works perfectly and the issue is only with that bucket, it seems the problem is with the permissions there. Could you please confirm whether the second bucket is indeed public? You can also try the method for making it public described [here](https://stackoverflow.com/a/26041536/12767257). – gso_gabriel Sep 23 '20 at 11:58
  • Yes, I am 99.98% sure they are both public, and I also added full permission for "all users" to read and write objects in both buckets, as well as admin bucket privileges for both... Upon further testing, it looks like about half of the images I try to upload have this issue and the other half work fine. Image size doesn't appear to matter, and they are all JPEGs. I'm wondering if having 2 async functions running at the same time is causing the problem, but I don't know enough about async to be sure... perhaps a timing issue. – PsiKai Sep 23 '20 at 15:59
  • The `async` might be an issue indeed, depending on how your code is written. Could you please try executing them one at a time, without both being async? Besides that, you can get more details on executing parallel functions [here](https://stackoverflow.com/questions/35612428/call-async-await-functions-in-parallel) (a minimal sketch of both patterns follows this thread). – gso_gabriel Sep 30 '20 at 05:46
  • I just tried running the faulty function without `async` and I received the same `default credentials could not be loaded` error. So that might not be it. – PsiKai Sep 30 '20 at 06:24
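
For reference, a minimal sketch of the two patterns discussed above, reusing the `uploadFile` and `uploadThumb` functions from the question:

//one at a time: the second upload starts only after the first finishes
async function runSequentially() {
    await uploadFile()
    await uploadThumb()
}

//in parallel: both start immediately, and we wait for both to finish
async function runInParallel() {
    await Promise.all([uploadFile(), uploadThumb()])
}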

2 Answers


It looks like your credentials are not loaded. Service credentials, that is. Have you already tried running:

gcloud auth application-default login

Also, are the bucket permissions identical for both?
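
For completeness, once Application Default Credentials are in place, the client needs no explicit options. A minimal sketch (the key-file path in the comment is a placeholder):

const {Storage} = require("@google-cloud/storage");

//picks up Application Default Credentials automatically, whether they come
//from `gcloud auth application-default login` or from the
//GOOGLE_APPLICATION_CREDENTIALS=/path/to/key.json environment variable
const storage = new Storage();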

Nils Groen
  • The permissions are identical for both, yes. I haven't attempted to load any credentials, because I thought that with public access I wouldn't need to. I'm doing the project outside of the Google development environment, so it's difficult to find instructions for independent projects. Can I run such a command in my Hyper terminal? Is it as simple as typing that command? – PsiKai Sep 23 '20 at 16:01
  • If you've installed the gcloud SDK, yes. It will open your browser and ask you to login. – Nils Groen Sep 24 '20 at 22:30
  • I installed the SDK and logged in, but now I'm not able to upload anything to storage and the app crashes immediately, saying `Error: Unable to detect a Project Id in the current environment.` What do I need to do here? – PsiKai Oct 01 '20 at 05:57
  • Hi @PsiKai, you first need to run the command `gcloud config set project [PROJECT_ID]` to set the project whose credentials you will be configuring. Replace `[PROJECT_ID]` with the ID of your project that is using Cloud Storage. – gso_gabriel Oct 01 '20 at 12:03
  • @gso_gabriel I followed your instructions but it is still giving the same error of `Error: Unable to detect a Project Id in the current environment.` and not letting me upload or even list files. – PsiKai Oct 01 '20 at 16:54
  • I fixed the issue by creating `const projectId = 'project-id'` and `const keyFileName = 'client_secrets.json'` and passing those variables as parameters to `new Storage()`. – PsiKai Oct 01 '20 at 23:51
  • That's nice, @PsiKai! Please post an answer yourself, adding exactly the steps and information you needed to fix the issue. This way the community has confirmation that the issue is fixed. – gso_gabriel Oct 02 '20 at 05:37
  • @gso_gabriel Can you look at my question here: https://stackoverflow.com/questions/64186958/node-js-google-cloud-storage-credentials-not-loading-in-production It is the same problem that was fixed in dev but it is now reoccurring in production. – PsiKai Oct 03 '20 at 19:20

I resolved this issue through thorough research and trial and error (it took weeks; I'm not proud). I knew I needed to pass credential parameters into my `const storage = new Storage();` call (that code wasn't in my question, and I now know it should have been).

Originally I was trying to use a .env file to pass in the project ID and the client_secrets.json file in every way I knew how. But after rereading the documentation for the nth time, I came across the correct syntax.

I needed to create constants for the project ID and the location of the .json key file, and pass them in as parameters, like this:

const projectId = 'project-id';
const keyFilename = 'client_secret.json'; //note: the option must be spelled keyFilename for the client to pick it up
const storage = new Storage({projectId, keyFilename});

The reason the uploads were failing some of the time was that I was getting some sort of free pass to upload smaller objects, but as soon as the uploaded file size reached ~3 MB, it required verification that was not present. I still don't fully understand it, but this is how I solved it.
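
A possible explanation for the size threshold (my assumption, not something the docs I quoted confirm) is that the client library switches to resumable uploads for larger files, and starting a resumable session requires working credentials even when the bucket itself is public. Either way, an equivalent setup that avoids hard-coding the key file path is to let Application Default Credentials find it through the environment. A sketch:

//sketch: same effect without hard-coding the key file path, assuming the
//process is started with something like
//  GOOGLE_APPLICATION_CREDENTIALS=./client_secret.json node server.js
const {Storage} = require("@google-cloud/storage");
const storage = new Storage(); //with a service account key file, ADC supplies both the credentials and the project ID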

PsiKai
  • It should be noted that while this fixed my issue in the development environment, the same problem occurred once I deployed the site to production. – PsiKai Oct 03 '20 at 19:21