
So I have this Node.js monorepo, structured as follows:

monorepo/
├─ src/
│  ├─ libs/
│  │  ├─ types/
│  │  │  ├─ src/
│  │  │  ├─ package.json
│  ├─ some_app/
│  ├─ gcp_function/
│  │  ├─ src/
│  │  ├─ package.json
├─ package.json

Multiple projects use the same types library, so whenever a type changes, every reference is updated at once. This has worked really well so far and I'm really happy with the structure.
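For context, a consumer like gcp_function just imports from the shared package. A minimal sketch (SomeEvent is a made-up type standing in for whatever the types library actually exports):

// src/gcp_function/src/index.ts
// Hypothetical example: SomeEvent stands in for a type
// exported from the shared src/libs/types package.
import { SomeEvent } from "@project_name/types";

export function handleEvent(event: SomeEvent): void {
  // All projects share this exact type definition.
  console.log(`Handling event ${event.id}`);
}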

Except now I need to create a function on Google Cloud Platform, and this function also references the types project. The function's package.json is as follows:

{
  "devDependencies": {
    "@project_name/types": "*",
    "npm-watch": "^0.11.0",
    "typescript": "^4.9.4"
  }
}

@project_name/types refers to the src/libs/types project. This resolves fine in every project, because we have a centralised build tool that links the local packages.
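For reference, this kind of local linking is typically done with npm workspaces in the root package.json; a sketch of what that looks like (the exact workspace globs here are assumptions, the precise tool doesn't matter for the problem):

{
  "name": "monorepo",
  "private": true,
  "workspaces": [
    "src/libs/*",
    "src/some_app",
    "src/gcp_function"
  ]
}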

This means the function works locally on my machine and I have no problem developing it, but as soon as I push it to Google Cloud (using the command listed below) I get the following error:

npm ERR! 404 '@project_name/types@*' is not in this registry.

I use this command to deploy from ./monorepo:

gcloud functions deploy gcp-function --gen2 --runtime nodejs16 --trigger-topic some_topic --source .\src\gcp_function

I think it's pretty clear why this happens: gcloud only uploads the function's own folder, which is then built by Google Cloud Build. Since the rest of the monorepo never gets uploaded, the build can't resolve @project_name/types locally, so npm falls back to the public registry, where the package doesn't exist, hence the 404.

I've been struggling with this issue for quite a while now. Has anybody ever run into it, and if so, how did you fix it? Is there any other way to use local dependencies for Google Cloud Functions? Maybe there is some way to package the entire project and send it to Google Cloud?


1 Answer


I resolved this issue with a trick in the CI pipeline.

In the CI I take the following steps:

  1. Build the types and use npm pack to package it.
  2. Copy the packed tarball to the function folder and install it there with npm i @project_name/types@file:./types.tgz. That way the registry reference in package.json is overridden with the local tarball (see the sketch after this list).
  3. Zip the entire GCP Function and push it to Google Cloud Storage.
  4. Tell Google Cloud Functions to build and run that zip file.
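
After step 2, the function's package.json points at the local tarball instead of the registry, roughly like this (npm may record the path as file:types.tgz or file:./types.tgz; version numbers are illustrative):

{
  "devDependencies": {
    "@project_name/types": "file:types.tgz",
    "npm-watch": "^0.11.0",
    "typescript": "^4.9.4"
  }
}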

In the end my CI looks a bit like this:

steps:
  - name: Build types library
    working-directory: ./src/libs/types
    run: npm i --ignore-scripts && npm run build && npm pack
  - name: Copy pack to gcp_function and install
    working-directory: ./src/gcp_function
    run: cp ../libs/types/*.tgz ./types.tgz && npm i @project_name/types@file:./types.tgz
  - name: Zip the function folder
    working-directory: ./src/gcp_function
    run: rm -rf node_modules && zip -r function.zip ./
  - name: Upload the function.zip to Google Cloud Storage
    working-directory: ./src/gcp_function
    run: gsutil cp function.zip gs://some-bucket/gcp_function/function.zip
  - name: Deploy to Google Cloud
    run: |
      gcloud functions deploy gcp_function \
        --source gs://some-bucket/gcp_function/function.zip

This has resolved the issue for me. I don't have to publish the package to npm (which I didn't want to do, because it doesn't fit the monorepo approach), and while developing, the types are still linked live in every other project.
