So I have this Node.js monorepo, which is structured as follows:
monorepo/
├─ src/
│ ├─ libs/
│ │ ├─ types/
│ │ │ ├─ src/
│ │ │ ├─ package.json
│ ├─ some_app/
│ ├─ gcp_function/
│ │ ├─ src/
│ │ ├─ package.json
├─ package.json
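For reference, the root package.json ties the packages together roughly like this (a simplified sketch; I'm assuming npm workspaces here, and the exact globs may differ):

```json
{
  "name": "monorepo",
  "private": true,
  "workspaces": [
    "src/libs/*",
    "src/some_app",
    "src/gcp_function"
  ]
}
```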
Multiple projects use the same types library, so whenever a type changes we can update all references at once. It has worked really well so far and I'm very happy with the structure.
Except that now I need to create a function on the Google Cloud Platform, and in this function I also reference the types project.
The function's package.json
is as follows:
{
  "devDependencies": {
    "@project_name/types": "*",
    "npm-watch": "^0.11.0",
    "typescript": "^4.9.4"
  }
}
The @project_name/types
package refers to the src/libs/types
project. This works in every other project because we have a centralised build tool.
This means the function works fine locally on my machine and I have no problem developing it, but as soon as I push it to Google Cloud (using the command listed below), I get the following error:
npm ERR! 404 '@project_name/types@*' is not in this registry.
I use this command to deploy from ./monorepo:
gcloud functions deploy gcp-function --gen2 --runtime nodejs16 --trigger-topic some_topic --source .\src\gcp_function
I think it's pretty clear why this happens:
gcloud uploads only the function's source directory, which is then built by Google Cloud Build. Because the rest of the monorepo never gets uploaded, the build has no reference to @project_name/types
and fails.
I've been struggling with this issue for quite a while now. Has anybody run into this before, and if so, how did you fix it? Is there any other way to use local dependencies in Google Cloud Functions? Maybe there is some way to package the entire project and send it to Google Cloud?
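As a concrete version of that last idea, here's an untested sketch using `npm pack`: bundle the shared library into a tarball that sits next to the function source, so it travels along when gcloud uploads the directory. The paths and package contents below are stand-ins for illustration, not my real project:

```shell
# Untested sketch: pack a shared package as a tarball with `npm pack` so it
# can be shipped inside the function's source directory.
set -e
demo=$(mktemp -d)
mkdir -p "$demo/libs/types" "$demo/gcp_function"

# A stand-in for src/libs/types/package.json:
cat > "$demo/libs/types/package.json" <<'EOF'
{
  "name": "@project_name/types",
  "version": "1.0.0"
}
EOF

# Pack the library into the function directory; for scoped packages npm
# names the tarball <scope>-<name>-<version>.tgz.
cd "$demo/libs/types"
npm pack --pack-destination "$demo/gcp_function"

ls "$demo/gcp_function"
```

The function's package.json could then point at the tarball instead of the registry, e.g. `"@project_name/types": "file:./project_name-types-1.0.0.tgz"`, though that means re-packing on every types change.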