I have a Node.js REST service deployed in k8s and exposed with an nginx ingress. It responds to a basic GET, but when I pass a URL parameter I get a 502.

import express from "express";
const app = express();
app.get("/service-invoice", async (req, res) => {
  res.send(allInvoices);
});
app.listen(80);

Where allInvoices is just a collection of invoice objects loaded from MongoDB.
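For reference, loading that collection looks roughly like the sketch below. This is illustrative only: it assumes the official mongodb driver, and the connection string and collection names are placeholders rather than my real configuration.

import { MongoClient } from "mongodb";

// Placeholder connection details, not the real deployment values.
const client = new MongoClient(process.env.MONGO_URL ?? "mongodb://localhost:27017");
await client.connect();

// Load every invoice document into memory once at startup.
const allInvoices = await client
  .db("invoices")
  .collection("invoices")
  .find({})
  .toArray();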

I deploy this to k8s with the following ingress config:

---
apiVersion: networking.k8s.io/v1
kind: Ingress
metadata:
  name: service-invoice-read
  namespace: ctx-service-invoice
  annotations:
    kubernetes.io/ingress.class: addon-http-application-routing
    nginx.ingress.kubernetes.io/rewrite-target: /$2
spec:
  rules:
  - http:
      paths:
      - path: /service-invoice-read(/|$)(.*)
        pathType: Prefix
        backend:
          service:
            name: service-invoice-read
            port:
              number: 80

Calling this with curl:

curl localhost:30000/service-invoice-read/service-invoice

I get back a valid json response. So far, so good.

But I also want to access these objects by id. To do that I have the following code:

app.get("/service-invoice/:id", async (req, res) => {
  try {
    const id = req.params.id;
    const invoice = // code to load invoice by id from mongo
    res.send(invoice);
  } catch (e) {
    res.status(sc.NOT_FOUND).send(e);
  }
});
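The elided lookup is roughly equivalent to the sketch below, again assuming the mongodb driver. The collection handle, the field name, and the error handling here are placeholders, not my actual code.

// Sketch only: "invoices" is assumed to be a mongodb Collection handle,
// and "app" is the Express app from the earlier snippet.
app.get("/service-invoice/:id", async (req, res) => {
  try {
    const id = req.params.id;
    // Look the invoice up by its id field; the real field name may differ.
    const invoice = await invoices.findOne({ id });
    if (!invoice) {
      throw new Error(`invoice ${id} not found`);
    }
    res.send(invoice);
  } catch (e) {
    console.error(e); // logging here makes failures visible in the pod logs
    res.status(404).send(e.message);
  }
});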

Calling this with curl:

curl localhost:30000/service-invoice-read/service-invoice/e98e03b8-b590-4ca4-978d-270986b7d26e

Results in a 502 - Bad Gateway error.

I can't see any errors in my pod's logs, so I'm pretty sure this is coming from nginx.

I don't really understand where this is coming from. I've also tried removing the try/catch to see whether anything blows up in the logs, but still no joy.

Here are my ingress logs, as requested in the comments:

2022/03/03 18:45:21 [error] 847#847: *4524 upstream prematurely closed connection while reading response header from upstream, client: 10.42.1.1, server: _, request: "GET /service-invoice-read/service-invoice/6220d042a95986f58c46356f HTTP/1.1", upstream: "http://10.42.1.100:80/service-invoice/6220d042a95986f58c46356f", host: "localhost:30000"
2022/03/03 18:45:21 [error] 847#847: *4524 connect() failed (111: Connection refused) while connecting to upstream, client: 10.42.1.1, server: _, request: "GET /service-invoice-read/service-invoice/6220d042a95986f58c46356f HTTP/1.1", upstream: "http://10.42.1.100:80/service-invoice/6220d042a95986f58c46356f", host: "localhost:30000"
2022/03/03 18:45:21 [error] 847#847: *4524 connect() failed (111: Connection refused) while connecting to upstream, client: 10.42.1.1, server: _, request: "GET /service-invoice-read/service-invoice/6220d042a95986f58c46356f HTTP/1.1", upstream: "http://10.42.1.100:80/service-invoice/6220d042a95986f58c46356f", host: "localhost:30000"
10.42.1.1 - - [03/Mar/2022:18:45:21 +0000] "GET /service-invoice-read/service-invoice/6220d042a95986f58c46356f HTTP/1.1" 502 150 "-" "curl/7.68.0" 140 0.006 [ctx-service-invoice-service-invoice-read-80] [] 10.42.1.100:80, 10.42.1.100:80, 10.42.1.100:80 0, 0, 0 0.004, 0.004, 0.000 502, 502, 502 b78e6879fabe2d5947525a2b694b4b9f
W0303 18:45:21.529749       7 controller.go:1076] Service "ctx-service-invoice/service-invoice-read" does not have any active Endpoint.

Does anyone know what I'm doing wrong here?

1 Answer

The problem wasn't what it seemed. The ingress configuration was working fine. The real problem was an error in my code that was being suppressed, unlogged, by a global exception handler. For some reason this produced a 502; I still don't fully understand why it surfaced as that particular response, but I'm not especially interested in chasing that down.

The aim of the global exception handler was to keep the service running when it would otherwise die. Since a service dying in k8s is perfectly acceptable, I've removed the handler and let the pod die, which gives me a lot more information about what is going on.
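For illustration, the kind of handler I mean is sketched below (not my exact code). A handler like the commented-out one keeps the process alive but hides the error entirely; if you want to keep a process-level handler at all, logging and then exiting gives you the same crash visibility as removing it.

// Roughly what a suppressing handler looks like (illustrative only):
// process.on("uncaughtException", () => {
//   // swallow the error so the service keeps running; nothing is logged
// });

// A less destructive alternative: log the error, then let the pod die so k8s restarts it.
process.on("uncaughtException", (err) => {
  console.error("uncaught exception:", err);
  process.exit(1);
});

process.on("unhandledRejection", (reason) => {
  console.error("unhandled rejection:", reason);
  process.exit(1);
});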
