We have an application that contains, among other things, a sitemap. The sitemap has 10k entries and is about 1.5 MB in size. The application runs on Google Cloud Run in production and in Docker locally.
Whenever we request the sitemap locally, all 10k entries download fine with no issues. On Cloud Run, however, it's sometimes fine, but most of the time only a partial response arrives (not the same size every time: sometimes about 200 KB, sometimes about 400 KB, sometimes about 1,000 KB) and then the transfer stops. The HTTP connection stays open, though, until one minute has passed since the start of the request, at which point it times out. Neither PHP-FPM nor NGINX gives any clear indication of what might be going on.
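For reference, this is roughly how we observe the truncation (the URL is a placeholder, not our real service):

```sh
# Fetch the sitemap a few times and report how many bytes actually arrived.
# Locally this prints ~1.5 MB every time; on Cloud Run the size varies
# (a few hundred KB) and the request stalls until the one-minute timeout.
for i in 1 2 3 4 5; do
  curl -sS -o /dev/null -w '%{size_download} bytes, HTTP %{http_code}\n' \
    'https://example-service.a.run.app/sitemap.xml'
done
```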
As a workaround we're now writing the file to disk and using X-Accel-Redirect to tell NGINX to serve the file directly. This works without fail (and is fast: about 400 ms).
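In case it helps anyone, this is roughly what the workaround looks like; the paths, the internal location name, and `generateSitemapXml()` are illustrative, not our exact code:

```php
<?php
// Generate the sitemap, write it somewhere NGINX can reach,
// then hand the actual transfer off to NGINX via X-Accel-Redirect.
$xml = generateSitemapXml(); // placeholder for our existing generator
file_put_contents('/var/www/sitemaps/sitemap.xml', $xml);

header('Content-Type: application/xml');
// The /protected-sitemaps/ prefix is internal-only (see NGINX config below).
header('X-Accel-Redirect: /protected-sitemaps/sitemap.xml');
exit;
```

```nginx
# Internal location that the X-Accel-Redirect header points at.
location /protected-sitemaps/ {
    internal;                  # not reachable directly by clients
    alias /var/www/sitemaps/;  # the directory PHP writes to
}
```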
So, even though I'm glad it works now, I'm left with two questions:
- Why wouldn't this work from PHP directly?
- Is there some cut-off point I need to know about after which this behaviour starts?
If it matters, we have fastcgi_buffering off in NGINX because of an earlier issue (see "Google Cloud Run website timeouts when content length is between 4013-8092 characters. What is going on?"). Other than that it's a typical NGINX setup. gzip is enabled, but disabling it did not help.
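For completeness, the relevant bits of the config look roughly like this (simplified; the FPM socket path is illustrative):

```nginx
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_pass unix:/run/php/php-fpm.sock;
    fastcgi_buffering off;  # added to work around the earlier Cloud Run issue
}
gzip on;                    # disabling this made no difference
```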