
The proxy_cache_lock logic means that when NGINX receives a couple of requests simultaneously, it sends only one of them to the upstream and the rest wait until the first one returns and is inserted into the cache (the wait time is as configured in proxy_cache_lock_timeout).

If the cache element has expired and NGINX receives a couple of requests simultaneously, all of them are proxied to the upstream.

Question: How can I configure NGINX to apply the same proxy_cache_lock logic when the cache element exists but has expired?

I checked proxy_cache_use_stale, but it is not what I'm looking for because it returns the expired cache entry while updating, and I need to wait until the answer returns from the upstream...

This is my current NGINX configuration file:

http {
    include         /etc/nginx/mime.types;
    default_type    application/octet-stream;
    access_log          /var/log/nginx/access.log  main;

    proxy_cache_path @MY_CACHE_PATH@; # this obviously has the path in my file 
    proxy_cache_use_stale updating;
    proxy_cache_lock on;
    proxy_cache_lock_timeout 15s;
    proxy_cache_valid 404   10s;
    proxy_cache_valid 502   0s;
    proxy_cache one;

    server {
            listen 80;
            proxy_read_timeout 15s;
            proxy_connect_timeout 5s;
            include locations.conf;
    }
}
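
For completeness, the placeholder above stands for a full proxy_cache_path directive; its general shape is roughly the following (the path, sizes, and times here are made up, only the keys_zone name must match the proxy_cache one; directive):

proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=one:10m   # illustrative values
                 max_size=1g inactive=60m use_temp_path=off;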

I managed to achieve this behavior by changing the NGINX source code, but I wonder whether it can be achieved through configuration alone.

dWinder
  • have you ever figured this out? – caiocpricci2 May 21 '20 at 19:35
  • No - the stale option fixes most of the cases, but for expired entries I still go to the upstream for all the requests – dWinder May 28 '20 at 08:55
  • I think this is not possible with the nginx cache. Maybe you can take a look at nuster's cache feature [wait on|off|TIME](https://github.com/jiangwenyuan/nuster/#wait-onofftime-cache-only) – Aleksandar May 28 '20 at 13:40
  • I'm pretty sure this is a bug, a lock should be an actual lock! This isn't even just a problem with multiple nginx workers; a single worker can still issue 2 requests when the cache has expired, despite there being a lock! – aliqandil Feb 28 '21 at 03:16
  • Would you be able to share the NGINX source changes you needed to make to allow for this behavior? – Francisco Garcia May 20 '21 at 21:44
  • Are you sure that the simultaneous requests to the upstream you are talking about are for the same key? The nginx documentation clearly says that `proxy_cache_lock on;` should lock if the key is the same, no matter what other conditions you have. – Nikolay Dimitrov Feb 24 '23 at 03:18
  • Yes - same cache key, as you can see at https://www.mail-archive.com/nginx@nginx.org/msg12044.html (from @pva's answer) – dWinder Apr 10 '23 at 10:40

1 Answer


This is expected behavior according to upstream (https://www.mail-archive.com/nginx@nginx.org/msg12044.html). As Maxim explained there, the documentation says:

When enabled, only one request at a time will be allowed to populate a new cache element identified according to the proxy_cache_key directive by passing a request to a proxied server.

And, as Maxim notes:

Note "a new cache element". It does not work while updating cache elements, this is not implemented.

That said, could you share the code changes that work for you?

pva
  • Thank you for finding that source - good to at least know it is a known issue. It was long ago so I'm not sure, but I think it was changing https://github.com/nginx/nginx/blob/2485681308bd8d3108da31546cb91bb97813a3fb/src/http/ngx_http_file_cache.c#L646 to NGX_AGAIN (but I don't really remember...) – dWinder Apr 10 '23 at 10:41