Here are the request headers:

GET /url/ HTTP/1.1
Host: example.com
User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; rv:24.0) Gecko/20100101 Firefox/24.0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Accept-Language: ru-RU,ru;q=0.8,en-US;q=0.5,en;q=0.3
Accept-Encoding: gzip, deflate
DNT: 1
Cookie: data=ABC;
Connection: keep-alive
Cache-Control: max-age=0

And here is the response from the server:

HTTP/1.1 200 OK
Date: Fri, 11 Oct 2013 23:28:32 GMT
Server: Apache/2.2.14 (Ubuntu)
X-Powered-By: PHP/5.3.2-1ubuntu4.18
Expires: Wed, 16 Oct 2013 03:28:32 GMT
Cache-Control: max-age=360000
Pragma: cache
Content-Language: en
Vary: Accept-Encoding
Content-Encoding: gzip
Content-Length: 844
Keep-Alive: timeout=15, max=92
Connection: Keep-Alive

No matter how many times or in what way I refresh the page, Firefox always makes a request to the server (which takes 200 ms+ since it's over HTTPS). Am I overlooking something? Should the response headers contain additional parameters to prevent Firefox from making a server request?
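
For reference, the caching headers above would typically be emitted on the PHP side with something like the following (a sketch, not the actual server code; the 360000-second lifetime matches the Cache-Control value in the response):

<?php
// Sketch: emit the caching headers shown in the response above.
$lifetime = 360000; // seconds, matches Cache-Control: max-age=360000
header('Cache-Control: max-age=' . $lifetime);
header('Expires: ' . gmdate('D, d M Y H:i:s', time() + $lifetime) . ' GMT');
header('Pragma: cache');
header('Content-Language: en');
?>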

Xeos

1 Answer


I normally do something like example.com/url/?nocache=<?php echo rand(); ?>, which appends a random number to the URL. That makes it a different URL on each request, which prevents caching.
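
For example (a minimal sketch using the OP's placeholder URL; rand() just supplies a throwaway query-string value):

<?php
// Append a random query-string parameter so every request uses a unique URL,
// which prevents the browser (and intermediate caches) from reusing a cached copy.
$url = 'https://example.com/url/?nocache=' . rand();
?>
<a href="<?php echo htmlspecialchars($url); ?>">Always-fetched link</a>

Note that this deliberately defeats caching, so every visit costs a full round trip to the server.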

Ally
    The OP does not want to _force_ a request and thereby bypass caching; he wants the exact opposite. – CBroe Oct 12 '13 at 01:04