I'm experimenting with caching between my .NET client and server. I'm seeing a seemingly random number of hits to an endpoint before WinInet decides to cache the result.
The .NET client makes requests using HttpWebRequest:
// Create the request and ask WinInet for a cached copy when one is available.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(uiTextBoxUrl.Text);
var policy = new RequestCachePolicy(RequestCacheLevel.CacheIfAvailable);
webRequest.CachePolicy = policy;
// Send the request; WinInet may satisfy it from the cache instead of the network.
WebResponse webResponse = webRequest.GetResponse();
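The same policy can also be applied process-wide rather than per request; a minimal sketch of that alternative, using the static HttpWebRequest.DefaultCachePolicy property:

using System.Net;
using System.Net.Cache;

// Make CacheIfAvailable the default for every HttpWebRequest created by this process,
// instead of assigning CachePolicy to each request individually.
HttpWebRequest.DefaultCachePolicy = new RequestCachePolicy(RequestCacheLevel.CacheIfAvailable);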
The server, implemented with ASP.NET Web API, sets these Cache-Control headers:
response.Headers.CacheControl = new CacheControlHeaderValue
{
    MaxAge = TimeSpan.FromSeconds(3600),
    MustRevalidate = true,
    Public = true,
    Private = true
};
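For context, here's roughly how that snippet might sit inside a Web API action; the controller name, route, and response body are placeholders rather than my actual code, and the ETag is simply the value Fiddler reports below:

using System;
using System.Net;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Web.Http;

// Illustrative controller only; names and payload are placeholders.
public class CacheTestController : ApiController
{
    public HttpResponseMessage Get()
    {
        HttpResponseMessage response = Request.CreateResponse(HttpStatusCode.OK, "payload");

        response.Headers.CacheControl = new CacheControlHeaderValue
        {
            MaxAge = TimeSpan.FromSeconds(3600),
            MustRevalidate = true,
            Public = true,
            Private = true
        };

        // ETag as reported by Fiddler; in practice this would be generated per resource.
        response.Headers.ETag = new EntityTagHeaderValue("\"3488770a-8659-4fc0-b579-dcda9200a1c7\"");

        return response;
    }
}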
Using a test harness with a button that sends the request to the endpoint, I can see that, even though CacheIfAvailable is used, the response isn't cached immediately. From the debug output on my server, a seemingly random number of hits (or, more likely, some hit-count/elapsed-time heuristic) is needed before the response is eventually cached. If I click the test button rapidly, caching starts after about 10 hits; if I click it every 1 or 2 seconds, I've counted up to 25 hits before caching kicks in.
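A quick way to cross-check this from the client (rather than counting hits in the server debug output) would be to log WebResponse.IsFromCache after each request. A minimal sketch along those lines; the URL is a placeholder for my test endpoint:

using System;
using System.Net;
using System.Net.Cache;

class CacheProbe
{
    // Fires one request and reports whether WinInet served it from the cache.
    static void Probe(string url)
    {
        var request = (HttpWebRequest)WebRequest.Create(url);
        request.CachePolicy = new RequestCachePolicy(RequestCacheLevel.CacheIfAvailable);

        using (WebResponse response = request.GetResponse())
        {
            Console.WriteLine("{0}: IsFromCache = {1}",
                DateTime.Now.ToString("HH:mm:ss.fff"), response.IsFromCache);
        }
    }

    static void Main()
    {
        // Placeholder URL; repeat the request to see when caching kicks in.
        for (int i = 0; i < 30; i++)
        {
            Probe("http://localhost:12345/api/cachetest");
        }
    }
}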
This is what Fiddler's Caching inspector shows for the response:
HTTP/200 responses are cacheable by default, unless Expires, Pragma, or Cache-Control headers are present and forbid caching.
HTTP/1.1 Cache-Control Header is present: public, must-revalidate, max-age=3600, private
private: This response MUST NOT be cached by a shared cache.
public: This response MAY be cached by any cache.
max-age: This resource will expire in 1 hours. [3600 sec]
must-revalidate: After expiration, the server MUST be contacted to verify the freshness of this resource.
HTTP/1.1 ETAG Header is present: "3488770a-8659-4fc0-b579-dcda9200a1c7"
I've read that HttpWebRequest uses WinInet for caching, so I'm curious how WinInet determines when something should be cached, and more specifically: why doesn't it cache on the first hit?