
Multiple JS files (hosted by me) are embedded as script tags in various websites' code, which I don't have access to. So I can't invalidate the cache by maintaining a fingerprint in the URL or through URL parameters.

The files are quite large, so caching is mandatory. What other ways are there to invalidate the browser cache apart from modifying the src?

The solution I'm considering now is to maintain a wrapper script, which is not cached, that calls the actual scripts. That way, I could maintain some sort of versioning in the wrapper script. This doesn't seem elegant; are there other ways?

If these don't work out, I'll consider reducing the Expires header.

I've already done other optimisations such as minifying and using a CDN.

Gautham Kumaran

2 Answers


Your solution to use a wrapper script is probably the optimal one.

One main script tag in the document calls your wrapper script, which attaches the appropriate versioned scripts on load.
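A minimal sketch of that wrapper approach, assuming hypothetical file names, URLs, and version string. The wrapper itself is served with Cache-Control: no-cache so browsers revalidate it on every load, while the heavy scripts it attaches keep their long cache lifetimes:

```javascript
// wrapper.js — served uncached; the scripts it attaches are cached long-term.
// SCRIPT_VERSION is a hypothetical release marker: bump it on each deploy.
var SCRIPT_VERSION = '20180903a';

// Build a versioned URL so a version bump bypasses the old cached copy.
function versionedUrl(base) {
  return base + '?v=' + SCRIPT_VERSION;
}

// In the browser, attach the real (long-cached) scripts on load.
if (typeof document !== 'undefined') {
  ['https://cdn.example.com/core.js', 'https://cdn.example.com/widgets.js']
    .forEach(function (url) {
      var s = document.createElement('script');
      s.src = versionedUrl(url);
      s.async = true;
      document.head.appendChild(s);
    });
}
```

Partner sites keep their single unchanged script tag pointing at wrapper.js; only the version string inside the wrapper changes between releases.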

You could use something like RequireJS to manage this, which also has some cache busting built in, according to this: Prevent RequireJS from Caching Required Scripts
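For reference, RequireJS's built-in mechanism is the urlArgs config option, which appends a query string to every module URL it loads. A sketch (the version string is hypothetical; prefer a release version over a timestamp in production so caching still works between deploys):

```javascript
// RequireJS config fragment — urlArgs is appended to every loaded module URL,
// e.g. scripts/app.js becomes scripts/app.js?bust=v20180903a
require.config({
  urlArgs: 'bust=v20180903a' // hypothetical release version; bump per deploy
});
```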

Pandelis

that calls the actual scripts

Why not just return a 302 response from the entry point URL?

(a 302 response can itself be cached - in your scenario this should be for a short period)

symcbean
  • I'm not sure how a 302 redirect will help in this case; can you elaborate? – Gautham Kumaran Sep 03 '18 at 12:42
  • Your partners point to http://yoursite.com/content; that URL returns a cacheable (and proxy-cacheable) 302 response which, say, expires on the following Thursday (i.e. a 7-day TTL). The redirect location is a versioned URL, http://yoursite.com/content.20180903a, which is cacheable indefinitely. This has a higher cost on the first hit each week (1 additional RTT) but means you only pay the bandwidth cost when the version changes. – symcbean Sep 03 '18 at 13:33