So I'm using a very simple CDN service: you point it at your website, and if you request a resource through their hostname, they cache it for you after the first call.
I use this for all my static content, like JavaScript files and images.
This all works perfectly, and I like that it has very little maintenance or setup cost.
The problem starts when rolling out new versions of JavaScript files. A JavaScript file automatically gets a new hash when its contents change.
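To make that concrete, the scheme looks roughly like this (just a sketch; the hostname and paths are made up, and in my setup the hash is generated automatically):

```typescript
import { createHash } from "crypto";
import { readFileSync } from "fs";

// Illustrative only: derive a short content hash and append it to the asset
// URL, so the URL changes whenever the file contents change.
function hashedAssetUrl(cdnHost: string, publicPath: string, localPath: string): string {
  const hash = createHash("sha256")
    .update(readFileSync(localPath))
    .digest("hex")
    .slice(0, 8);
  return `https://${cdnHost}${publicPath}?v=${hash}`;
}

// e.g. "https://cdn.example.com/js/app.js?v=1a2b3c4d"
console.log(hashedAssetUrl("cdn.example.com", "/js/app.js", "./dist/app.js"));
```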
Because the rollout over multiple instances is not simultaneous, a problem occurs. I tried to model it in this diagram:
In words:
- A request hits a server that already has the new version
- The page requests the JS file with the new version hash
- The CDN correctly detects that the file is not cached
- The CDN requests the original file, with the new hash, from the load balancer
- The load balancer routes the CDN's request to a random server, accidentally serving it from a server that still has the old version
- The CDN caches the old version under the new hash
- Everyone gets served the old version from the CDN
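A minimal simulation of that sequence (all names made up), just to show why the stale body ends up cached under the new hash:

```typescript
// Two origin instances behind a round-robin load balancer, and a CDN that
// caches responses by full URL.
type Instance = { version: string; body: string };

const instances: Instance[] = [
  { version: "new", body: "new app.js" }, // already upgraded
  { version: "old", body: "old app.js" }, // not yet upgraded
];

const cdnCache = new Map<string, string>();

// The HTML rendered by an upgraded instance references the new hash...
const url = "/js/app.js?v=NEWHASH";

// ...but the CDN's origin fetch can land on any instance behind the balancer.
if (!cdnCache.has(url)) {
  const origin = instances[Math.floor(Math.random() * instances.length)];
  cdnCache.set(url, origin.body); // possibly the old body, stored under the new hash
}

console.log(cdnCache.get(url)); // "old app.js" roughly half the time, for everyone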
There are some ways I know to fix this, e.g. manually uploading the files to separate storage with the hash baked in, etc. But that needs extra code and adds more "moving parts", which makes maintenance more complicated.
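For reference, that workaround would look roughly like this as a deploy step (a sketch with made-up paths; the actual sync to external storage is left out):

```typescript
import { createHash } from "crypto";
import { copyFileSync, mkdirSync, readFileSync, readdirSync } from "fs";
import { join, parse } from "path";

// Hypothetical deploy step: copy each built asset to a staging directory with
// the content hash baked into the filename (e.g. app.1a2b3c4d.js). A separate
// job would then push that directory to external storage before the rollout.
function stageHashedAssets(srcDir: string, outDir: string): void {
  mkdirSync(outDir, { recursive: true });
  for (const file of readdirSync(srcDir)) {
    const { name, ext } = parse(file);
    const hash = createHash("sha256")
      .update(readFileSync(join(srcDir, file)))
      .digest("hex")
      .slice(0, 8);
    copyFileSync(join(srcDir, file), join(outDir, `${name}.${hash}${ext}`));
  }
}

stageHashedAssets("./dist/js", "./staging/js");
```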
I would prefer to have something that works as seamlessly as the normal CDN behavior. I guess this is a common problem for sites that are running on multiple instances, but I can't find a lot of information about this.
What is the common way to solve this?
Edit
I think another solution would be to somehow force the CDN to fetch the .js file from the same instance that served the original HTML file, but how?
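Just to illustrate what I mean by that (names made up, and I don't know if it's a good idea): the instance that renders the HTML could embed its own id in the asset URL, and the load balancer would then need some rule that routes such requests back to that instance:

```typescript
// Hypothetical: build the script URL so it carries the identity of the
// instance that rendered the page, e.g. via an environment variable.
const instanceId = process.env.INSTANCE_ID ?? "i-0"; // made-up identity
const assetHash = "NEWHASH";                          // made-up content hash

// e.g. "https://cdn.example.com/instance/i-0/js/app.js?v=NEWHASH"
const scriptUrl = `https://cdn.example.com/instance/${instanceId}/js/app.js?v=${assetHash}`;
console.log(scriptUrl);
```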