When taking a page offline, the browser (glossing over the details) downloads the page, its manifest, and its resources; when it's done, it re-requests the manifest (Step 24 here), and if what it gets back doesn't match the first copy, caching fails.
According to the spec, the browser should then schedule another attempt after a "short delay" (Step 25). Sadly, as far as I can tell, neither Chrome nor Firefox does that last part; instead, they just fail to cache (or update) the page and don't retry it.
So in order for pages to be reliably cached/updated, we need to make sure the second copy of the manifest is a byte-for-byte copy of the first one from just a few moments before.
If the data for the page already has some kind of built-in version you can use in the manifest, great, but if not, it seems your options are:

1. Generate some kind of synthetic version and remember it on the server, so that you can return the same version for both requests (rough sketch after this list).

2. Allow the browser to cache the manifest for a short time (say, up to two minutes) in its normal HTTP cache; that way, the second request for it is satisfied from the browser's normal cache and is therefore always byte-for-byte identical (also sketched below). Of course, that means you can't update the page within those two minutes.
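For what it's worth, here's a rough sketch of option 1, assuming a Node.js server; the names (`contentVersion`, `buildManifest`, the `/app.appcache` URL, the listed resources) are all hypothetical. The idea is just that the synthetic version lives in a manifest comment line and only changes when the underlying data changes, so the initial request and the re-check a few moments later get identical bytes.

```typescript
import * as http from "http";

// Hypothetical synthetic version: bump this (e.g. to Date.now().toString())
// only when the page's data actually changes; between changes, every
// manifest request sees the same value, so both fetches match byte-for-byte.
let contentVersion = "2024-01-01T00:00:00Z";

function buildManifest(): string {
  return [
    "CACHE MANIFEST",
    `# version: ${contentVersion}`, // the version lives in a comment line
    "app.js",
    "app.css",
    "",
  ].join("\n");
}

http
  .createServer((req, res) => {
    if (req.url === "/app.appcache") {
      res.writeHead(200, {
        "Content-Type": "text/cache-manifest",
        "Cache-Control": "no-cache", // always revalidated, but bytes stay stable
      });
      res.end(buildManifest());
      return;
    }
    res.writeHead(404);
    res.end();
  })
  .listen(8080);
```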
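And a sketch of option 2 under the same assumptions (hypothetical `/app.appcache` URL and file path): serve the manifest with a short `max-age` so the browser's normal HTTP cache answers the re-check, at the cost of updates published inside that window not being picked up until it expires.

```typescript
import * as http from "http";
import { readFileSync } from "fs";

http
  .createServer((req, res) => {
    if (req.url === "/app.appcache") {
      res.writeHead(200, {
        "Content-Type": "text/cache-manifest",
        "Cache-Control": "max-age=120", // roughly two minutes in the normal cache
      });
      res.end(readFileSync("app.appcache")); // file path is an assumption
      return;
    }
    res.writeHead(404);
    res.end();
  })
  .listen(8080);
```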
I'm not seeing anything in the spec, but is there any way to avoid that second manifest load entirely, and so avoid either option? Some way to tell the browser that the point-in-time snapshot from the first manifest is fine and it doesn't need to re-check afterwards? (In our case, we know for sure the world won't change halfway through downloading the resources, so the reason for the second check doesn't apply to us.)