A follow-up to this question:
How to save a TensorFlow checkpoint file from Google Colaboratory when using TPU mode?
There, the official way to save a checkpoint when using a TensorFlow TPU is to use Google Cloud Storage (GCS).
I am wondering if there is a workaround to this for those who do not wish to use GCS. Perhaps, for each variable, call .eval() to get its value and save it, and then on restore assign the saved value back to each variable as its initial value (a rough sketch is below).
A major issue I foresee, though, is saving and loading the parameters of the optimizers.
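Something like this is what I have in mind; it is only a rough, untested sketch for TF 1.x graph mode, where `sess`, the file path, and the helper names are placeholders of my own:

```python
import numpy as np
import tensorflow as tf

def save_variables_locally(sess, path='weights.npz'):
    # Pull every global variable (which should include optimizer slot
    # variables such as Adam's moments) off the TPU into host memory,
    # then save them as a local NumPy archive instead of a GCS checkpoint.
    variables = tf.global_variables()
    values = sess.run(variables)
    np.savez(path, **{v.op.name: val for v, val in zip(variables, values)})

def restore_variables_locally(sess, path='weights.npz'):
    # Load the saved values and push them back into the variables,
    # effectively re-initializing each one with its saved value.
    data = np.load(path)
    for v in tf.global_variables():
        if v.op.name in data:
            v.load(data[v.op.name], sess)
```

If something like this works, `tf.global_variables()` should also pick up the optimizer's slot variables, which might cover the optimizer issue above, but I have not confirmed that on a TPU runtime.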
For Keras, the weights do seem to be saved from the TPU back to the local runtime:
INFO:tensorflow:Copying TPU weights to the CPU
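For reference, the Keras route I am referring to looks roughly like this in TF 1.x; the contrib API names and the toy model below are from memory and may differ between versions:

```python
import os
import numpy as np
import tensorflow as tf

# A tiny stand-in model; the real model would go here.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(8, activation='relu', input_shape=(4,)),
    tf.keras.layers.Dense(1),
])
model.compile(optimizer=tf.train.AdamOptimizer(), loss='mse')

# Connect to the Colab TPU and convert the model.
resolver = tf.contrib.cluster_resolver.TPUClusterResolver(
    tpu='grpc://' + os.environ['COLAB_TPU_ADDR'])
strategy = tf.contrib.tpu.TPUDistributionStrategy(resolver)
tpu_model = tf.contrib.tpu.keras_to_tpu_model(model, strategy=strategy)

tpu_model.fit(np.random.rand(64, 4), np.random.rand(64, 1),
              epochs=1, batch_size=8)

# This is the step that logs "Copying TPU weights to the CPU"
# and writes the weights to the local Colab filesystem.
tpu_model.save_weights('./tpu_weights.h5')
```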
So I imagine there is a general workaround too, one that does not go through Keras.