
I'm working with Keras, as bundled with Tensorflow 2.1. I am trying many different model architectures. I have learned that Keras Model objects are difficult, if not impossible, to serialize using Python's pickle. Using JSON, of course, is out of the question.

Writing a Model object to disk using pickle.dump() does not fail. However, when you reload it, the model weights are lost; I'm not sure why. To work around the problem, I've built a Python wrapper class for Keras Models which can reconstruct the model from data that can be serialized. I have to recompile the Model and set its weights once it has been reloaded from disk, but that's not a huge obstacle.
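For illustration, here is a minimal, TF-free sketch of that wrapper pattern. The class and attribute names are hypothetical; in my real wrapper, the dict entries come from model.get_config() and model.get_weights(), and __setstate__ is where Model.from_config(), recompilation, and set_weights() happen:

```python
import pickle

class PickleableWrapper:
    """Sketch of the wrapper pattern: pickle only plain, serializable
    state, and rebuild the heavy object when unpickling."""
    def __init__(self, config, weights):
        self.config = config    # stands in for model.get_config()
        self.weights = weights  # stands in for model.get_weights()

    def __getstate__(self):
        # Only plain Python data goes into the pickle stream.
        return {"config": self.config, "weights": self.weights}

    def __setstate__(self, state):
        # In the real wrapper, this is where you would call
        # Model.from_config(state["config"]), recompile, and
        # set_weights(state["weights"]).
        self.config = state["config"]
        self.weights = state["weights"]

original = PickleableWrapper({"units": 64}, [0.1, 0.2])
clone = pickle.loads(pickle.dumps(original))
```

The pickle stream never contains the un-serializable object itself, only the data needed to rebuild it.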

Now I want to do for Optimizers what I've done for my Models. I've already determined that JSON rejects Optimizers, and pickle accepts them. But I also know from working with Model objects that unpickling may not work.
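For what it's worth, one route I'm considering that avoids pickle entirely is the optimizer's own config round-trip. This is a sketch assuming TF 2.x's tf.keras.optimizers.serialize/deserialize; note that it captures only the hyperparameters, not the optimizer's internal state (slot variables such as Adam's moment estimates):

```python
import tensorflow as tf

# Serialize the optimizer to a plain dict (class name + hyperparameters)...
opt = tf.keras.optimizers.SGD(learning_rate=0.01, momentum=0.9, nesterov=True)
config = tf.keras.optimizers.serialize(opt)

# ...which can be stored as ordinary data and rebuilt later.
restored = tf.keras.optimizers.deserialize(config)
```

That would be enough for documenting hyperparameters; resuming training mid-run would still need the slot variables, which is what model.save handles.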

I have a possibly-related problem with Tensorflow Dataset objects. I can create a subclass of Dataset, but I can only add attributes and NEW methods. I was unable to override an existing Dataset method (I wanted to modify Dataset.padded_batch()).
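Since overriding Dataset methods didn't take, the workaround I'm leaning toward is composition rather than subclassing: wrap the dataset, intercept the one method I need to change, and forward everything else. A TF-free sketch (FakeDataset is a stub standing in for a real tf.data.Dataset, just to show the delegation):

```python
class PaddedBatchWrapper:
    """Wrap a dataset-like object, customizing padded_batch()
    while delegating every other attribute lookup."""
    def __init__(self, dataset):
        self._dataset = dataset

    def padded_batch(self, batch_size, **kwargs):
        # Custom logic would go here (e.g. adjusting padded_shapes),
        # then delegate to the wrapped object's own method.
        return self._dataset.padded_batch(batch_size, **kwargs)

    def __getattr__(self, name):
        # Fall through to the wrapped dataset for everything else.
        return getattr(self._dataset, name)

# Minimal stub in place of tf.data.Dataset, for demonstration only.
class FakeDataset:
    def padded_batch(self, batch_size, **kwargs):
        return f"batched:{batch_size}"
    def cardinality(self):
        return 10

wrapped = PaddedBatchWrapper(FakeDataset())
```

Because __getattr__ is only consulted for attributes the wrapper itself lacks, padded_batch() is intercepted while everything else passes through unchanged.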

Before I waste time attempting to apply Pythonic approaches to Keras (Tensorflow) objects: does anyone know which of these objects can be serialized and/or subclassed? Is there a guide somewhere? For now, I only really need to know about Optimizers, but in the future I suppose I might try to save or subclass another object.

Thanks for your advice.

John Ladasky
  • Does this answer your question? [Save and load model optimizer state](https://stackoverflow.com/questions/49503748/save-and-load-model-optimizer-state) – Toukenize Apr 20 '20 at 01:23
  • That's definitely related -- but a bit puzzling. I checked the version of Keras that is bundled with my Tensorflow, it's Keras v2.2.4-tf. According to the comment you posted, Keras 2.2.4 is supposed to allow Model pickling, but I did not succeed. Maybe the last time I tried this was before I upgraded Tensorflow? I will investigate further. I have workarounds, but they are kludgy and I would like to write idiomatic, streamlined code. – John Ladasky Apr 20 '20 at 02:14
  • From the Tensorflow Keras [documentation](https://www.tensorflow.org/tutorials/keras/save_and_load#save_the_entire_model), `model.save` saves the optimizer state as well. But it does state that `v1.x` optimizers can't be saved as they are not compatible, you might want to check if that describes your case. – Toukenize Apr 20 '20 at 05:42
  • Thanks for your continued replies. It looks as if `model.save` might finally do what I want. I am pretty sure that I haven't chosen `v1.x` optimizers, but I will check. I'm currently trying SGD with the Nesterov option, and Adam. I assume that Keras would have updated these standard and common optimizers to keep pace with other upgrades. The main issue is that I want to document the hyperparameters, but resuming training is something I may eventually explore. – John Ladasky Apr 20 '20 at 06:25
