
Question: I have created and trained a Keras model in tf 2.3.0, and I need to load this model in tf 1.12.0 so it can be used with a library that requires an older version of tf. Is there any way to convert the model from the format of the new version of tf to an older one so I can load it with tf 1.12.0?

What I have tried so far: A similar discussion showed how to convert models from tf 1.15 - 2.1 to tf 1.10, but when I tried this solution I got the error "Unknown layer: Functional". Link: Loading the saved models from tf.keras in different versions

I tried to fix this by using the following line suggested by another question:

new_model = tf.keras.models.model_from_json(json_config, custom_objects={'Functional': tf.keras.models.Model})

Link: ValueError: Unknown layer: Functional

However, if I use this I get the error ('Unrecognized keyword arguments:', dict_keys(['ragged'])), which is the same error discussed in the first discussion I linked above.

Another method I tried was using the ONNX libraries to convert the Keras model to an ONNX model and then back to a Keras model of a different version. However, I soon realized that the keras2onnx library requires tf 2.x.

Links: https://github.com/onnx/tensorflow-onnx and https://github.com/gmalivenko/onnx2keras

Any suggestions about how to get around this without having to retrain my models in an older version of TensorFlow would be greatly appreciated! Thanks

Here is the simple code that I tried to implement to load my model:

Save in tf 2.3.0

import tensorflow as tf

# Load the trained model, then export the weights and the architecture separately
CNN_model = tf.keras.models.load_model('Real_Image_XAI_Models/Test_10_DC_R_Image.h5')
CNN_model.save_weights("Real_Image_XAI_Models/weights_only.h5")

json_config = CNN_model.to_json()
with open('Real_Image_XAI_Models/model_config.json', 'w') as json_file:
    json_file.write(json_config)

Load in tf 1.12.0

with open('Real_Image_XAI_Models/model_config.json') as json_file:
    json_config = json_file.read()

new_model = tf.keras.models.model_from_json(json_config)

# or use the following line to account for the Functional class
# new_model = tf.keras.models.model_from_json(json_config, custom_objects={'Functional': tf.keras.models.Model})

new_model.load_weights('Real_Image_XAI_Models/weights_only.h5')

1 Answer


There are breaking changes in the model config from tf-1.12.0 to tf-2.3.0, including, but not limited to, the following:

  1. The root class Model is now Functional
  2. Support for ragged tensors was added in tf-1.15

You can try to edit the model config json file once saved from tf-2.3.0 to reverse the effects of these changes as follows:

  1. Replace the root class definition "class_name": "Functional" with "class_name": "Model". This will reverse the effect of change #1 above.
  2. Delete all occurrences of "ragged": false, (and of "ragged": true, if present). This will reverse the effect of change #2 above.

Note the trailing comma and space along with the "ragged" fields above.

You may try to find a way to make these changes programmatically in the json dictionary or at the model load time, but I find it easier to make these one-time changes to the json file itself.
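For the programmatic route, here is a minimal sketch, assuming the config was written to model_config.json as in the question. The output filename model_config_tf1.json and the strip_keys helper are illustrative choices, not part of the original answer; the "groups" key comes from the comment thread below.

import json

# Read the config exported from tf-2.3.0
with open('Real_Image_XAI_Models/model_config.json') as f:
    config = json.load(f)

# 1. Undo the root class rename: "Functional" -> "Model"
if config.get('class_name') == 'Functional':
    config['class_name'] = 'Model'

# 2. Recursively drop config keys that tf-1.12.0 does not understand
#    ("ragged"; the comments below suggest "groups" may also be needed)
def strip_keys(obj, keys=('ragged', 'groups')):
    if isinstance(obj, dict):
        return {k: strip_keys(v, keys) for k, v in obj.items() if k not in keys}
    if isinstance(obj, list):
        return [strip_keys(v, keys) for v in obj]
    return obj

config = strip_keys(config)

# Write out a config that tf-1.12.0 should be able to parse
with open('Real_Image_XAI_Models/model_config_tf1.json', 'w') as f:
    json.dump(config, f)

Whichever route you take, test the loaded model (e.g. compare predictions against the original) to confirm that dropping these fields did not change behaviour.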

  • Thank you for your help! I tried what you suggested and replaced 'Functional' with 'Model' and deleted the ragged arguments from the .json file. However, I got another error that said: "('Keyword argument not understood:', 'groups')". Could you please explain if I did something wrong or if there is another change I need to make? – Lad4life Nov 17 '21 at 15:04
  • Is it possible to share a part of the file before and after editing? I hope you deleted a comma (,) and a space along with the "ragged" fields, because I see a leading comma (,) in the error you shared. – jdsurya Nov 17 '21 at 15:22
  • Sure here are the lines I edited. Would anything else be helpful to identify the problem? Original: {"class_name": "Functional", "config": {"name": "functional_9", "layers": [{"class_name": "InputLayer", "config": {"batch_input_shape": [null, 224, 224, 3], "dtype": "float32", "sparse": false, "ragged": false, "name": "input_5"}, New version: {"class_name": "Model", "config": {"name": "functional_9", "layers": [{"class_name": "InputLayer", "config": {"batch_input_shape": [null, 224, 224, 3], "dtype": "float32", "sparse": false, "name": "input_5"}, – Lad4life Nov 17 '21 at 16:09
  • Looks good to me. I was able to load mine with just these two changes; however, it may vary with the model configuration. If you would prefer to add the json file to the question, I can try to have a look – jdsurya Nov 17 '21 at 16:17
  • Sure I added a google link to the question to download the original .json file. I am pretty new to stack overflow and I couldn't find a better way to attach it. If you need me to send it in a different way let me know! – Lad4life Nov 17 '21 at 16:36
  • Ok, so I had a look and tried deleting all occurrences of ["groups": 1, ] (without the brackets), and it loaded in tf-1.12. Similar to "ragged", it looks like new support that tf-1.12.0 may be able to manage without, but if you go for it then you should test that everything is working fine after loading.. it is not the cleanest solution of all :-) – jdsurya Nov 17 '21 at 16:56
  • So I tried this solution and it loaded the .json file; however, I got another error when I tried to apply the weights with an .h5 file. This is the error: original_keras_version = f.attrs['keras_version'].decode('utf8') AttributeError: 'str' object has no attribute 'decode'. Also, it is recommended to avoid extended discussions in the comments. Would you be willing to give me an email to contact you with? Or is there another way we could continue to communicate if you are willing to help? – Lad4life Nov 17 '21 at 18:04
  • It seems you need to downgrade your h5py. This should resolve your issue - https://stackoverflow.com/questions/53740577/does-any-one-got-attributeerror-str-object-has-no-attribute-decode-whi. Also, added email to my profile :-) – jdsurya Nov 18 '21 at 07:38
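For reference, the downgrade suggested in the last comment usually amounts to pinning h5py below 3.0 (the exact version constraint is an assumption based on the linked thread, not stated in the comment itself):

pip install "h5py<3.0"

After reinstalling, re-run new_model.load_weights('Real_Image_XAI_Models/weights_only.h5').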