I created a modified LeNet model using TensorFlow that looks like this:
import tensorflow as tf
from tensorflow.keras import layers, models

img_height = img_width = 64
BS = 32

model = models.Sequential()
# Grayscale 64x64 input with a fixed batch size
model.add(layers.InputLayer((img_height, img_width, 1), batch_size=BS))
model.add(layers.Conv2D(filters=32, kernel_size=(3, 3), strides=(1, 1), activation='relu', padding='valid'))
model.add(layers.Conv2D(filters=64, kernel_size=(3, 3), strides=(1, 1), activation='relu', padding='valid'))
model.add(layers.MaxPooling2D(pool_size=(2, 2), strides=(2, 2), padding='valid'))
model.add(layers.Dropout(0.25))
model.add(layers.Conv2D(filters=128, kernel_size=(1, 1), strides=(1, 1), activation='relu', padding='valid'))
model.add(layers.Dropout(0.5))
# Two output channels, reduced to class scores by global average pooling
model.add(layers.Conv2D(filters=2, kernel_size=(1, 1), strides=(1, 1), activation='relu', padding='valid'))
model.add(layers.GlobalAveragePooling2D())
model.add(layers.Activation('softmax'))
model.summary()
When I finish training, I save the model using tf.keras.models.save_model:
num = time.time()
tf.keras.models.save_model(model,'./saved_models/' + str(num) + '/')
Then I convert this model to ONNX format using the tf2onnx module:
! python -m tf2onnx.convert --saved-model saved_models/1645088924.84102/ --output 1645088924.84102.onnx
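For reference, the same conversion can also be launched from plain Python instead of a notebook `!` cell. This is just a sketch of building the equivalent command; the actual subprocess call is left commented out so it can be run only when tf2onnx is installed:

```python
import subprocess
import sys

saved_model_dir = "saved_models/1645088924.84102/"  # directory written by save_model
output_onnx = "1645088924.84102.onnx"

# Same invocation as the notebook cell, using the current interpreter
cmd = [sys.executable, "-m", "tf2onnx.convert",
       "--saved-model", saved_model_dir,
       "--output", output_onnx]
print(" ".join(cmd))
# subprocess.run(cmd, check=True)  # uncomment to actually run the conversion
```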
I want a method that can load the same model back into TensorFlow 2.x. I tried to use the onnx_tf module to transform the ONNX model into a TensorFlow .pb model:
import onnx
from onnx_tf.backend import prepare

onnx_model = onnx.load("1645088924.84102.onnx")  # load the ONNX model
tf_rep = prepare(onnx_model)                     # prepare the TensorFlow representation
tf_rep.export_graph("1645088924.84102.pb")       # write the graph to disk
But this method generates only a .pb file, while the load_model method in TensorFlow 2.x expects a SavedModel directory that contains two additional folders, "variables" and "assets", alongside the .pb file. Either a way to make the bare .pb file load as if those folders were present, or a method that generates a complete SavedModel from ONNX, would be appreciated.
I'm using a JupyterHub server, and everything runs inside an Anaconda environment.