Good morning, afternoon, and evening! I'm new to TensorFlow and ML, learning from the excellent book Hands-On Machine Learning (2nd edition).
I'm at the subclassing-API model section right now, but something odd happened.
Usually, when training a model, we see this:
tqdm prints one progress bar per epoch, showing progress nicely.
However, when I use the subclassed model, the tqdm output surges.
It seems like something in the class is interrupting tqdm, making it print as many bars as it can. This is my code for the model class:
class WideAndDeepModel(keras.models.Model):
    def __init__(self, units=30, activation="relu", **kwargs):
        super().__init__(**kwargs)
        self.hidden1 = keras.layers.Dense(units, activation=activation)
        self.hidden2 = keras.layers.Dense(units, activation=activation)
        self.main_output = keras.layers.Dense(1)
        self.aux_output = keras.layers.Dense(1)

    def call(self, inputs):
        input_A, input_B = inputs
        hidden1 = self.hidden1(input_B)
        hidden2 = self.hidden2(hidden1)
        concat = keras.layers.concatenate([input_A, hidden2])
        main_output = self.main_output(concat)
        aux_output = self.aux_output(hidden2)
        return main_output, aux_output

model = WideAndDeepModel(30, activation="relu")
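For context, this is roughly how I train the model. The mse losses, loss weights, optimizer, and random data below are just placeholders following the book's wide & deep example, not my exact training setup:

```python
import numpy as np
from tensorflow import keras

class WideAndDeepModel(keras.models.Model):
    # same class as above, condensed
    def __init__(self, units=30, activation="relu", **kwargs):
        super().__init__(**kwargs)
        self.hidden1 = keras.layers.Dense(units, activation=activation)
        self.hidden2 = keras.layers.Dense(units, activation=activation)
        self.main_output = keras.layers.Dense(1)
        self.aux_output = keras.layers.Dense(1)

    def call(self, inputs):
        input_A, input_B = inputs
        hidden2 = self.hidden2(self.hidden1(input_B))
        concat = keras.layers.concatenate([input_A, hidden2])
        return self.main_output(concat), self.aux_output(hidden2)

model = WideAndDeepModel(30, activation="relu")

# placeholder data: 5 wide features, 6 deep features, one regression target
X_A = np.random.rand(32, 5).astype("float32")
X_B = np.random.rand(32, 6).astype("float32")
y = np.random.rand(32, 1).astype("float32")

# one loss per output; the main output gets most of the weight
model.compile(loss=["mse", "mse"], loss_weights=[0.9, 0.1], optimizer="sgd")
model.fit([X_A, X_B], [y, y], epochs=2, verbose=0)

main_pred, aux_pred = model.predict([X_A, X_B], verbose=0)
```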
It produces the odd output shown in pic #2, while there is no problem at all if I build almost the same model with the functional API, like this:
input_A = keras.layers.Input(shape=[5], name="wide_input")
input_B = keras.layers.Input(shape=[6], name="deep_input")
hidden1 = keras.layers.Dense(30, activation="relu")(input_B)
hidden2 = keras.layers.Dense(30, activation="relu")(hidden1)
concat = keras.layers.concatenate([input_A, hidden2])
output = keras.layers.Dense(1, name="output")(concat)
aux_output = keras.layers.Dense(1, name="aux_output")(hidden2)
model = keras.models.Model(inputs=[input_A, input_B], outputs=[output])
Does anybody know why this happens? Is there a way to prevent it? I'll wait for an answer.
*It does not produce that output if I remove the aux output part. Still, I'd like to know why it happens...