I need to train a model with a custom loss function that should also update an external object (`feeder`) right after each prediction, like this:

from tensorflow.keras import backend as K

def loss_fct(y_true, y_pred):
    global feeder

    # Change values of feeder given y_pred
    for value in y_pred:
        feeder.do_something(value)

    return K.mean(y_true - y_pred, axis=-1)

However, this doesn't work, because TensorFlow cannot iterate over tensors inside an AutoGraph-compiled function:

OperatorNotAllowedInGraphError: iterating over `tf.Tensor` is not allowed: AutoGraph did convert this function. This might indicate you are trying to use an unsupported feature.

My model looks like this:

from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import Input, Dense

model = Sequential()
model.add(Input(shape=(DIM,)))
model.add(Dense(DIM, activation=None))
model.add(Dense(16, activation=None))
model.add(Dense(4, activation="softmax"))
model.compile(optimizer="adam", loss=loss_fct)
model.summary()

And it is trained like this:

model.fit(
    x=feeder.feed,
    epochs=18,
    verbose=1,
    callbacks=None,
)

Here, `feeder.feed` is a generator yielding two NumPy arrays (inputs and targets).
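
For context, a minimal sketch of what such a feeder could look like; the class layout, batch size, and `do_something` body are assumptions for illustration, not taken from the original code:

import numpy as np

DIM = 8  # illustrative input dimension

class Feeder:
    def __init__(self, dim=DIM, batch_size=32):
        # Expose a generator object so it can be passed to model.fit(x=feeder.feed)
        self.feed = self._make_feed(dim, batch_size)

    def _make_feed(self, dim, batch_size):
        # Yield (inputs, targets) batches indefinitely
        while True:
            x = np.random.rand(batch_size, dim).astype("float32")
            y = np.random.rand(batch_size, 4).astype("float32")
            yield x, y

    def do_something(self, value):
        # Placeholder for whatever external update happens per prediction
        pass

feeder = Feeder()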


  • First, check which values your generator yields. `y_pred` seems to be a 4-dimensional tensor, so what do you expect iterating over it to do? You can index into it instead, e.g. `y_true[0]`, using the tensor's shape – Eypros Oct 25 '20 at 19:59
  • Yes, you are right. `y_pred` can be, for example, `[1, 0, 1, 0]`. However, I cannot do `y_true[0]` while the program is running inside AutoGraph – St Ax Oct 26 '20 at 00:08

1 Answer


After a lot of research, I came across this answer. Nothing is wrong with the approach itself; rather, in TensorFlow >= 2.2.0, model.fit compiles the loss function into a graph by default (via AutoGraph), which forbids Python-side iteration over tensors.

Finally, to solve this problem, use model.compile(..., run_eagerly=True); iterating over and accessing tensor values during training then becomes possible.
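
With the model from the question unchanged, the only modification is in compile, and the loss can then use eager-only operations. A minimal sketch; the `.numpy()` call is an assumption about what `feeder.do_something` expects:

from tensorflow.keras import backend as K

def loss_fct(y_true, y_pred):
    global feeder
    # With run_eagerly=True, y_pred is an EagerTensor, so Python iteration works;
    # iterating walks the batch dimension, one prediction vector per step
    for value in y_pred:
        feeder.do_something(value.numpy())
    return K.mean(y_true - y_pred, axis=-1)

model.compile(optimizer="adam", loss=loss_fct, run_eagerly=True)

Note that run_eagerly=True disables graph compilation for the training step, so training will be noticeably slower than in graph mode.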
