3

I need to create a loss function for Keras that works with only binary values. I wanted to transform all the values greater than 0.5 to 1.0, so I did this:

from keras import backend as K

def MyLoss(y_true, y_pred):
    y_true_f = K.flatten(y_true)
    y_pred_f = K.flatten(K.cast(K.greater(y_pred, 0.5), 'float32'))
    #y_pred_f = K.flatten(K.cast(y_pred > 0.5, 'float32'))
    #y_pred_f = K.flatten(y_pred > 0.5)
    return K.sum(y_true_f * y_pred_f)

The code compiles, but training later fails with the following error:

ValueError: None values not supported.

I also tried the commented lines and got the same error. If I don't modify the values and simply use y_pred_f = K.flatten(y_pred), it runs.

What am I doing wrong?

How can I binarize a tensor?
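A quick NumPy sketch (illustrative only, outside Keras) of why this fails: a hard threshold is flat everywhere except at the jump, so its derivative is zero almost everywhere and backpropagation has nothing to propagate, which is what surfaces as the None-gradient error:

```python
import numpy as np

def binarize(x):
    # Hard threshold at 0.5, like K.cast(K.greater(y_pred, 0.5), 'float32')
    return (np.asarray(x) > 0.5).astype(np.float32)

# Central finite difference at a point away from the jump at 0.5
x, h = 0.3, 1e-4
grad = (binarize(x + h) - binarize(x - h)) / (2 * h)
print(grad)  # 0.0 -- no gradient signal ever reaches the weights
```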


1 Answer

3

The solution for binarizing my logistic Dense layer was to use a custom lambda function as the activation. (I am working on a semantic hashing autoencoder, after Hinton.) Keras throws a warning, but it proved to work anyway. Earlier attempts threw an error because the round function cannot be differentiated when backpropagation computes the gradient (the same old ValueError: None values not supported). Somehow doing it in the activation, instead of as a separate layer, was the key here.

encoder_outputs = Dense(
    units=latent_vector_len,
    activation=k.layers.Lambda(
        lambda z: k.backend.round(k.layers.activations.sigmoid(x=z))),
    kernel_initializer="lecun_normal",
)(x)

The real-valued outputs, normally in the range 0 to 1, were transformed into exactly 0 and 1, as shown:

# Look it works!

y = encoder_model.predict(x=x_in)
print(y)
>>> [[1. 0. 0. 1. 0. 1. 0. 0.]]
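The forward computation wrapped by that Lambda activation is just round(sigmoid(z)). A minimal NumPy sketch (with made-up logit values, outside Keras) showing that it maps any real-valued logit to exactly 0 or 1:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

# Arbitrary real-valued logits (hypothetical example values)
z = np.array([2.3, -1.7, -0.2, 4.1])
binary = np.round(sigmoid(z))
print(binary)  # [1. 0. 0. 1.]
```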

In contrast, doing the rounding in a separate Lambda layer did not work:

decoder_outputs_bin = k.layers.Lambda(lambda z: k.backend.round(z))(decoder_outputs) # ERR at training time ValueError: None values not supported.
Geoffrey Anderson
  • Thank you for your answer, I'll test it. But since I've moved everything to PyTorch, it's much more practical to design layers. – FiReTiTi Jul 25 '18 at 20:12
  • You can write a binarize layer with the following operations: (x-0.5)/abs(x-0.5) this will be a differentiable layer since all the components are differentiable. – Andrew Louw Aug 06 '20 at 12:40