
I'm implementing a fully convolutional neural network for image segmentation, using the U-Net defined here

https://github.com/zhixuhao

To give different weights to the pixels of different classes, I defined an extra Lambda layer, as suggested here

Keras, binary segmentation, add weight to loss function

The problem is that Keras raises this error when saving the model

.....
self.model.save(filepath, overwrite=True)
.....
TypeError: ('Not JSON Serializable:', b'\n\x15clip_by_value/Minimum\x12\x07Minimum\x1a\x12conv2d_23/Identity\x1a\x17clip_by_value/Minimum/y*\x07\n\x01T\x12\x020\x01')

My network is defined in an external function

def weighted_binary_loss(X):
    y_pred, y_true, weights = X
    loss = binary_crossentropy(y_true, y_pred)
    weights_mask = y_true*weights[0] + (1.-y_true)*weights[1]
    loss = multiply([loss, weights_mask])
    return loss    

def identity_loss(y_true, y_pred):
    return y_pred


def net():
.....
....
conv10 = Conv2D(1, 1, activation = 'sigmoid')(conv9)
w_loss = Lambda(weighted_binary_loss, output_shape=input_size, name='loss_output')([conv10, inputs, weights])
model = Model(inputs = inputs, outputs = w_loss)
model.compile(optimizer = Adam(lr = 1e-5), loss = identity_loss, metrics = ['accuracy'])

that I call in my main function

...
model_checkpoint = ModelCheckpoint('temp_model.hdf5', monitor='loss',verbose=1, save_best_only=True)
model.fit_generator(imgs,steps_per_epoch=20,epochs=1,callbacks=[model_checkpoint])

When I remove the Lambda layer, the error disappears

...
conv10 = Conv2D(1, 1, activation = 'sigmoid')(conv9)
model = Model(inputs = inputs, outputs = conv10)
model.compile(optimizer = Adam(lr = 1e-5), loss = 'binary_crossentropy', metrics = ['accuracy'])

I'm using Keras==2.2.4, tensorflow-gpu==2.0.0b1

Luke83

1 Answer


It appears that you are computing the loss inside a layer of the model. Implementing the loss as a Lambda layer is not good practice: the Lambda ends up embedding raw TensorFlow ops (the clip_by_value/Minimum op in your traceback appears to come from binary_crossentropy), which Keras cannot serialize to JSON when saving the model. Instead, compute the weighted loss in a custom loss function; a loss function is only passed to compile() and is not part of the saved model config.

So your code can be rewritten as follows:

from keras import backend as K

def weighted_binary_loss(y_true, y_pred):
    weights = [0.5, 0.6]  # define your class weights here
    # per-pixel binary cross-entropy, kept unreduced so it can be weighted element-wise
    loss = K.binary_crossentropy(y_true, y_pred)
    weights_mask = y_true*weights[0] + (1.-y_true)*weights[1]
    return loss * weights_mask

conv10 = Conv2D(1, 1, activation = 'sigmoid')(conv9)
model = Model(inputs = inputs, outputs = conv10)
model.compile(optimizer = Adam(lr = 1e-5), loss = weighted_binary_loss, metrics = ['accuracy'])
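
With the loss defined as a plain function, your original checkpoint callback should work unchanged, since model.save() no longer has to serialize the Lambda layer. A minimal sketch, reusing your temp_model.hdf5 path and imgs generator; note that load_model needs the custom loss registered via custom_objects:

model_checkpoint = ModelCheckpoint('temp_model.hdf5', monitor='loss', verbose=1, save_best_only=True)
model.fit_generator(imgs, steps_per_epoch=20, epochs=1, callbacks=[model_checkpoint])

# when loading the checkpoint later, register the custom loss function
from keras.models import load_model
model = load_model('temp_model.hdf5',
                   custom_objects={'weighted_binary_loss': weighted_binary_loss})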

If the weights need to be dynamic and passed to the loss function as a separate parameter, you can follow this question; a sketch of that pattern is shown below.
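
For example, one common pattern (a sketch, using a hypothetical make_weighted_binary_loss helper, with K imported as above) is to wrap the loss in a closure so the weights are supplied at compile time:

def make_weighted_binary_loss(w_pos, w_neg):
    # returns a loss function with the two class weights baked in via the closure
    def loss(y_true, y_pred):
        pixel_loss = K.binary_crossentropy(y_true, y_pred)
        weights_mask = y_true * w_pos + (1. - y_true) * w_neg
        return pixel_loss * weights_mask
    return loss

model.compile(optimizer = Adam(lr = 1e-5),
              loss = make_weighted_binary_loss(0.5, 0.6),
              metrics = ['accuracy'])

A loss built this way also has to be re-registered through custom_objects when you later load the saved model.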

Prasad