
I would like to implement a custom Keras loss function for the focal loss explained in this paper; however, I need access to the weights of the trainable layers. Is there any way to read a layer's weights inside the loss function during training?

import numpy as np
import tensorflow as tf

def focal_coef(model, y_true, y_pred, content, label_remap, gamma_focal=2, w_d=1e-4):
    # Median-frequency balancing: weight each class by median_freq / freq(c)
    e = 0.001
    w = np.empty(len(content))
    f = np.empty(len(content))
    for key in content:
        f_c = content[key]
        f[label_remap[key]] = f_c
        w[label_remap[key]] = 1 / (f_c + e)
    median_freq = np.median(f)
    print("\nFrequencies of classes:\n", f)
    print("\nMedian freq:\n", median_freq)
    print("\nWeights for loss function (1/freq(c)):\n", w)
    w = median_freq * w
    print("\nWeights for loss function (median freq/freq(c)):\n", w)
    # Zero-weight classes are masked out of the loss entirely
    w_mask = w.astype(bool).astype(np.float32)
    print("w_mask", w_mask)
    softmax = tf.nn.softmax(y_pred)
    softmax_mat = tf.reshape(softmax, (-1, len(content)))
    zerohot_softmax_mat = 1 - softmax_mat
    # make the labels one-hot for the cross-entropy
    print("datatype", y_true.dtype)
    onehot_mat = tf.reshape(tf.one_hot(tf.cast(y_true, tf.int32), len(content)),
                            (-1, len(content)))
    # make the zero-hot to punish the false negatives, but ignore the
    # zero-weight classes
    masked_sum = tf.reduce_sum(onehot_mat * w_mask, axis=1)
    zeros = onehot_mat * 0.0
    zerohot_mat = tf.where(tf.less(masked_sum, 1e-5),
                           x=zeros,
                           y=1 - onehot_mat)
    # focal loss: down-weight easy examples with the (1 - p)^gamma factor
    loss_epsilon = 1e-10
    gamma_tf = tf.constant(gamma_focal, dtype=tf.float32)  # scalar, broadcasts
    focal_softmax = tf.pow(1 - softmax_mat, gamma_tf) * \
                    tf.log(softmax_mat + loss_epsilon)
    zerohot_focal_softmax = tf.pow(1 - zerohot_softmax_mat, gamma_tf) * \
                            tf.log(zerohot_softmax_mat + loss_epsilon)

    # calculate the weighted cross-entropy
    cross_entropy = -tf.reduce_sum((focal_softmax * onehot_mat +
                                    zerohot_focal_softmax * zerohot_mat) * w,
                                   axis=1)
    loss = tf.reduce_mean(cross_entropy, name='xentropy_mean')
    # L2 weight decay over the trainable weight tensors (not the layer objects)
    for layer in model.layers:
        if layer.trainable:
            for weight in layer.trainable_weights:
                loss += w_d * (tf.reduce_sum(tf.square(weight)) / 2)
    return loss

def focal_loss(model, content, label_remap, gamma_focal=2, w_d=1e-4):
    def focal(y_true, y_pred):
        # focal_coef is already positive (the minus sign is applied when the
        # log terms are summed), so it should not be negated again here
        return focal_coef(model, y_true, y_pred, content, label_remap,
                          gamma_focal, w_d)

    return focal
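As an aside, the median-frequency weighting at the top of focal_coef can be sanity-checked in isolation with plain NumPy; the content and label_remap dicts below are made-up examples, not from the original code:

```python
import numpy as np

# Hypothetical class frequencies and label remapping, just to exercise the logic
content = {"road": 0.6, "car": 0.3, "person": 0.1}
label_remap = {"road": 0, "car": 1, "person": 2}

e = 0.001
f = np.empty(len(content))
w = np.empty(len(content))
for key in content:
    f[label_remap[key]] = content[key]
    w[label_remap[key]] = 1 / (content[key] + e)

median_freq = np.median(f)
w = median_freq * w  # median_freq / freq(c): rare classes get larger weights

print(w)  # the rarest class ("person") receives the largest weight
```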
    Please limit the scope of your question and provide a code sample/first attempt. – Mateen Ulhaq Oct 21 '18 at 23:01
  • Why do you need to get the weights of all layers? This isn't needed for the focal loss. – Dr. Snoopy Oct 22 '18 at 20:48
  • I am implementing a custom focal loss function. Also, I have another issue where y_true should have a different shape from y_pred. y_true.shape=(n,h,w) y_pred.shape=(n,c,h,w) – Zaher88abd Oct 24 '18 at 13:58

1 Answer


Your custom loss function must take exactly two arguments, y_true and y_pred. If you need more parameters, capture them in a closure (or call a helper function) from inside the loss. Since you want the weights of the layers, your custom loss function must have access to your model:

model = ...

def custom_loss(y_true, y_pred):
    w1 = model.layers[0].get_weights()[0] # weights of the first layer
    b1 = model.layers[0].get_weights()[1] # bias of the first layer
    w2 = model.layers[1].get_weights()[0] # weights of the second layer
    .
    .
    .
    loss = ...
    return loss
model.compile(loss=custom_loss, optimizer = ...)
model.fit(...)

If you want all of the weights you can do model.get_weights(). Note that get_weights() returns plain NumPy snapshots of the values at the moment it is called; if you need values that track training inside a symbolic loss, reference the weight tensors themselves (e.g. layer.trainable_weights).
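The closure pattern itself is plain Python and easy to verify without Keras. This sketch uses a stand-in mean-squared-error (a hypothetical example, not a Keras API) to show how extra parameters get captured while the inner function keeps the (y_true, y_pred) signature Keras expects:

```python
# A stand-in for a Keras loss factory: the extra "weight" parameter is
# captured by the inner function, which still takes only (y_true, y_pred).
def weighted_mse_loss(weight):
    def loss(y_true, y_pred):
        return weight * sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)
    return loss

custom_loss = weighted_mse_loss(weight=2.0)
print(custom_loss([1.0, 2.0], [1.0, 3.0]))  # 2.0 * ((0 + 1) / 2) = 1.0
```

You would then pass the returned function to compile, e.g. model.compile(loss=custom_loss, ...), exactly as with a built-in loss.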

  • Thanks @MeteHanKahraman for the answer. I actually based my custom function on this answer [https://stackoverflow.com/questions/45961428/make-a-custom-loss-function-in-keras], and I call it like this: ```model_loss = focal_loss(model=model, content=traning_content_perc, label_remap=label_remap) model.compile(loss=model_loss, optimizer=adam, metrics=['acc'])``` That works for me, but I still get some exceptions. Please let me know if I am doing something wrong. – Zaher88abd Oct 22 '18 at 18:43