
I want to define my own loss function in Keras that multiplies the bce_loss by a variable W. W should have the same shape as the tensor bce_loss.

If I print the tensor bce_loss, it looks like this:

Tensor("loss_8/activation_14_loss/logistic_loss:0", shape=(?, 3), dtype=float32)

I don't know how to get the shape of bce_loss and make the variable W have that same shape.

My code:

def myLoss(y_true, y_pred):
    bce_loss = K.binary_crossentropy(y_true, y_pred)
    # I want a variable W with the same shape as bce_loss,
    # initialized from a normal distribution.
    # This fails: bce_loss has no .size(), and its batch dimension
    # is unknown until runtime.
    val = np.random.normal(0, 0.05, size=bce_loss.size())
    W = K.variable(val)
    return K.mean(W * bce_loss, axis=-1)
rosefun

1 Answer


You can define your loss function like this:

from keras import backend as K
import numpy as np

def myLoss(y_true, y_pred):
    bce_loss = K.binary_crossentropy(y_true, y_pred)
    # K.shape returns the dynamic (runtime) shape as a tensor,
    # so the unknown batch dimension is not a problem.
    w = K.random_normal(K.shape(bce_loss), mean=0.0, stddev=0.05)
    return K.mean(w * bce_loss, axis=-1)

# Quick check with placeholders:
y_t = K.placeholder((1, 2))
y_p = K.placeholder((1, 2))
loss = myLoss(y_t, y_p)
print(K.get_session().run(loss, {y_t: np.array([[1, 1]]), y_p: np.array([[0.5, 0.2]])}))
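
If you then want to train with this loss, a minimal sketch could look like the following (the model architecture, input size, and optimizer are hypothetical, just to show how the custom loss is passed to compile):

from keras.models import Sequential
from keras.layers import Dense

# Hypothetical model with 3 sigmoid outputs, matching the (?, 3) loss shape above.
model = Sequential([Dense(3, activation='sigmoid', input_shape=(10,))])
model.compile(optimizer='adam', loss=myLoss)  # custom loss passed by reference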
Manoj Mohan
  • How do I solve the error I get when using `bce_loss.shape`: `ValueError: Cannot convert a partially known TensorShape to a Tensor: (?, 3)`? – rosefun Apr 29 '19 at 12:01
  • Sorry, K.shape should be used to get the shape. I updated the answer. – Manoj Mohan Apr 29 '19 at 13:05
  • Static vs dynamic shape: https://stackoverflow.com/questions/37096225/how-to-understand-static-shape-and-dynamic-shape-in-tensorflow – Manoj Mohan Apr 29 '19 at 13:09
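
To make the static vs. dynamic shape distinction from the last comment concrete, here is a small sketch (assuming the same TF1-style Keras backend as in the answer):

import numpy as np
from keras import backend as K

x = K.placeholder((None, 3))
print(x.shape)     # static shape: (?, 3), batch dimension unknown at graph-build time
print(K.shape(x))  # dynamic shape: a tensor evaluated at runtime
print(K.get_session().run(K.shape(x), {x: np.zeros((4, 3))}))  # -> [4 3]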