
I am writing just a simple loss function in which I have to convert the tensor to a NumPy array (it's essential). I am just trying to print the value of the tensor, but I get this instead of the numbers:

Tensor("loss/activation_4_loss/Print:0", shape=(?, 224, 224, 2), dtype=float32)

from keras import backend as K

def Lc(y_true, y_pred):
    x = K.print_tensor(y_pred)
    print(x)  # prints the symbolic tensor object, not its values
    return K.mean(y_pred)

Kindly tell me how I can get the values (numerics) from the tensor. I also tried "eval", but it threw a big fat error about there being no session and the tensor being a placeholder, etc. The whole program executes fine; only the "print_tensor" line is causing a problem.

  • The `K.print_tensor()` seems to return a Print Tensor object. Simply follow the basic steps : `print( tf.Session().run(x) )` – Shubham Panchal Mar 06 '19 at 12:25
  • Also, if necessary, you can enable the eager mode in TensorFlow so that the values of Tensors are printed. – Shubham Panchal Mar 06 '19 at 12:27
  • Your premise is wrong, no loss function in Keras will work if you use numpy as part of it, because there is no way to propagate gradients through numpy code. – Dr. Snoopy Mar 06 '19 at 12:34
  • @ShubhamPanchal I am getting a "you must feed a value" error when doing `tf.Session().run(x)` in the Keras custom loss. – Asim Mar 06 '19 at 12:47
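As a side note on the comment about NumPy breaking gradient propagation, here is a minimal sketch (assuming TensorFlow 2.x, where eager execution is on by default) showing why converting a tensor to a NumPy array inside a loss cannot work: the round-trip through NumPy detaches the value from the graph, so no gradient flows back.

```python
import tensorflow as tf

x = tf.Variable(2.0)
with tf.GradientTape(persistent=True) as tape:
    good = x * x                      # stays in the graph
    bad = tf.constant(x.numpy() ** 2) # round-trips through NumPy: detached

print(tape.gradient(good, x))  # a tensor holding 4.0
print(tape.gradient(bad, x))   # None -- no gradient path exists
```

Anything computed via `.numpy()` is just a plain constant as far as autodiff is concerned, which is why a loss built that way cannot be trained.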

1 Answer


The print statement is redundant. print_tensor will already print the values.

From the documentation of print_tensor:

"Note that print_tensor returns a new tensor identical to x which should be used in the following code. Otherwise the print operation is not taken into account during evaluation."

In the code above, the result of `K.print_tensor` was assigned to `x`, but `x` was never used afterwards, so the print operation was never evaluated.

Use the version below.

def Lc(y_true, y_pred):
    y_pred=K.print_tensor(y_pred)
    return K.mean(y_pred)

def cat_loss(y_true, y_pred):
    y_pred = K.print_tensor(y_pred)
    return K.categorical_crossentropy(y_true, y_pred)

After I put this cat_loss function in my training loop, I can see the output like this:

[[0.000191014129 0.230871275 0.43813318]...]

190/255 [=====================>........] - ETA: 0s - loss: 0.3442 - acc: 0.9015

[[3.16367514e-05 1.70419597e-07 0.000147014405]...]
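To see the pattern outside a training loop, here is a minimal standalone sketch (assuming TensorFlow 2.x with eager execution, and made-up example tensors) demonstrating that the reassigned, printing tensor is the one fed onward:

```python
import tensorflow as tf
from tensorflow.keras import backend as K

def cat_loss(y_true, y_pred):
    # Reassign so the returned (printing) tensor is the one actually used
    y_pred = K.print_tensor(y_pred)
    return K.categorical_crossentropy(y_true, y_pred)

y_true = tf.constant([[0.0, 1.0, 0.0]])
y_pred = tf.constant([[0.1, 0.8, 0.1]])
loss = cat_loss(y_true, y_pred)  # print_tensor echoes [[0.1 0.8 0.1]]
print(loss.numpy())              # -log(0.8), about 0.2231
```

In eager mode the values can also be read directly with `.numpy()`; inside a compiled graph, `K.print_tensor` is the reliable way, provided its return value is used.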

Manoj Mohan