
For a regression neural network in Keras, I would like to use the Mahalanobis distance as the loss, but it looks like there is no built-in implementation yet. So I am trying to write a custom loss function, and I have a few questions.

Q1: Is there a way to print the custom loss function's output? Not via verbose=1 or similar, but with a simple direct call, so I can check the calculation. If so, what type of values should I pass for y_pred and y_true, NumPy arrays?

Q2: If the Mahalanobis distance works, I would also like to output the Cholesky decomposition of the covariance matrix. But then the network's output will have 6 more values than the label. Is there a clever way to avoid the assertion error that the network's output and the label must have the same dimension?

Q3: In the custom loss function, does tensor indexing work like NumPy? For instance, if I want a partial sum of y_pred, can I do K.sum(y_pred[3:5, :])? Actually, if Q1 can be answered, I can try these myself.

user9200689
    _and I hope to ask a few questions._ - please don't. The more focused the question, the better, and in general - single question, single problem. – zero323 Jan 20 '18 at 12:57
  • Your first question is answered [here](https://stackoverflow.com/a/46863768/1531463). – Yu-Yang Jan 20 '18 at 16:30

1 Answer


In your custom loss you should treat y_true and y_pred as tensors (TensorFlow tensors if you are using tf as the backend), not NumPy arrays.
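As a minimal sketch of what such a loss can look like, here is a Mahalanobis-style loss written against the Keras backend. This assumes TensorFlow 2.x with tf.keras (for standalone Keras, `import keras.backend as K` works the same), and the fixed 2x2 inverse covariance matrix is a made-up value purely for illustration; in the question's setting it would come from the data or be learned.

```python
import tensorflow as tf
from tensorflow.keras import backend as K  # standalone Keras: import keras.backend as K

def mahalanobis_like_loss(y_true, y_pred):
    # Hypothetical fixed inverse covariance matrix (illustrative assumption only).
    inv_cov = tf.constant([[2.0, 0.0],
                           [0.0, 0.5]])
    diff = y_true - y_pred              # shape (batch, 2)
    left = K.dot(diff, inv_cov)         # (batch, 2)
    d2 = K.sum(left * diff, axis=-1)    # squared distance per sample, shape (batch,)
    return K.mean(d2)                   # scalar loss over the batch
```

The function is passed to `model.compile(loss=mahalanobis_like_loss, ...)` like any built-in loss.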

If you want to perform custom computation, you have to use the backend:

import keras.backend as K

Here you can use K.sum, K.abs, K.mean, ... (more or less in NumPy style).
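Tensor slicing also behaves like NumPy, so the partial sum from Q3 can be written directly (sketch assuming TensorFlow 2.x eager mode with tf.keras):

```python
import tensorflow as tf
from tensorflow.keras import backend as K

y_pred = K.constant([[1.0], [2.0], [3.0], [4.0], [5.0]])
partial = K.sum(y_pred[3:5, :])      # rows 3 and 4, exactly as in NumPy
total = K.mean(K.abs(y_pred))        # backend ops in NumPy style
```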

When you actually feed your model some data, you will pass it as NumPy arrays.
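That also suggests one way to answer Q1: wrap NumPy arrays in tensors, call the loss directly, and evaluate the result. A sketch assuming TensorFlow 2.x (where K.eval returns a NumPy value); the loss here is a simple mean squared error stand-in:

```python
import numpy as np
from tensorflow.keras import backend as K

def my_loss(y_true, y_pred):
    return K.mean(K.square(y_pred - y_true))

y_true = np.array([[1.0], [2.0]], dtype="float32")
y_pred = np.array([[1.5], [2.5]], dtype="float32")

# Convert the NumPy arrays to tensors, call the loss, and pull the value back out.
value = K.eval(my_loss(K.constant(y_true), K.constant(y_pred)))
print(value)
```

This lets you check the calculation offline before ever compiling a model with the loss.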

rickyalbert