For a regression neural network in Keras, I want to compute the Mahalanobis distance, but it looks like there is no built-in loss for it. So I plan to write a custom loss function, and I have a few questions.
Q1. Is there a way to print a custom loss function's output, not via `verbose=1` or similar, but with a simple direct call so I can check the calculation? If so, what type of values should I pass for `y_true` and `y_pred`: NumPy arrays?
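To make Q1 concrete, here is a minimal sketch of the kind of direct call I mean, assuming TF 2.x with eager execution; `my_loss` is just a stand-in (plain MSE) for the eventual Mahalanobis loss:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

# Stand-in custom loss (plain MSE); the real one would be Mahalanobis.
def my_loss(y_true, y_pred):
    return K.mean(K.square(y_pred - y_true), axis=-1)

# Call the loss directly with tensors built from NumPy arrays,
# then inspect the per-sample values with .numpy().
y_true = np.array([[1.0, 2.0], [3.0, 4.0]])
y_pred = np.array([[1.5, 2.0], [2.0, 4.0]])

result = my_loss(tf.constant(y_true), tf.constant(y_pred))
print(result.numpy())  # [0.125 0.5]
```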
Q2. If Mahalanobis works, I also want the network to output the Cholesky decomposition of the covariance. But then the network's output will have 6 more values than the label. Is there a clever way to avoid the error that the network output and the label must have the same dimension?
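For context on Q2, here is a sketch of one workaround I am considering: pad `y_true` with dummy zero columns so the shapes match, and split `y_pred` inside the loss. `N = 3` and the MSE body are placeholders; the real loss would use the 6 Cholesky-related values:

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

N = 3  # label dimension (placeholder); the network outputs N + 6 values

def padded_loss(y_true, y_pred):
    # y_pred: N predicted means followed by 6 extra values
    # (e.g. lower-triangular Cholesky factors).
    mu = y_pred[:, :N]
    extra = y_pred[:, N:]      # the 6 Cholesky-related values (unused here)
    target = y_true[:, :N]     # ignore the zero padding on the label side
    # Placeholder computation: squared error on the mean part only.
    return K.mean(K.square(target - mu), axis=-1)

# Pad the labels with 6 zero columns before calling model.fit(x, y_padded).
y = np.random.rand(4, N)
y_padded = np.concatenate([y, np.zeros((4, 6))], axis=1)
```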
Q3. In the custom loss function, does tensor indexing work like NumPy? For instance, if I want a partial sum of `y_pred`, can I do `K.sum(y_pred[3:5, :])`? Actually, if Q1 can be answered, I can try these myself.
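For concreteness on Q3, a sketch of the slicing I mean, assuming TF 2.x eager execution (TensorFlow's basic slicing does follow NumPy semantics, though boolean and fancy indexing need `tf.boolean_mask`/`tf.gather` instead):

```python
import numpy as np
import tensorflow as tf
from tensorflow.keras import backend as K

# A toy (4, 3) prediction tensor: rows are samples in the batch.
y_pred = tf.constant(np.arange(12.0).reshape(4, 3))

# NumPy-style basic slicing: rows 3:5 of a 4-row tensor is just row 3,
# i.e. [9, 10, 11], so the partial sum is 30.
partial = K.sum(y_pred[3:5, :])
print(partial.numpy())  # 30.0
```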