
I'm working on a binary classification problem and I want to implement a custom loss function, but I'm not familiar with writing symbolic functions.

I know that the loss function should take y_true and y_pred, and what I eventually want to calculate is the following, in pseudocode:

    winning_trades = 0;
    losing_trades = 0;
    for (int i = 0; i < len(y_pred); i++)
    {
      if ((y_pred[i] == 1) && (y_true[i] == 1))
      {
        winning_trades++;   // predicted 1 and it was correct
      }
      else if ((y_pred[i] == 1) && (y_true[i] == 0))
      {
        losing_trades++;    // predicted 1 but it was wrong
      }
    }
    loss = losing_trades / (winning_trades + losing_trades);

`loss` shall be the return value of the loss function. How do I implement the equivalent of this algorithm as a custom Keras loss, like the ones in "losses.py"?
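In plain NumPy terms, this is the share of positive predictions that were wrong, i.e. the false discovery rate, or 1 - precision on the positive class. A sketch of the same computation (`losing_trade_ratio` is just a name I'm using here):

    import numpy as np

    def losing_trade_ratio(y_true, y_pred):
        # y_true, y_pred: 1-D arrays of hard 0/1 labels
        winning = np.sum((y_pred == 1) & (y_true == 1))  # true positives
        losing = np.sum((y_pred == 1) & (y_true == 0))   # false positives
        # Assumes at least one positive prediction, otherwise this is 0/0
        return losing / (winning + losing)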

  • You don't. Your loss function has no meaningful gradient, so your network won't train. – Kit. Dec 15 '18 at 19:25
  • You are a little confused; what you actually want is a custom *metric*, not a custom loss; for the distinction between these two, see [Loss & accuracy - Are these reasonable learning curves?](https://stackoverflow.com/questions/48775305/what-function-defines-accuracy-in-keras-when-the-loss-is-mean-squared-error-mse/48788577#48788577) and [What function defines accuracy in Keras when the loss is mean squared error (MSE)?](https://stackoverflow.com/questions/48775305/what-function-defines-accuracy-in-keras-when-the-loss-is-mean-squared-error-mse/48788577#48788577) – desertnaut Dec 15 '18 at 20:43
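Following that comment, here is a minimal sketch of the quantity above written as a custom Keras *metric* rather than a loss, assuming the keras.backend API; the function name and the implicit 0.5 decision threshold are my own choices:

    import keras.backend as K

    def losing_trade_ratio(y_true, y_pred):
        # Round predicted probabilities to hard 0/1 decisions
        y_pred_bin = K.round(K.clip(y_pred, 0, 1))
        winning = K.sum(y_pred_bin * y_true)        # true positives
        losing = K.sum(y_pred_bin * (1 - y_true))   # false positives
        # K.epsilon() guards against division by zero
        return losing / (winning + losing + K.epsilon())

    # Usage: model.compile(optimizer='adam', loss='binary_crossentropy',
    #                      metrics=[losing_trade_ratio])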

0 Answers