I'm working on a binary classification problem and want to implement a custom loss function, but I'm not familiar with creating symbolic functions.
I know the loss function should take y_pred and y_true; what I eventually want to calculate is the following, in pseudocode:
    winning_trades = 0;
    losing_trades = 0;
    for (int i = 0; i < len(y_pred); i++)
    {
        if ((y_pred[i] == 1) && (y_pred[i] == y_true[i]))
        {
            winning_trades++;
        }
        else if ((y_pred[i] == 1) && (y_true[i] == 0))
        {
            losing_trades++;
        }
    }
    loss = losing_trades / (winning_trades + losing_trades);
loss shall be the return value of the loss function. How do I implement the equivalent of this algorithm as a Keras loss, i.e. in the style of the functions in keras "losses.py"?
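To make the intent concrete, here is the same computation written eagerly in plain NumPy (the function name is mine; this is just a restatement of the pseudocode, not the symbolic version I'm asking for):

```python
import numpy as np

def trade_loss_numpy(y_true, y_pred):
    """Eager NumPy version of the pseudocode above (not symbolic)."""
    y_true = np.asarray(y_true)
    y_pred = np.asarray(y_pred)
    # A "trade" is a prediction of 1; it wins if the label is also 1.
    winning_trades = np.sum((y_pred == 1) & (y_true == 1))
    losing_trades = np.sum((y_pred == 1) & (y_true == 0))
    # Fraction of trades that lose (undefined if no trades were taken).
    return losing_trades / (winning_trades + losing_trades)
```

For example, with y_true = [1, 0, 0, 0] and y_pred = [1, 1, 0, 1] there is one winning trade and two losing trades, giving a loss of 2/3.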