I want to create a custom objective function for training a Keras deep net. I'm researching classification of imbalanced data, and I use the F1 score a lot in scikit-learn. I therefore had the idea of inverting the F1 metric (1 - F1 score) to use it as a loss function/objective for Keras to minimise while training:
from sklearn.metrics import f1_score

def F1Loss(y_true, y_pred):
    return 1. - f1_score(y_true, y_pred)
However, this f1_score method from scikit-learn requires numpy arrays or lists to calculate the F1 score. I found that tensors need to be evaluated to their numpy array counterparts using .eval(), which requires a TensorFlow session to perform this task.
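To illustrate what I mean, evaluating a standalone tensor with an explicit session looks roughly like this (TF 1.x-style API, reachable via tf.compat.v1 in TF 2.x; this is my understanding from the docs, not something I have made work inside a Keras loss):

```python
import tensorflow as tf

# TF 2.x exposes the 1.x session API under tf.compat.v1;
# eager execution must be disabled for .eval() to need a session at all.
tf.compat.v1.disable_eager_execution()

t = tf.constant([1.0, 2.0, 3.0])
with tf.compat.v1.Session() as sess:
    arr = t.eval(session=sess)  # plain numpy array

print(arr)
```

This works for a constant I create myself, but inside a loss function I don't have a session of my own to pass in.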
I do not know which session object Keras uses. I tried the code below, assuming the Keras backend defines its own session object somewhere, but this did not work either.
from keras import backend as K
K.eval(y_true)
Admittedly, this was a shot in the dark, since I don't really understand the deeper workings of Keras or TensorFlow at the moment.
My question is: how do I evaluate the y_true and y_pred tensors to their numpy array counterparts?