I have written a custom training loop using tf.GradientTape(). My data has 2 classes, and they are not balanced: class1 contributes almost 80% of the samples and class2 the remaining 20%. To compensate for this imbalance, I am trying to write a custom loss function that takes the class weights into account when computing the loss, i.e. I want to use class_weights = [0.2, 0.8]. I have not been able to find similar examples.
All the examples I am seeing use the model.fit approach, where it is easy to pass class_weights. I cannot find an example that uses class_weights with a custom training loop built on tf.GradientTape.
I did go through the suggestions of using sample_weight, but my data does not come with per-sample weights, so my preference is to use class weights.
I am using BinaryCrossentropy as the loss function, but I want to scale the loss based on the class_weights. That's where I am stuck: how do I tell BinaryCrossentropy to take the class_weights into account?
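To make the idea concrete, here is a sketch of what I have in mind (not verified, and the model/optimizer names are just placeholders): look up a per-sample weight from the class weights using each label, then pass those weights to BinaryCrossentropy via its sample_weight argument inside the tf.GradientTape step.

```python
import tensorflow as tf

# Assumed setup: binary labels in {0, 1}; weight 0.2 for class1 (the 80%
# majority, label 0) and 0.8 for class2 (the 20% minority, label 1).
class_weights = tf.constant([0.2, 0.8], dtype=tf.float32)
bce = tf.keras.losses.BinaryCrossentropy()

def weighted_loss(y_true, y_pred):
    # Map each integer label to its class weight, then hand the resulting
    # per-sample weights to BinaryCrossentropy via sample_weight.
    idx = tf.cast(tf.reshape(y_true, [-1]), tf.int32)
    weights = tf.gather(class_weights, idx)
    return bce(y_true, y_pred, sample_weight=weights)

# Placeholder model and optimizer, just to show the GradientTape step.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, activation="sigmoid")])
optimizer = tf.keras.optimizers.SGD(learning_rate=0.1)

@tf.function
def train_step(x, y):
    with tf.GradientTape() as tape:
        loss = weighted_loss(y, model(x, training=True))
    grads = tape.gradient(loss, model.trainable_variables)
    optimizer.apply_gradients(zip(grads, model.trainable_variables))
    return loss
```

With the default SUM_OVER_BATCH_SIZE reduction, this should be equivalent to averaging weight_i * per_sample_loss_i over the batch, but I am not sure this is the intended way to do it.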
Is my approach of using a custom loss function correct, or is there a better way to make use of class_weights when training with a custom training loop (i.e. without model.fit)?