
I am training a VGG-16 model for a multi-class classification task with TensorFlow 2.4 and Keras 2.4.0. The y-true labels are one-hot encoded. I train the model with two custom loss functions, one at a time. First, I used a custom Cauchy-Schwarz divergence loss function, shown below:

from math import sqrt
from math import log
from scipy.stats import gaussian_kde
from scipy import special

def cs_divergence(p1, p2):
    """Cauchy-Schwarz divergence between two sets of samples.

    p1 (numpy array): first pdfs
    p2 (numpy array): second pdfs

    Returns:
        float: CS divergence
    """
    # Evaluate kernel density estimates of both inputs on a common grid
    r = range(0, p1.shape[0])
    p1_kernel = gaussian_kde(p1)
    p2_kernel = gaussian_kde(p2)
    p1_computed = p1_kernel(r)
    p2_computed = p2_kernel(r)
    numerator = sum(p1_computed * p2_computed)
    denominator = sqrt(sum(p1_computed ** 2) * sum(p2_computed ** 2))
    return -log(numerator / denominator)

Then, I used a negative log-likelihood custom loss function, shown below:

def nll(y_true, y_pred):
    # Element-wise negative log-likelihood (binary cross-entropy form) via SciPy
    loss = -special.xlogy(y_true, y_pred) - special.xlogy(1 - y_true, 1 - y_pred)
    return loss
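
For reference, both functions evaluate without error on plain NumPy arrays (a standalone check with made-up values, outside of Keras):

import numpy as np

# Made-up probability vectors, only to check the functions in isolation
p1 = np.array([0.1, 0.7, 0.2])
p2 = np.array([0.3, 0.4, 0.3])
print(cs_divergence(p1, p2))   # small non-negative scalar

y_true = np.array([0.0, 1.0, 0.0])
y_pred = np.array([0.2, 0.7, 0.1])
print(nll(y_true, y_pred))     # element-wise cross-entropy terms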

During training, I compiled the model with each of these losses individually, as below:

from tensorflow.keras.optimizers import SGD

sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)
model_vgg16.compile(optimizer=sgd,
                    loss=[cs_divergence],
                    metrics=['accuracy'])

and

sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)
model_vgg16.compile(optimizer=sgd,
                    loss=[nll],
                    metrics=['accuracy'])

I got the following errors when training the model with these loss functions. With cs_divergence, I got:

TypeError: 'NoneType' object cannot be interpreted as an integer

With the nll custom loss, I got:

NotImplementedError: Cannot convert a symbolic Tensor (IteratorGetNext:1) to a numpy array. This error may indicate that you're trying to pass a Tensor to a NumPy call, which is not supported

I downgraded NumPy to 1.19.5 as discussed in "NotImplementedError: Cannot convert a symbolic Tensor (2nd_target:0) to a numpy array", but it didn't help.


1 Answer


Maybe try:

loss=cs_divergence

that is, without the brackets.
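
For illustration, a minimal sketch of the full compile call with the bare function reference, reusing the model and optimizer settings from the question:

from tensorflow.keras.optimizers import SGD

sgd = SGD(lr=0.0001, decay=1e-6, momentum=0.9, nesterov=True)

# Pass the loss callable directly instead of wrapping it in a list
model_vgg16.compile(optimizer=sgd,
                    loss=cs_divergence,
                    metrics=['accuracy'])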