
I am new to Torch and want to create a custom loss function that is a modification of ClassNLLCriterion. Concretely, the ClassNLLCriterion loss is:

loss(x, class) = -x[class]

I want to modify this to be:

loss(x, class) = -x[class] + F(x)

where F(x) is a function that looks x up in a table (as a key) and outputs its value.

My question is, what's the correct way of implementing this custom criterion? The updateOutput() function seems straightforward, but how do I implement the updateGradInput() function?
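As a starting point, a custom criterion in Torch typically subclasses `nn.Criterion` and implements both methods. Here is a minimal sketch, assuming `lookupF` is a user-supplied function implementing F(x) (the name and the wrapping of `nn.ClassNLLCriterion` are my own, not from the question); the open issue is what `updateGradInput` should add for the F(x) term:

```lua
require 'nn'

-- Hypothetical sketch: wrap ClassNLLCriterion and add a table-lookup
-- term F(x). 'lookupF' is an assumed user-supplied function that maps
-- an input tensor to a number.
local MyCriterion, parent = torch.class('nn.MyCriterion', 'nn.Criterion')

function MyCriterion:__init(lookupF)
   parent.__init(self)
   self.nll = nn.ClassNLLCriterion()
   self.lookupF = lookupF
end

function MyCriterion:updateOutput(input, target)
   -- loss(x, class) = -x[class] + F(x)
   self.output = self.nll:updateOutput(input, target) + self.lookupF(input)
   return self.output
end

function MyCriterion:updateGradInput(input, target)
   -- updateGradInput must return dloss/dinput. The -x[class] part is
   -- handled by ClassNLLCriterion; the F(x) term would need its own
   -- gradient dF/dx added here -- which is exactly the difficulty
   -- discussed below.
   self.gradInput = self.nll:updateGradInput(input, target)
   return self.gradInput
end
```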

braindead
    what kind of table is `F`? It does not seem to be differentiable... – fonfonx May 18 '17 at 15:36
  • @fonfonx You are right, `F` is not differentiable. What's the best strategy in this case? Does it make sense to treat `F(x)` as a constant? – braindead May 18 '17 at 15:58
  • I don't know what you want to do with this `F`, or how much its values vary. I guess you could try to treat `F` as a constant, but then you somewhat lose the utility of this `F` function, I think. Maybe you could try to find a differentiable function approximating `F`... – fonfonx May 18 '17 at 18:25

1 Answer


If F(x) is not differentiable with respect to its input x (and hence, via the chain rule, with respect to the network parameters), then you can't use it within the loss function. Differentiability is a necessary condition for gradient descent during backpropagation. See Non-smooth and non-differentiable customized loss function tensorflow.
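If you nevertheless follow the suggestion from the comments and treat F(x) as a constant with respect to x, its gradient contribution is zero, so the backward pass reduces to that of plain ClassNLLCriterion. A minimal sketch, assuming the same hypothetical wrapper criterion with an inner `self.nll = nn.ClassNLLCriterion()` field (names are illustrative, not from the question):

```lua
function MyCriterion:updateGradInput(input, target)
   -- Treating F(x) as a constant w.r.t. x: dF/dx is taken to be zero,
   -- so the gradient is identical to plain ClassNLLCriterion's.
   -- Note that F(x) then only shifts the reported loss value and has
   -- no effect on training, as the comments above point out.
   self.gradInput = self.nll:updateGradInput(input, target)
   return self.gradInput
end
```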
