
(I am sorry if my English is not good)

I know how to create my own loss function in PyTorch when the function only requires the DNN output vector (predicted) and the corresponding ground-truth vector.

I want to use additional variables to calculate the loss.

I generate my training and test data as follows:

DNN input:

  1. Data_A -> processing 1 -> Data_X

DNN output:

  1. Data_A -> processing 1 -> Data_X
  2. Data_B -> processing 1 -> Data_P
  3. Data_X , Data_P -> processing 2 -> Data_Y

I then split Data_X and Data_Y into training and test data:

  x_train, x_test, y_train, y_test = train_test_split(Data_X, Data_Y, test_size=0.2, random_state=0)

I want to use Data_A, Data_B, Data_Y (predicted), and Data_Y (ground truth) to calculate the loss. I have seen many examples of custom loss functions that only use Data_Y (predicted) and Data_Y (ground truth), and I have used such a loss function before. However, I don't know what to do when I also want to use additional variables. Is there a good way? Thank you for your help!
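One practical detail worth noting: `train_test_split` accepts any number of index-aligned arrays, so `Data_A` and `Data_B` can be split together with `Data_X` and `Data_Y` and stay aligned with them. A minimal sketch with made-up stand-in data (the array contents here are purely illustrative):

```python
# Sketch: split the extra variables together with X and Y so that every
# array stays aligned by index and Data_A / Data_B can be looked up later.
import numpy as np
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
Data_A = rng.normal(size=(100, 4))        # stand-ins for the real data
Data_B = rng.normal(size=(100, 4)) + 1.0
Data_X = np.abs(Data_A)                   # "processing 1"
Data_Y = Data_X ** 2 / np.abs(Data_B) ** 2  # "processing 2"

# Passing all four arrays keeps corresponding rows together in each split.
(x_train, x_test, y_train, y_test,
 a_train, a_test, b_train, b_test) = train_test_split(
    Data_X, Data_Y, Data_A, Data_B, test_size=0.2, random_state=0)
```

Because the split is applied to all arrays at once, `a_train[i]` is always the `Data_A` row that produced `x_train[i]`.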

hac81acnh
    Your naming is a bit confusing and your question seems very broad: you have two models and use outputs from the first one to feed the second one. You want to compute a loss (what loss?) based on the inputs from `Data_A` and `Data_B`, the output of the model, and the ground truth. You can implement such a function, yes. However, without further details it's hard to help. – Ivan Jan 31 '21 at 14:21
  • Thank you for your comment! I see, I will give further details. `Data_A` and `Data_B` are complex vectors. `Processing 1` is `torch.abs()`, so `Data_X = torch.abs(Data_A)` and `Data_P = torch.abs(Data_B)`. `Processing 2` consists of `torch.pow()` and `torch.div()`; in detail, `Data_Y = torch.div(torch.pow(torch.abs(Data_A), 2), torch.pow(torch.abs(Data_B), 2))`. The DNN input is `Data_X`. The DNN output is `Data_Y`. I want to calculate the loss as `torch.mean(torch.pow(Data_Y * Data_A - Data_B, 2))`. – hac81acnh Feb 01 '21 at 09:26
  • As long as you use PyTorch operators you should be fine; see Shai's answer below on how to create a custom loss function. – Ivan Feb 01 '21 at 09:30
  • OK, I will try to create a custom function. Thank you for your help!! – hac81acnh Feb 01 '21 at 15:15
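The processing steps described in the comments above can be sketched end to end with small made-up complex tensors (the values are purely illustrative). Note that the loss formula from the comments would be complex-valued as written; taking the magnitude before squaring, as below, yields a real-valued loss:

```python
# Sketch of "processing 1" and "processing 2" from the comments,
# using tiny made-up complex tensors for Data_A and Data_B.
import torch

Data_A = torch.tensor([1 + 2j, 3 - 1j])
Data_B = torch.tensor([2 + 0j, 1 + 1j])

Data_X = torch.abs(Data_A)                    # processing 1
Data_P = torch.abs(Data_B)                    # processing 1
Data_Y = torch.div(torch.pow(Data_X, 2),      # processing 2
                   torch.pow(Data_P, 2))

# Loss from the comments; torch.abs() is added so the result is real-valued.
# Every operation here is a PyTorch operator, so gradients can flow through.
loss = torch.mean(torch.pow(torch.abs(Data_Y * Data_A - Data_B), 2))
```

Since `Data_X`, `Data_P`, and `Data_Y` are all built from differentiable PyTorch operators, this expression can be used directly as a training loss.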

1 Answer


There are no restrictions on the structure of your loss function (as long as the gradients make sense).
For instance, you can have:

import torch.nn as nn

class MyLossLayer(nn.Module):
  def __init__(self):
    super(MyLossLayer, self).__init__()

  def forward(self, pred_a, pred_b, gt_target):
    # I'm just guessing here - do whatever you want as long as you do not screw up the gradients.
    loss = pred_a * (pred_b - gt_target)
    return loss.mean()
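For completeness, a minimal usage sketch (the tensor values below are made up): the module is called like any other `nn.Module`, passing however many extra tensors the loss needs, and `backward()` works as usual:

```python
import torch
import torch.nn as nn

class MyLossLayer(nn.Module):
    def forward(self, pred_a, pred_b, gt_target):
        # Any differentiable combination of the inputs is allowed.
        loss = pred_a * (pred_b - gt_target)
        return loss.mean()

criterion = MyLossLayer()
pred_a = torch.ones(3, requires_grad=True)   # e.g. the network output
pred_b = torch.tensor([1.0, 2.0, 3.0])       # an extra variable
gt_target = torch.zeros(3)                   # the ground truth

loss = criterion(pred_a, pred_b, gt_target)
loss.backward()                              # gradients flow into pred_a
```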
Shai
  • If I try to do this in your code, `pred_a` is the DNN output while `gt_target` is the DNN ground-truth output, `pred_b1` is an additional variable equal to `Data_A`, and `pred_b2` is another additional variable equal to `Data_B`. Then `loss = torch.mean(torch.pow(pred_a * pred_b1 - pred_b2, 2))`. – hac81acnh Feb 01 '21 at 14:51
  • And in this case, I have a question about the additional variables. I don't have to worry about `pred_a` or `gt_target`; they are given to the loss function as usual. However, the additional variables have to be fed to the loss function by hand, and during training the loss is computed batch by batch. How do I know which data (indices) were chosen for the current batch? Do you have any ideas? Thank you very much. – hac81acnh Feb 01 '21 at 14:58
  • I used your code and it works!! Thank you very much. – hac81acnh Feb 03 '21 at 04:34
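Regarding the batch-index question in the comments: one common approach is to put the additional variables into the same `Dataset`, so the `DataLoader` delivers index-aligned slices of everything and no manual bookkeeping is needed. A sketch with made-up tensors standing in for `x_train`, `y_train`, `Data_A`, and `Data_B`:

```python
import torch
from torch.utils.data import TensorDataset, DataLoader

# Made-up aligned tensors standing in for x_train, y_train, Data_A, Data_B.
x = torch.arange(8.0).reshape(8, 1)
y = x * 2
a = x + 100
b = x + 200

# Batching all four together guarantees matching indices in every batch,
# even with shuffle=True.
loader = DataLoader(TensorDataset(x, y, a, b), batch_size=4, shuffle=True)

for xb, yb, ab, bb in loader:
    # xb, yb, ab, bb are index-aligned, so the loss can use ab and bb directly,
    # e.g. loss = criterion(model(xb), ab, bb).
    assert torch.equal(ab, xb + 100) and torch.equal(bb, xb + 200)
```

This sidesteps the "which indices are in this batch" problem entirely: the extra tensors travel through the `DataLoader` alongside the inputs and targets.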