
So these are my loss values, printed every 75 epochs:

Epoch: 75, loss: 47382825795584.000000
Epoch: 150, loss: 47382825795584.000000
Epoch: 225, loss: 47382825795584.000000
Epoch: 300, loss: 47382825795584.000000
Epoch: 375, loss: 47382825795584.000000
Epoch: 450, loss: 47382825795584.000000
Epoch: 525, loss: 47382825795584.000000
Epoch: 600, loss: 47382825795584.000000
Epoch: 675, loss: 47382825795584.000000
Epoch: 750, loss: 47382825795584.000000

And these are the values from predictions and targets, respectively:

Predictions: tensor([[ 8109436.0000,  7734814.0000,  8737677.0000, 11230861.0000,
          3795826.7500,  3125072.7500,  1699706.1250,  5337285.0000,
          3474238.5000]], grad_fn=<TBackward>)
----------------------------------------
Targets: tensor([[ 8111607.,  7580798.,  8749436., 11183578.,  3822811.,  3148031.,
          2343278.,  5360924.,  3536146.]])

And this is the accuracy of the first and second elements of the predictions against the corresponding elements of the targets:

print(8109436.0000 / 8111607 * 100)  # First element
Output: 99.9732358828528

print(7734814.0000 / 7580798 * 100)  # Second element
Output: 102.03165946381898

So I'm really not sure what is going on. Even though I have a large loss, there is 99% accuracy for the first element and roughly 98% accuracy for the second? I'm not the best at math, so I'm not sure about the last percentage (it comes out above 100%).

Could someone explain if the loss reflects the accuracy?

Ry-
YJH16120
  • what is the specific loss function used? – Gilad Green Aug 24 '20 at 10:50
  • Your real issue here is not the "big" value of the loss, but the fact that your model does not seem to learn (the loss is constant). Difficult to say more without further details. The answers in [Loss & accuracy - Are these reasonable learning curves?](https://stackoverflow.com/questions/47817424/loss-accuracy-are-these-reasonable-learning-curves) and [How does Keras calculate the accuracy?](https://stackoverflow.com/questions/47508874/how-does-keras-calculate-the-accuracy) (disclaimer: mine) may be helpful to get the general picture. – desertnaut Aug 24 '20 at 11:57

2 Answers


Loss is only meaningful relatively (i.e., for comparison). Multiply your loss function by 10 and the loss on the same model is 10 times bigger, even though nothing about the model has changed. The absolute value on its own doesn't tell you anything.

But using the same loss function, if model_1 gives a loss 10x smaller than model_2, then chances are model_1 will have better accuracy (although it's not 100% guaranteed).
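A minimal plain-Python sketch of that point, using made-up numbers and a hand-rolled MSE: scaling the loss function changes its value, but never changes which model comes out ahead.

```python
# Minimal sketch (made-up numbers): scaling a loss function changes the
# loss value, but not the ranking of models under that loss.
def mse(preds, targets):
    return sum((p - t) ** 2 for p, t in zip(preds, targets)) / len(preds)

targets = [1.1, 2.1, 3.1]
model_1 = [1.0, 2.0, 3.0]   # closer to the targets
model_2 = [1.5, 2.5, 3.5]   # further from the targets

loss_1 = mse(model_1, targets)
loss_2 = mse(model_2, targets)
print(loss_1, loss_2)             # model_1's loss is smaller

# Multiply the loss by 10: both values grow 10x, the ordering is unchanged.
print(10 * loss_1, 10 * loss_2)
```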

Julien

No, loss does not reflect accuracy. In your case, you should be using another metric to quantify your accuracy. Since you're using continuous target variables, you could use a metric like the mean squared error (MSE). Be careful with this: minimizing the MSE corresponds to assuming normally distributed errors. In any case, loss is relative and depends entirely on the loss function that you use in your optimization. A large loss does not imply a bad accuracy/MSE.
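As an illustration of how a huge MSE can coexist with predictions that are close in relative terms, here is a plain-Python sketch computing the MSE and the mean absolute percentage error on the values from the question:

```python
# Sketch using the values from the question: the squared errors are huge
# simply because the targets are on the order of 10^6-10^7, even though
# most predictions are within a few percent of their targets.
predictions = [8109436.0, 7734814.0, 8737677.0, 11230861.0, 3795826.75,
               3125072.75, 1699706.125, 5337285.0, 3474238.5]
targets = [8111607.0, 7580798.0, 8749436.0, 11183578.0, 3822811.0,
           3148031.0, 2343278.0, 5360924.0, 3536146.0]

n = len(predictions)
mse = sum((p - t) ** 2 for p, t in zip(predictions, targets)) / n
mape = sum(abs(p - t) / t for p, t in zip(predictions, targets)) / n * 100

print(f"MSE:  {mse:.3e}")    # around 5e10 -- "large" in absolute terms
print(f"MAPE: {mape:.2f}%")  # a few percent -- "accurate" in relative terms
```

The MSE is astronomically large simply because the target values are, while the mean percentage error stays under 4%.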

metahexane