How is the accuracy calculated when the problem is a regression one?
I'm working on a regression problem: predicting how much electricity each user uses each day. I built an LSTM model in Keras to do this time-series prediction. At first I used 'accuracy' as the metric, and when I run
model.fit(...,verbose=2,...)
val_acc
has a value after every epoch. But in my results that value never changes; it is always exactly the same.
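For context, the setup looks roughly like the sketch below (a minimal, simplified version: the layer sizes, window length, and dummy data are placeholders, not my exact configuration):

```python
import numpy as np
from tensorflow.keras.models import Sequential
from tensorflow.keras.layers import LSTM, Dense

# Dummy data: 1000 samples, 7-day window, 1 feature (daily usage).
X = np.random.rand(1000, 7, 1)
y = np.random.rand(1000, 1)

model = Sequential([
    LSTM(32, input_shape=(7, 1)),
    Dense(1),  # single continuous output for the regression target
])

# 'accuracy' is requested as a metric even though the loss is a regression loss.
model.compile(optimizer='adam', loss='mse', metrics=['accuracy'])

model.fit(X, y, validation_split=0.2, epochs=5, verbose=2)
```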
Then I realized that a regression problem has no real notion of accuracy, which made me wonder: how is that accuracy value actually being calculated?
My guess is that when the metric is 'accuracy' on a regression problem, accuracy is computed the same way as in a classification problem: the number of predicted values exactly equal to the true values, divided by the total number of samples.
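In other words, something like the sketch below (this only illustrates my guess, not confirmed Keras behaviour; the function name is mine):

```python
import numpy as np

# My guess: exact-match "accuracy" on continuous values.
def exact_match_accuracy(y_true, y_pred):
    return np.mean(y_true == y_pred)

y_true = np.array([1.20, 3.50, 2.75, 4.00])
y_pred = np.array([1.19, 3.50, 2.80, 4.00])

print(exact_match_accuracy(y_true, y_pred))  # 0.5 -- only exact matches count
```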
Am I right?