I am doing deep learning using a multi-layer perceptron for regression. The loss curve flattens out by the third epoch, but the accuracy curve stays flat from the very beginning. I wonder whether this makes sense.
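For reference, a minimal sketch of the kind of setup I mean (layer sizes, data shapes, and hyperparameters are placeholders, not my actual code):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers

# Placeholder regression data -- shapes are illustrative only
X = np.random.rand(1000, 20).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

# A small MLP for regression
model = keras.Sequential([
    layers.Dense(64, activation="relu", input_shape=(20,)),
    layers.Dense(64, activation="relu"),
    layers.Dense(1),  # linear output for regression
])

# 'accuracy' is requested alongside a regression loss -- this is the
# combination behind the curves described above
model.compile(optimizer="adam", loss="mse", metrics=["accuracy"])

history = model.fit(X, y, epochs=10, validation_split=0.2)
```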
- This can be a result of under-fitting. Try increasing the number of layers (making the net deeper). – Tushar Gupta Sep 12 '19 at 06:28
- Possible duplicate of [Loss & accuracy - Are these reasonable learning curves?](https://stackoverflow.com/questions/47817424/loss-accuracy-are-these-reasonable-learning-curves) – desertnaut Sep 12 '19 at 15:09
- @TusharGupta Thank you for the reply. I actually added another two layers and it still looks like this. Do you know how Keras calculates this "accuracy" plot? – Zheng Feng Oct 08 '19 at 17:01
- @desertnaut Thank you for the reply, but I don't think it is a duplicate. – Zheng Feng Oct 08 '19 at 17:01
1 Answer
Since you didn't provide the code, it is hard to narrow down the problem. That being said, here are some pointers that might help:
- Your validation set is either too small or a bad representation of your training set. (Bear in mind that if you use `validation_split` in the `fit` function, Keras takes only the last portion of your training set and keeps that same slice for all epochs.)
- You are not using any regularization (Dropout, weight regularizers, constraints); see the sketch after this list.
- The model could be too small (in layers and neurons), so it is underfitting.
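A minimal sketch putting these pointers together (all layer sizes, rates, and split values below are placeholder choices, not tuned recommendations):

```python
import numpy as np
from tensorflow import keras
from tensorflow.keras import layers, regularizers

# Placeholder data standing in for the asker's training set
X = np.random.rand(1000, 20).astype("float32")
y = np.random.rand(1000, 1).astype("float32")

# A somewhat larger MLP with regularization applied
model = keras.Sequential([
    layers.Dense(128, activation="relu", input_shape=(20,),
                 kernel_regularizer=regularizers.l2(1e-4)),  # weight regularizer
    layers.Dropout(0.3),                                     # dropout
    layers.Dense(128, activation="relu",
                 kernel_constraint=keras.constraints.MaxNorm(3.0)),  # constraint
    layers.Dropout(0.3),
    layers.Dense(1),  # linear output for regression
])

model.compile(optimizer="adam", loss="mse")

# validation_split=0.2 takes the LAST 20% of the training data,
# before shuffling, and reuses that same slice every epoch --
# shuffle the data yourself first if its order is not random.
history = model.fit(X, y, epochs=30, validation_split=0.2)
```

If the curves still look the same, passing an explicit `validation_data` set that you shuffled and split yourself is a quick way to rule out the `validation_split` issue.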
Hope these pointers help you with your problem.

Coderji