I have a neural network for a regression task, meaning the output is a real-valued number in the range 0 to 1.
I used dropout on all layers, and the error suddenly increased and never converged.
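For concreteness, here is a minimal PyTorch sketch of roughly what I mean by "dropout on all layers" (the layer sizes, dropout rate, and optimizer are placeholders, not my actual settings):

```python
import torch
import torch.nn as nn

# Sketch of the setup: dropout after every hidden layer,
# including right before the single regression output.
# Layer sizes and p=0.5 are illustrative placeholders.
model = nn.Sequential(
    nn.Linear(20, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 64),
    nn.ReLU(),
    nn.Dropout(p=0.5),
    nn.Linear(64, 1),
    nn.Sigmoid(),  # output constrained to (0, 1)
)

loss_fn = nn.MSELoss()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)

# One training step on dummy data, just to show the regression setup.
x = torch.randn(32, 20)
y = torch.rand(32, 1)  # targets in [0, 1]
optimizer.zero_grad()
loss = loss_fn(model(x), y)
loss.backward()
optimizer.step()
```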
Is dropout usable for regression tasks? If we drop some nodes, the last layer will have fewer nodes, so the predicted value will definitely be very different from the actual value. The backpropagated error will then be large and the model will be destroyed. Why, then, should we use dropout for regression tasks in neural networks?