
In this PyTorch neural network tutorial (tutorial link),

I'm confused about why we need to apply ReLU before max pooling.
Aren't the pixel values in the image already positive?
I don't see why relu, i.e. max(0, x), is needed.
Can anybody give me some advice on this?

    import torch.nn as nn
    import torch.nn.functional as F

    class Net(nn.Module):
        ...  # (init function omitted)

        def forward(self, x):
            # Max pooling over a (2, 2) window, applied after ReLU
            x = F.max_pool2d(F.relu(self.conv1(x)), (2, 2))
HandHand
  • Does this answer your question? [Activation function after pooling layer or convolutional layer?](https://stackoverflow.com/questions/35543428/activation-function-after-pooling-layer-or-convolutional-layer) – Dishin H Goyani Jan 17 '20 at 07:43

1 Answer


The weights of the neural net can be negative, so even with all-positive pixel inputs a convolution can produce negative activations. By applying the ReLU function, you're only activating the nodes that serve the purpose and zeroing out the rest.
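As a quick check, here is a minimal sketch (assuming a randomly initialized `nn.Conv2d` and an all-positive input, like normalized pixel values) showing that the convolution output can still be negative:

    import torch
    import torch.nn as nn

    torch.manual_seed(0)

    conv = nn.Conv2d(1, 1, kernel_size=3)  # weights can be negative after random init
    x = torch.rand(1, 1, 8, 8)             # all-positive input in [0, 1)

    out = conv(x)
    print(out.min().item())  # typically negative despite the positive input

Because the filter weights (and the bias) can be negative, the weighted sum at any output location can dip below zero, which is exactly what ReLU clips away.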

paul-shuvo
  • 1. It is possible that the weights are negative but the outputs of the convolutional layer are still positive. 2. There is no difference between ordering the max-pooling layer and the ReLU layer either way. Correct me if I'm wrong. – khanh Jan 18 '20 at 07:56
  • 2. You're right (see the sketch below), but for 1. that depends on the weights and the input to the layer. – paul-shuvo Jan 18 '20 at 20:00
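To verify point 2 from the comments: since ReLU is monotonically non-decreasing, max(relu(x)) equals relu(max(x)) over any pooling window, so the two orderings give identical results (pooling first is just cheaper, since ReLU then runs on fewer elements). A small sketch with a random tensor:

    import torch
    import torch.nn.functional as F

    x = torch.randn(1, 1, 8, 8)

    a = F.max_pool2d(F.relu(x), 2)  # ReLU before pooling
    b = F.relu(F.max_pool2d(x, 2))  # ReLU after pooling

    print(torch.equal(a, b))  # True: ReLU commutes with max pooling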