Is there any difference between using ReLU as the activation function of a layer versus adding it as a separate layer? For example:
Conv2D(filters=8, kernel_size=(3, 3), activation='relu', padding='SAME', name='conv_2')
or
Conv2D(filters=8, kernel_size=(3, 3), padding='SAME', name='conv_2'),
ReLU()
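
To make the comparison concrete, here is a minimal sketch of both variants wrapped in tf.keras.Sequential models (the input shape is just an assumed placeholder for illustration):

import tensorflow as tf
from tensorflow.keras.layers import Conv2D, ReLU, Input

# Variant A: ReLU applied via the activation argument of Conv2D
model_a = tf.keras.Sequential([
    Input(shape=(32, 32, 3)),  # assumed input shape, purely for illustration
    Conv2D(filters=8, kernel_size=(3, 3), activation='relu', padding='SAME', name='conv_2'),
])

# Variant B: a Conv2D with no activation, followed by a standalone ReLU layer
model_b = tf.keras.Sequential([
    Input(shape=(32, 32, 3)),
    Conv2D(filters=8, kernel_size=(3, 3), padding='SAME', name='conv_2'),
    ReLU(),
])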