
I need to know how data is padded in a 1D convolutional layer using Keras with the Theano backend. I use "same" padding.

Assume we have an output_length of 8 and a kernel_size of 4. According to the original Keras code we have a padding of 8//4 == 2. However, when adding two zeros at both the left and the right end of my horizontal data, I could compute 12 - 4 + 1 = 9 convolutions instead of 8.

Can somebody explain to me how the data is padded? Where are zeros added, and how do I compute the number of padding values on the right and left side of my data?
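To make my count explicit (just the arithmetic, nothing Keras-specific): the number of valid kernel positions on a padded sequence is padded_length - kernel_size + 1.

input_length = 8
kernel_size = 4

# with 2 zeros on each side, as computed above:
padded_length = input_length + 2 + 2      # 12
print(padded_length - kernel_size + 1)    # 9 positions, not 8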


1 Answer


How to test the way Keras pads the sequences:

A very simple test you can do is to create a model with a single convolutional layer, force its weights to be 1 and its biases to be 0, and feed it an input of ones to see the output:

from keras.layers import *
from keras.models import Model
import numpy as np


# creating the model
inp = Input((8, 1))
out = Conv1D(filters=1, kernel_size=4, padding='same')(inp)
model = Model(inp, out)


# adjusting the weights
ws = model.layers[1].get_weights()

ws[0] = np.ones(ws[0].shape)   # weights
ws[1] = np.zeros(ws[1].shape)  # biases

model.layers[1].set_weights(ws)

# predicting the result for a sequence with 8 elements
testData = np.ones((1, 8, 1))
print(model.predict(testData))

The output of this code is:

[[[ 2.]  # a result of 2 shows that only 2 of the 4 kernel weights overlapped real data
  [ 3.]  # a result of 3 shows that only 3 of the 4 kernel weights overlapped real data
  [ 4.]  # a result of 4 shows the full kernel overlapped real data
  [ 4.]
  [ 4.]
  [ 4.]
  [ 4.]
  [ 3.]]]
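One way to read those border values (my own addition, assuming the all-ones input and weights from the test above): each output equals the number of real, non-padded samples under the kernel, so the first and last values reveal the padding directly.

# with all-ones input and weights, each output counts the real samples
# under the kernel, so the borders expose the padding (my reading of
# the test above, not a Keras API)
kernel_size = 4
output = [2, 3, 4, 4, 4, 4, 4, 3]

left_pad = kernel_size - output[0]    # 4 - 2 = 2 zeros on the left
right_pad = kernel_size - output[-1]  # 4 - 3 = 1 zero on the right
print(left_pad, right_pad)            # 2 1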

So we can conclude that:

  • Keras adds the padding to the input before performing the convolutions, not to the output afterwards, so the border results are not zero.
  • Keras distributes the padding as equally as possible, and when the total number of zeros is odd, the extra zero goes first (on the left side); a general formula sketch follows below.
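In formula form (my generalization from this single test, so treat it as an assumption rather than documented Keras behavior; note the comments below, where TensorFlow is reported to put the extra zero on the right instead, and newer Keras versions may differ):

def same_padding_split(kernel_size):
    # total zeros needed so that output length == input length (stride 1)
    total = kernel_size - 1
    left = total - total // 2   # the extra zero lands on the left when total is odd
    right = total // 2
    return left, right

print(same_padding_split(4))  # (2, 1) -- matches the test above
print(same_padding_split(3))  # (1, 1) -- symmetric for odd kernel sizes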

So Keras made the input data look like this before applying the convolutions:

[0,0,1,1,1,1,1,1,1,1,0]
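As a sanity check (my addition, not part of the original answer), a plain NumPy "valid" convolution over that hand-padded sequence reproduces the Keras output exactly:

import numpy as np

# manually padded input: 2 zeros on the left, 1 zero on the right
padded = np.array([0, 0, 1, 1, 1, 1, 1, 1, 1, 1, 0], dtype=float)
kernel = np.ones(4)  # same all-ones kernel as the Keras layer above

# 'valid' mode computes one output per full kernel position, with no extra
# padding (np.convolve flips the kernel, which is irrelevant for all ones)
print(np.convolve(padded, kernel, mode='valid'))
# [2. 3. 4. 4. 4. 4. 4. 3.]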
Daniel Möller
  • Thanks. I am aware of the implications of padding="same". However, I need to know exactly what the padding looks like when the kernel has an even size. In TensorFlow the padding is distributed alternately on both sides, starting on the right side (see the second answer here: http://stackoverflow.com/questions/37674306/what-is-the-difference-between-same-and-valid-padding-in-tf-nn-max-pool-of-t). – null May 16 '17 at 16:12
  • See the test at the end of my answer; it will surely answer your question. – Daniel Möller May 16 '17 at 17:08
  • Are you sure you can call the output of Conv2D? Your test example cannot be executed on my machine. – null May 16 '17 at 17:36
  • Sorry, I hadn't tested the code; it was supposed to be Conv1D. I have tested it now, and made the weights all 1 and the biases all 0 to see the result better. – Daniel Möller May 16 '17 at 18:10
  • Thank you! Just a quick addition: as far as I know, data is always padded before performing convolutions. This is not a Keras feature, but the normal behavior of a convolutional layer. – null May 16 '17 at 21:07
  • Is it possible to make the result symmetrical? – mrgloom Jan 10 '20 at 13:28
  • What do you mean, @mrgloom? As far as I understand, and as demonstrated here, they are symmetrical. If you use an even kernel size, it's impossible to have perfect symmetry. – Daniel Möller Jan 10 '20 at 14:09
  • I mean the padding is not symmetrical, and the result is not symmetrical either when an even kernel size is used; does it cause problems in practice? – mrgloom Jan 10 '20 at 14:46
  • No problem. You might, if you want, use custom padding and alternate the longer side, but I believe this is excessive perfectionism. – Daniel Möller Jan 10 '20 at 14:48
  • For `keras.__version__ == 2.5.0` I get a different result. For `Conv1D` the padding is `[0,1,1,1,1,1,1,1,1,0,0]` and for `Conv1DTranspose` it is `[0,0,1,1,1,1,1,1,1,1,0]`. – Olli Niemitalo Aug 10 '21 at 09:31