
I am trying to apply data augmentation to a binary image classification problem in the way described in the TensorFlow docs: https://www.tensorflow.org/tutorials/images/classification#data_augmentation

My model is this:

model = Sequential([
  data_augmentation,
  layers.experimental.preprocessing.Rescaling(1./255),
  layers.Conv2D(16, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Dropout(0.2),
  layers.Conv2D(32, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Dropout(0.2),
  layers.Conv2D(64, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Flatten(),
  layers.Dense(128, activation='relu'),
  layers.Dropout(0.5),
  layers.Dense(1, activation='sigmoid')
])

When my data augmentation layer is like this, the model compiles without error:

data_augmentation = keras.Sequential(
  [
    layers.experimental.preprocessing.RandomFlip("horizontal",
                                                 input_shape=(150, 150, 3)),
    layers.experimental.preprocessing.RandomRotation(0.2),
    layers.experimental.preprocessing.RandomZoom(0.2)
  ]
)

If I try to introduce RandomHeight() and/or RandomWidth() in my augmentation layer, I receive the following error when creating the model:

ValueError: The last dimension of the inputs to `Dense` should be defined. Found `None`.
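
For example, a pipeline like this (the 0.2 factors are just illustrative) reproduces the error:

data_augmentation = keras.Sequential(
  [
    layers.experimental.preprocessing.RandomFlip("horizontal",
                                                 input_shape=(150, 150, 3)),
    layers.experimental.preprocessing.RandomRotation(0.2),
    layers.experimental.preprocessing.RandomZoom(0.2),
    layers.experimental.preprocessing.RandomHeight(0.2),
    layers.experimental.preprocessing.RandomWidth(0.2)
  ]
)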

Any idea as to why this is happening and how to resolve it?

viperjonis

1 Answer


You can check the output shape that RandomHeight/RandomWidth produce. The source code of the RandomHeight class computes its output shape as follows (RandomWidth is analogous, with None in the width dimension instead):

return tensor_shape.TensorShape(
    [input_shape[0], None, input_shape[2], input_shape[3]])

Suppose I use RandomHeight as the first layer, with an input shape of 150 × 150 RGB images.
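
A minimal sketch of such a pipeline (the 0.2 factors are illustrative; the layer order matches the summary below):

data_augmentation = keras.Sequential(
  [
    layers.experimental.preprocessing.RandomHeight(0.2, input_shape=(150, 150, 3)),
    layers.experimental.preprocessing.RandomFlip("horizontal"),
    layers.experimental.preprocessing.RandomRotation(0.2),
    layers.experimental.preprocessing.RandomZoom(0.2)
  ]
)

The output shapes can be confirmed with: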

data_augmentation.summary()
Layer (type)                 Output Shape              Param #   
=================================================================
random_height_2 (RandomHeigh (None, None, 150, 3)      0         
_________________________________________________________________
random_flip_2 (RandomFlip)   (None, None, 150, 3)      0         
_________________________________________________________________
random_rotation_2 (RandomRot (None, None, 150, 3)      0         
_________________________________________________________________
random_zoom_2 (RandomZoom)   (None, None, 150, 3)      0         

If you build the model with this pipeline but leave out the Dense layers, the model summary shows:

dropout_6 (Dropout)          (None, None, 18, 64)      0         
_________________________________________________________________
flatten_6 (Flatten)          (None, None)              0         

The (None, None) output of Flatten() is what causes the error: Dense requires its last input dimension to be defined. You can solve this by using tf.keras.layers.GlobalMaxPooling2D() instead of Flatten(), since it reduces each feature map to a single value and therefore always yields a defined last dimension (the number of channels).
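
For illustration, a sketch of the question's model with only that one swap:

model = Sequential([
  data_augmentation,
  layers.experimental.preprocessing.Rescaling(1./255),
  layers.Conv2D(16, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Dropout(0.2),
  layers.Conv2D(32, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.Dropout(0.2),
  layers.Conv2D(64, 3, padding='same', activation='relu'),
  layers.MaxPooling2D(),
  layers.GlobalMaxPooling2D(),  # was Flatten(); output shape is now (None, 64)
  layers.Dense(128, activation='relu'),
  layers.Dropout(0.5),
  layers.Dense(1, activation='sigmoid')
])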

While this solves the dimension problem caused by the Flatten() layer, note that GlobalMaxPooling2D behaves quite differently: it keeps only the maximum of each feature map instead of all spatial values.

You can check this question for the differences: https://stackoverflow.com/questions/49295311/what-is-the-difference-between-flatten-and-globalaveragepooling2d-in-keras
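
To make the difference concrete, a quick shape check (18 × 18 × 64 matches the dropout_6 output above when the input is a fixed 150 × 150):

import tensorflow as tf

x = tf.random.uniform((1, 18, 18, 64))
print(tf.keras.layers.Flatten()(x).shape)             # (1, 20736): every spatial value kept
print(tf.keras.layers.GlobalMaxPooling2D()(x).shape)  # (1, 64): one max per channel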

Frightera
    While it actually solves the dimension problem, the behaviour of `GlobalMaxPooling2D` is quite different from `Flatten`, and you might want at least link to that question: https://stackoverflow.com/questions/49295311/what-is-the-difference-between-flatten-and-globalaveragepooling2d-in-keras – Lescurel Mar 05 '21 at 09:17
  • Yes, I know, I should have mentioned it, thank you @Lescurel – Frightera Mar 05 '21 at 09:33
  • So there is no way to keep `Flatten` if I include the Height/Width augmentations in the model then? Bit of a letdown as augmentation via the `ImageDataGenerator` is much less picky with model architecture, guess that is why these augmentations are still under the `experimental` library in tensorflow. – viperjonis Mar 05 '21 at 12:22
  • I guess no, flattening will cause a dimension error. But maybe you can create your own augmentation layers with tf.image using subclassing. – Frightera Mar 05 '21 at 13:24
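
A minimal sketch of what such a subclassed layer could look like (the class name RandomHeightFixed is hypothetical; resizing back to a fixed target size is one way to keep the output shape defined so Flatten() still works):

import tensorflow as tf

class RandomHeightFixed(tf.keras.layers.Layer):
    # Hypothetical sketch: randomly stretch/squash the height during training
    # with tf.image.resize, then resize back to a fixed size so the output
    # shape stays fully defined for downstream Flatten()/Dense layers.
    def __init__(self, factor=0.2, target_size=(150, 150), **kwargs):
        super().__init__(**kwargs)
        self.factor = factor
        self.target_size = target_size

    def call(self, inputs, training=None):
        if training:
            height = tf.shape(inputs)[1]
            width = tf.shape(inputs)[2]
            scale = tf.random.uniform([], 1.0 - self.factor, 1.0 + self.factor)
            new_height = tf.cast(tf.cast(height, tf.float32) * scale, tf.int32)
            inputs = tf.image.resize(inputs, [new_height, width])
            inputs = tf.image.resize(inputs, self.target_size)
        return inputs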