
I am training a 3D U-Net, but no matter how long I train, my training metric (mean Intersection over Union) stays constant at 0.4123 while my loss decreases steadily the longer I train.

IoU remains constant, but loss decreases

Epoch 1/5
4/4 [==============================] - 63s 5s/step - loss: 0.6805 - mean_io_u: 0.4123 - val_loss: 0.6551 - val_mean_io_u: 0.4242
Epoch 2/5
4/4 [==============================] - 17s 4s/step - loss: 0.6424 - mean_io_u: 0.4123 - val_loss: 0.6088 - val_mean_io_u: 0.4242
Epoch 3/5
4/4 [==============================] - 18s 5s/step - loss: 0.5921 - mean_io_u: 0.4123 - val_loss: 0.5423 - val_mean_io_u: 0.4242
Epoch 4/5
4/4 [==============================] - 18s 5s/step - loss: 0.5216 - mean_io_u: 0.4123 - val_loss: 0.4552 - val_mean_io_u: 0.4242
Epoch 5/5
4/4 [==============================] - 18s 5s/step - loss: 0.4459 - mean_io_u: 0.4123 - val_loss: 0.3880 - val_mean_io_u: 0.4242

My training dataset consists of 4 hyperspectral images of size 512x512x10, and my validation dataset consists of a single 512x512x10 image. All of them are float32 arrays normalized to the range 0.0 to 1.0.
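Roughly speaking, the arrays end up shaped like this (a simplified sketch; the zero-filled arrays are just placeholders for illustration, not my real loading code):

import numpy as np

# Placeholders: 512x512 images with 10 spectral bands, expanded with a
# trailing singleton axis to match Input((512, 512, 10, 1)) below.
X_train = np.zeros((4, 512, 512, 10, 1), dtype=np.float32)   # 4 training volumes, values in [0.0, 1.0]
y_train = np.zeros((4, 512, 512, 1), dtype=np.float32)       # binary masks matching the model's 2D output
X_test  = np.zeros((1, 512, 512, 10, 1), dtype=np.float32)   # 1 validation volume
y_test  = np.zeros((1, 512, 512, 1), dtype=np.float32)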

I have tried changing the model's concatenation axes, the optimizer, and the activation function, and even switched Python environments.

Here is my model definition:

import tensorflow as tf
from tensorflow.keras.layers import Flatten, Dense, Reshape, Conv2D
from tensorflow.keras.models import Model

IMG_HEIGHT = HEIGHT
IMG_WIDTH = WIDTH
IMG_CHANNELS = CHANNELS

# Each sample is a single-channel 5D volume: (height, width, spectral bands, 1)
inputs = tf.keras.layers.Input((IMG_HEIGHT, IMG_WIDTH, IMG_CHANNELS, 1))
#s = tf.keras.layers.Lambda(lambda x: x / 255)(inputs)  # not needed, data is already in [0.0, 1.0]
s = inputs

#Contraction path
c1 = tf.keras.layers.Conv3D(32, (3, 3, 3), activation='relu', padding='same')(s)
c1 = tf.keras.layers.Dropout(0.2)(c1)
c1 = tf.keras.layers.Conv3D(32, (3, 3, 3), activation='relu', padding='same')(c1)
p1 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c1)

c2 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', padding='same')(p1)
c2 = tf.keras.layers.Dropout(0.1)(c2)
c2 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', padding='same')(c2)
p2 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c2)

c3 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', padding='same')(p2)
c3 = tf.keras.layers.Dropout(0.2)(c3)
c3 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', padding='same')(c3)
p3 = tf.keras.layers.MaxPooling3D((2, 2, 2))(c3)

c4 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', padding='same')(p3)
c4 = tf.keras.layers.Dropout(0.2)(c4)
c4 = tf.keras.layers.Conv3D(256, (3, 3, 3), activation='relu', padding='same')(c4)


# Auxiliary dense head (not connected to the model's outputs below)
flat6 = Flatten()(c4)
output_1 = Dense(8, activation='sigmoid', name='output_1')(flat6)


#Expansive path
u6 = tf.keras.layers.Conv3DTranspose(128, (2, 2, 2), strides=(2, 2, 2), padding='same')(c4)
u6 = tf.keras.layers.concatenate([u6, c3], axis=3)
c6 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u6)
c6 = tf.keras.layers.Dropout(0.2)(c6)
c6 = tf.keras.layers.Conv3D(128, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c6)

u7 = tf.keras.layers.Conv3DTranspose(64, (2, 2, 2), strides=(2, 2, 2), padding='same')(c6)
u7 = tf.keras.layers.concatenate([u7, c2], axis=3)
c7 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u7)
c7 = tf.keras.layers.Dropout(0.2)(c7)
c7 = tf.keras.layers.Conv3D(64, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c7)

u8 = tf.keras.layers.Conv3DTranspose(32, (2, 2, 2), strides=(2, 2, 2), padding='same')(c7)
u8 = tf.keras.layers.concatenate([u8, c1], axis=3)
c8 = tf.keras.layers.Conv3D(32, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(u8)
c8 = tf.keras.layers.Dropout(0.1)(c8)
c8 = tf.keras.layers.Conv3D(32, (3, 3, 3), activation='relu', kernel_initializer='he_normal', padding='same')(c8)

"""
u9 = tf.keras.layers.Conv3DTranspose(16, (2, 2,2), strides=(2, 2,2), padding='same')(c8)
u9 = tf.keras.layers.concatenate([u9, c1], axis=3)
c9 = tf.keras.layers.Conv3D(16, (3, 3,3), activation='relu', kernel_initializer='he_normal', padding='same')(u9)
c9 = tf.keras.layers.Dropout(0.1)(c9)
c9 = tf.keras.layers.Conv3D(16, (3, 3,3), activation='relu', kernel_initializer='he_normal', padding='same')(c9)
"""

# Collapse the depth and channel axes so a 2D 1x1 convolution can produce a single-channel mask
conv3d_shape = c8.shape
c8 = Reshape((conv3d_shape[1], conv3d_shape[2], conv3d_shape[3] * conv3d_shape[4]))(c8)

output_2 = Conv2D(1, (1, 1), activation='sigmoid', name='output_2')(c8)

model = Model(inputs=[inputs], outputs=[output_2])

model.compile(optimizer=tf.keras.optimizers.legacy.SGD(),
              loss='binary_crossentropy',
              metrics=[tf.keras.metrics.MeanIoU(num_classes=2)])  # the mean_io_u in the logs above; num_classes=2 for the binary mask

And here is my fit call:

history = model.fit(X_train, y_train, epochs=5, batch_size=1, validation_data=(X_test, y_test))
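For reference, this is roughly how I understand the IoU should be computed on the thresholded sigmoid output (a hand-rolled sketch assuming binary masks and a 0.5 threshold, not the built-in Keras metric):

import numpy as np

def binary_iou(y_true, y_pred, threshold=0.5):
    # Turn sigmoid probabilities (and float masks) into hard binary masks.
    pred_mask = (y_pred >= threshold).astype(np.float32)
    true_mask = (y_true >= threshold).astype(np.float32)
    intersection = np.sum(pred_mask * true_mask)
    union = np.sum(np.clip(pred_mask + true_mask, 0.0, 1.0))
    return intersection / union if union > 0 else 1.0

probs = model.predict(X_test)                      # shape (1, 512, 512, 1), values in [0.0, 1.0]
print("thresholded IoU:", binary_iou(y_test, probs))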
