
I have a problem where I'm trying to create an AI model, but every time I run the code

hist=model.fit(train,epochs=20,validation_data=val,callbacks=[tensorboard_callback])

I get a loss that just keeps growing exponentially in the negative direction:

Epoch 1/20
18/18 [==============================] - 16s 790ms/step - loss: -1795.6414 - accuracy: 0.1319 - val_loss: -8472.8076 - val_accuracy: 0.1625
Epoch 2/20
18/18 [==============================] - 14s 718ms/step - loss: -79825.2422 - accuracy: 0.1493 - val_loss: -311502.5625 - val_accuracy: 0.1250
Epoch 3/20
18/18 [==============================] - 14s 720ms/step - loss: -1431768.2500 - accuracy: 0.1337 - val_loss: -3777775.2500 - val_accuracy: 0.1375
Epoch 4/20
18/18 [==============================] - 14s 716ms/step - loss: -11493728.0000 - accuracy: 0.1354 - val_loss: -28981542.0000 - val_accuracy: 0.1312
Epoch 5/20
18/18 [==============================] - 14s 747ms/step - loss: -61516224.0000 - accuracy: 0.1372 - val_loss: -127766784.0000 - val_accuracy: 0.1250
Epoch 6/20
18/18 [==============================] - 14s 719ms/step - loss: -251817104.0000 - accuracy: 0.1302 - val_loss: -401455168.0000 - val_accuracy: 0.1813
Epoch 7/20
18/18 [==============================] - 14s 755ms/step - loss: -731479360.0000 - accuracy: 0.1476 - val_loss: -1354252672.0000 - val_accuracy: 0.1375
Epoch 8/20
18/18 [==============================] - 14s 753ms/step - loss: -2031392128.0000 - accuracy: 0.1354 - val_loss: -3004264448.0000 - val_accuracy: 0.1625
Epoch 9/20
18/18 [==============================] - 14s 711ms/step - loss: -4619375104.0000 - accuracy: 0.1302 - val_loss: -7603259904.0000 - val_accuracy: 0.1125
Epoch 10/20
 2/18 [==>...........................] - ETA: 10s - loss: -7608679424.0000 - accuracy: 0.1094

This is the loss function I am using:

model.compile(optimizer='adam',
              loss=tf.keras.losses.BinaryCrossentropy(),
              metrics=['accuracy'])

This is my model:

model.add(Conv2D(16,(3,3),1,activation='relu',input_shape=(256,256,3)))
model.add(MaxPooling2D())

model.add(Conv2D(32,(3,3),1,activation='relu'))
model.add(MaxPooling2D())

model.add(Conv2D(16,(3,3),1,activation='relu'))
model.add(MaxPooling2D())

model.add(Flatten())

model.add(Dense(256,activation='relu'))
model.add(Dense(1,activation='sigmoid'))

I've normalized the data with

data=data.map(lambda x,y: (x/255, y))

so the pixel values are between 0 and 1.

I've read something online about GPUs, so I'm not sure if that's the cause. I can't find a fix, but I'm using this to speed things up:

gpus = tf.config.experimental.list_physical_devices('GPU')
for gpu in gpus:
    tf.config.experimental.set_memory_growth(gpu, True)

Any help is welcome!

I'm trying to train the model so that the loss gets closer to zero and the accuracy closer to 1, but instead the loss is just diving exponentially toward minus infinity.

Alphacell
    Can you check if the labels are only 0s and 1s? – Frightera Jun 22 '23 at 20:59
  • @Frightera By labels do you mean the names of the classes or the x values? If you mean the class labels, I have 8 classes, numbered 0-7. My goal was to make an AI that would recognize car models. If you meant the x values, I've checked that they are between 0 and 1 using `batch=data_iterator.next() for batch in data_iterator.next(): if batch[0].max()>1 or batch[0].min()<0: print("Doesn't work")` After running that code it doesn't show any error, so I don't think it's that. I could be wrong though... – Alphacell Jun 23 '23 at 16:50
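Note that the check quoted above only inspects the images (`batch[0]`); the comment is asking about the labels, which live in `batch[1]`. A minimal sketch of that label check, using a hypothetical NumPy array as a stand-in for one batch of labels:

```python
import numpy as np

# Hypothetical stand-in for batch[1] (the labels of one batch).
labels = np.array([0, 3, 7, 1, 5])

# BinaryCrossentropy is only valid when every label is 0 or 1.
is_binary = np.isin(labels, [0, 1]).all()
print(is_binary)  # False here: 3, 5, and 7 fall outside {0, 1}
```

With 8 car-model classes labeled 0-7, this check would fail, which points at the loss-function mismatch rather than the image normalization.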

1 Answer


If your labels have shape (num_samples, 1), i.e. a single column of integers from 0 to 7 (8 classes), then `BinaryCrossentropy` is the wrong loss: it assumes labels in {0, 1}, which is why your loss goes negative and diverges. For this classification task you should do one of the following:

transform your labels to one-hot vectors of shape (num_samples, 8), and change your last layer and loss function:

model.add(Dense(8, activation='softmax'))
model.compile(optimizer='adam',
              loss=tf.keras.losses.CategoricalCrossentropy(),
              metrics=['accuracy'])
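A minimal sketch of the one-hot transform, using NumPy and a hypothetical labels array (in a `tf.data` pipeline the equivalent would be something like `data.map(lambda x, y: (x, tf.one_hot(y, 8)))`):

```python
import numpy as np

# Hypothetical integer labels, one column, classes 0-7.
labels = np.array([0, 3, 7, 1])

# One-hot encode to shape (num_samples, 8), matching the
# 8-unit softmax output layer.
one_hot = np.eye(8)[labels]
print(one_hot.shape)  # (4, 8)
```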

or, without changing your labels, use the sparse variant of the loss, which accepts integer class indices directly:

model.add(Dense(8, activation='softmax'))
model.compile(optimizer='adam',
              loss=tf.keras.losses.SparseCategoricalCrossentropy(),
              metrics=['accuracy'])
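To see why the mismatch produces exactly the symptom in the question, here is the per-sample binary cross-entropy formula evaluated directly in NumPy. With a label of 7, the `(1 - y)` term becomes -6, so the loss is negative and unbounded below as the predicted probability approaches 1:

```python
import numpy as np

def binary_crossentropy(y, p):
    # Per-sample binary cross-entropy: -(y*log(p) + (1-y)*log(1-p)).
    # Well-defined only for labels y in {0, 1}.
    return -(y * np.log(p) + (1 - y) * np.log(1 - p))

# A valid binary label: the loss is positive and bounded below by 0.
print(binary_crossentropy(1.0, 0.9))

# An integer class label like 7: the loss is negative and gets more
# negative as the sigmoid output is pushed toward 1, which is exactly
# the diverging negative loss in the training log above.
for p in (0.9, 0.99, 0.999):
    print(binary_crossentropy(7.0, p))
```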
Ioannis Nasios