I have a neural network whose input is a flattened (645, 39) array, with 2 hidden layers and a single output.

import tensorflow as tf
import keras

import numpy as np
import matplotlib.pyplot as plt

import h5py

DATA_CHUNKS = (645, 39)  # shape of one training example, as described above

model = keras.Sequential([
    keras.layers.Flatten(input_shape=DATA_CHUNKS),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(1, activation='softmax')
])

model.compile(optimizer='adam',
              loss='categorical_crossentropy',
              metrics=['accuracy'])
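
As a sanity check, model.summary() confirms that the flatten layer feeds 645 * 39 = 25155 features into the first Dense layer (expected output shapes shown as comments; the default layer names may differ):

model.summary()
# flatten (Flatten) -> (None, 25155)   # 645 * 39
# dense (Dense)     -> (None, 128)
# dense_1 (Dense)   -> (None, 128)
# dense_2 (Dense)   -> (None, 1)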

To feed the training data and labels to it, I have created a generator that takes chunks of data from an HDF5 dataset, along with their labels, and yields them.

def get_data(file=all_data_path, chunks=10):
    with h5py.File(file, 'r') as hf:
        n = hf["mfccs"].shape[0]
        # Step through the dataset a whole chunk at a time, striding by
        # `chunks` rows so the chunks don't overlap; the last chunk may
        # be smaller than `chunks`.
        for start in range(0, n, chunks):
            X = hf["mfccs"][start : start + chunks]
            y = hf["readers"][start : start + chunks]
            yield (X, y)
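
Note that a plain Python generator like this is exhausted after one pass, so it only supports epochs=1, and the shuffle argument of model.fit is ignored for generator input. One way around both is to wrap the generator in tf.data; a minimal sketch, assuming float32 MFCC features and float32 labels:

# from_generator calls get_data afresh each time the dataset is
# iterated, so it can be consumed for any number of epochs. The
# leading None in each shape allows the last, smaller chunk.
dataset = tf.data.Dataset.from_generator(
    get_data,
    output_signature=(
        tf.TensorSpec(shape=(None, 645, 39), dtype=tf.float32),  # X chunks
        tf.TensorSpec(shape=(None, 1), dtype=tf.float32),        # y chunks
    ),
)

hist = model.fit(dataset, epochs=5)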

To show what my generator yields:

X, y = next(get_data())
print(X.shape)
print(y.shape)

has this output:

(10, 645, 39)
(10, 1)

When I try to fit the model with hist = model.fit(get_data(), epochs=1, shuffle=True), the accuracy and loss are stuck at 0, which seems odd.

Although this question is relevant, using keras instead of tf.keras didn't seem to change anything. I suspect the problem is something in my generator, but I am not sure what it is supposed to yield. Thank you in advance!

BillTheKid
    No, the problem is that you cannot use softmax with one neuron (think about it, look at the equation and which value would be produced). – Dr. Snoopy Feb 20 '22 at 10:52
  • @Dr.Snoopy Yup, that was it, thank you! I changed the activation function to sigmoid and the loss function to binary cross-entropy, and thanks to your comment it now works! – BillTheKid Feb 20 '22 at 10:59
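
As the comments point out, softmax over a single output neuron is degenerate: for any logit z it returns e^z / e^z = 1, so the model predicts 1.0 for every input and the categorical cross-entropy -y * log(1) is always 0, which is exactly the stuck behaviour described above. For future readers, a minimal sketch of the fix from the comments, assuming the "readers" labels are binary 0/1:

import tensorflow as tf
import keras

# Softmax over a single logit is constant: e^z / e^z == 1 for any z.
print(tf.nn.softmax([[2.0]]).numpy())   # [[1.]]
print(tf.nn.softmax([[-7.0]]).numpy())  # [[1.]]

# The fix: a sigmoid output unit with binary cross-entropy, so the
# output can actually move between 0 and 1.
model = keras.Sequential([
    keras.layers.Flatten(input_shape=(645, 39)),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(128, activation='relu'),
    keras.layers.Dense(1, activation='sigmoid')
])

model.compile(optimizer='adam',
              loss='binary_crossentropy',
              metrics=['accuracy'])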

0 Answers