I have a dataframe with two columns; the first contains a sentence and the second a target label (9 labels in total, and a sentence can be classified under more than one label, i.e. multi-label).
I have used word2vec to vectorise the text, which results in a vector of length 64 for each sentence.
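For context, the vectorisation is roughly along these lines (the gensim usage, column name, and helper are illustrative, not my exact code):

import numpy as np
from gensim.models import Word2Vec

# Illustrative sketch: train a 64-dimensional word2vec model on the
# tokenised sentences, then average word vectors into one vector per sentence.
sentences = [s.split() for s in df["sentence"]]  # "sentence" is a placeholder column name
w2v = Word2Vec(sentences, vector_size=64, min_count=1)

def sentence_vector(tokens):
    vecs = [w2v.wv[t] for t in tokens if t in w2v.wv]
    return np.mean(vecs, axis=0) if vecs else np.zeros(64, dtype=np.float32)

df_train_title_train = np.stack([sentence_vector(t) for t in sentences])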
The initial problem I had was:
TensorFlow - ValueError: Failed to convert a NumPy array to a Tensor (Unsupported object type float)
To work around this, I converted the np.array with:
train_inputs = tf.convert_to_tensor([df_train_title_train])
But now I am getting a new problem (see below). I have been researching Stack Overflow and other sources for days and am struggling to get my simple neural network to work.
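As a sanity check, I can reproduce where the leading dimension comes from: wrapping the array in a Python list before tf.convert_to_tensor adds an extra axis of size 1 (minimal illustration, not my real data):

import numpy as np
import tensorflow as tf

# Minimal illustration: wrapping the array in a Python list before
# converting adds a leading dimension of size 1.
arr = np.random.rand(5, 64).astype(np.float32)
print(tf.convert_to_tensor(arr).shape)    # (5, 64)
print(tf.convert_to_tensor([arr]).shape)  # (1, 5, 64)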
print(train_inputs.shape)
print(train_targets.shape)
print(validation_inputs.shape)
print(validation_targets.shape)
print(train_inputs[0].shape)
print(train_targets[0].shape)
(1, 63586, 64)
(63586, 9)
(1, 7066, 64)
(7066, 9)
(63586, 64)
(9,)
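The (1, 63586, 64) inputs versus (63586, 9) targets look like the mismatch. If the extra leading dimension is the problem, I assume something like this would drop it (names match my variables above; untested beyond checking the shapes):

# Assumption: removing the leading batch-of-1 axis should align the
# number of samples between inputs and targets.
train_inputs = tf.squeeze(train_inputs, axis=0)            # -> (63586, 64)
validation_inputs = tf.squeeze(validation_inputs, axis=0)  # -> (7066, 64)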
# Set the input and output sizes
input_size = 64
output_size = 9
# Use the same size for all hidden layers. Not a necessity.
hidden_layer_size = 64
# define what the model will look like
model = tf.keras.Sequential([
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),  # 1st hidden layer
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),  # 2nd hidden layer
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),  # 3rd hidden layer
    tf.keras.layers.Dense(output_size, activation='softmax')      # output layer
])
# model.compile(optimizer='adam', loss='categorical_crossentropy', metrics=['accuracy'])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])
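A side note I am unsure about: since a sentence can carry more than one label, my understanding is that binary_crossentropy pairs with a per-label sigmoid output, while softmax assumes exactly one label per sample. A sketch of that variant (same architecture, only the output activation changes):

# Assumption: for multi-label targets (possibly several 1s per row),
# a sigmoid output gives an independent probability per label.
model = tf.keras.Sequential([
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),
    tf.keras.layers.Dense(hidden_layer_size, activation='relu'),
    tf.keras.layers.Dense(output_size, activation='sigmoid')  # per-label probability
])
model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])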
### Training
# That's where we train the model we have built.
# set the batch size
batch_size = 10
# set a maximum number of training epochs
max_epochs = 10
# fit the model
# note that this time the train and validation data are plain tensors, not batched iterators
model.fit(train_inputs,                                     # train inputs
          train_targets,                                    # train targets
          batch_size=batch_size,                            # batch size
          epochs=max_epochs,                                # epochs to train for (assuming early stopping doesn't kick in)
          validation_data=(validation_inputs, validation_targets),  # validation data
          verbose=2)                                        # making sure we get enough information about the training process
Error Message
/usr/local/lib/python3.7/dist-packages/tensorflow/python/keras/engine/data_adapter.py in _check_data_cardinality(data)
1527 label, ", ".join(str(i.shape[0]) for i in nest.flatten(single_data)))
1528 msg += "Make sure all arrays contain the same number of samples."
-> 1529 raise ValueError(msg)
ValueError: Data cardinality is ambiguous:
x sizes: 1
y sizes: 63586
Make sure all arrays contain the same number of samples.