In `keras/keras/engine/training.py`:

```python
def standardize_input_data(data, names, shapes=None,
                           check_batch_dim=True,
                           exception_prefix=''):
    ...
    # check shapes compatibility
    if shapes:
        for i in range(len(names)):
            ...
            for j, (dim, ref_dim) in enumerate(zip(array.shape, shapes[i])):
                if not j and not check_batch_dim:
                    # skip the first axis
                    continue
                if ref_dim:
                    if ref_dim != dim:
                        raise Exception('Error when checking ' + exception_prefix +
                                        ': expected ' + names[i] +
                                        ' to have shape ' + str(shapes[i]) +
                                        ' but got array with shape ' +
                                        str(array.shape))
```
Comparing that with the error:

```
Error when checking : expected input_1 to have shape (None, 192) but got array with shape (192, 1)
```

So it is comparing `(None, 192)` with `(192, 1)`, skipping the 1st axis (the `None` batch dimension); that is, it compares `192` against `1` on the 2nd axis. If `array` had shape `(n, 192)`, it would probably pass.
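A minimal standalone sketch of that loop shows both cases (the shapes come from the error message above; the loop body is a paraphrase of the quoted source, with the two `if` tests collapsed into one):

```python
import numpy as np

def check(array, expected):
    """Paraphrase of the shape check above; a ref_dim of None means 'any size'."""
    for j, (dim, ref_dim) in enumerate(zip(array.shape, expected)):
        if ref_dim and ref_dim != dim:
            return 'mismatch at axis %d: expected %d, got %d' % (j, ref_dim, dim)
    return 'ok'

print(check(np.zeros((192, 1)), (None, 192)))  # mismatch at axis 1: expected 192, got 1
print(check(np.zeros((5, 192)), (None, 192)))  # ok - any batch size n passes
```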
So basically, whatever is generating the `(192, 1)` shape, as opposed to `(1, 192)` or a broadcastable `(192,)`, is causing the error.
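If those 192 values really are one sample's features, any of the following NumPy reshapes would produce a shape that passes the check (a sketch; `x` stands for whatever array is being fed in, and whether a reshape is the *right* fix depends on what the data actually means):

```python
import numpy as np

x = np.arange(192.0).reshape(192, 1)   # the problematic (192, 1) array

print(x.reshape(1, 192).shape)   # (1, 192) - one sample with 192 features
print(x.T.shape)                 # (1, 192) - same thing via transpose
print(x.ravel().shape)           # (192,)   - broadcastable 1-D version
```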
I'm adding `keras` to the tags on the guess that this is the problem module. Searching other `keras`-tagged SO questions:
- Exception: Error when checking model target: expected dense_3 to have shape (None, 1000) but got array with shape (32, 2)
- Error: Error when checking model input: expected dense_input_6 to have shape (None, 784) but got array with shape (784L, 1L)
- Dimensions not matching in keras LSTM model
- Getting shape dimension errors with a simple regression using Keras
- Deep autoencoder in Keras converting one dimension to another i
I don't know enough about `keras` to understand the answers, but there's more to it than simply reshaping your input array.