I'm trying to feed images of shape (350, 350, 3) as the input
and train the network to output a (1400, 1400, 3)
image (a 4x upscale).
My dataset consists of 8 images of shape (1400, 1400, 3),
which I flip and mirror to get a total of 32 full-resolution images; these are my targets (validateData).
Then I scale those 32 images down to (350, 350, 3)
to obtain the input images (trainingData), so each input is paired with its full-resolution counterpart. Roughly, something like the sketch below.
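A minimal sketch of what I mean (the sourceImages name, the exact flips, and the naive stride-based downscale are illustrative, not my actual code):

import numpy as np

targets = []
for img in sourceImages:  # 8 arrays, each of shape (1400, 1400, 3)
    # original + horizontal flip + vertical flip + both = 4 variants per image
    targets.append(img)
    targets.append(np.fliplr(img))
    targets.append(np.flipud(img))
    targets.append(np.flipud(np.fliplr(img)))
# len(targets) == 32

# naive 4x downscale by striding: (1400, 1400, 3) -> (350, 350, 3)
inputs = [t[::4, ::4, :] for t in targets]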
print(type(validateData))
print(validateData.shape)
print(type(validateData[0].shape))
print(validateData[0].shape)
returns
<class 'numpy.ndarray'>
(32,)
<class 'tuple'>
(1400, 1400, 3)
And, similarly:
print(type(trainingData)) # <class 'numpy.ndarray'>
print(trainingData.shape) # (32,)
print(type(trainingData[0].shape)) # <class 'tuple'>
print(trainingData[0].shape) # (350, 350, 3)
So when I do
model.fit(trainingData,
          validateData,
          epochs=5,
          verbose=2,
          batch_size=4)  # 32 images -> 8 batches of 4
What exactly am I supposed to feed as the first two parameters of the .fit
function?
As it is, I am getting this error:
ValueError: Error when checking input: expected input_1 to have 4 dimensions, but got array with shape (32, 1)
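I suspect that (32, 1) comes from both variables being 1-D object arrays holding 32 separate 3-D images, whereas (as far as I can tell) Keras wants a single 4-D array with the sample axis first. Stacking should produce that; a sketch, assuming every element really has the same shape:

import numpy as np

x = np.stack(trainingData)  # (32,) of (350, 350, 3)  -> (32, 350, 350, 3)
y = np.stack(validateData)  # (32,) of (1400, 1400, 3) -> (32, 1400, 1400, 3)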
Here is my full code if you want to look into it.
The Keras API docs aren't very explicit about how the data should be formatted:
fit
fit(x=None, y=None, batch_size=None, epochs=1, verbose=1, callbacks=None, validation_split=0.0, validation_data=None, shuffle=True, class_weight=None, sample_weight=None, initial_epoch=0, steps_per_epoch=None, validation_steps=None)
Trains the model for a given number of epochs (iterations on a dataset).
Arguments
x: Numpy array of training data (if the model has a single input), or list of Numpy arrays (if the model has multiple inputs). If input layers in the model are named, you can also pass a dictionary mapping input names to Numpy arrays. x can be None (default) if feeding from framework-native tensors (e.g. TensorFlow data tensors).
y: Numpy array of target (label) data (if the model has a single output), or list of Numpy arrays (if the model has multiple outputs). If output layers in the model are named, you can also pass a dictionary mapping output names to Numpy arrays. y can be None (default) if feeding from framework-native tensors (e.g. TensorFlow data tensors).
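My reading of that is that a single-input, single-output model takes exactly one NumPy array per argument, with the sample axis first. A dummy-data sanity check under that assumption (the shapes are my reading, not from the docs):

import numpy as np

x = np.zeros((32, 350, 350, 3), dtype='float32')    # dummy inputs of the right shape
y = np.zeros((32, 1400, 1400, 3), dtype='float32')  # dummy targets of the right shape
model.fit(x, y, epochs=5, verbose=2, batch_size=4)  # should at least clear the shape check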
Here is the full code of another implementation I've tried. This time, I changed the parameters to be Python lists of NumPy arrays (each image being a 3-D array). I now get this error:
ValueError: Error when checking model input: the list of Numpy arrays that you are passing to your model is not the size the model expected. Expected to see 1 array(s), but instead got the following list of 32 arrays: [array([[[0.6774938 , 0.64219969, 0.60690557],
[0.67257049, 0.63743775, 0.60206295],
[0.67203473, 0.6418085 , 0.60398018],
...,
[0.55292714, 0.5253832 , 0.46217287],
...
It's hard to tell whether I'm getting closer or further away.
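For what it's worth, my current understanding is that this second attempt fails for the same underlying reason: fit sees 32 separate 3-D arrays where it expects one 4-D array, so collapsing the list first should address both errors. A sketch (trainingList is my placeholder name for the list of 32 images, all assumed to share a shape):

import numpy as np

x = np.asarray(trainingList)  # 32 arrays of (350, 350, 3) -> (32, 350, 350, 3)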