import numpy as np
import tensorflow as tf
from tensorflow import keras

def prepare_data(batch_size):
    (X_train, y_train) = load_data(TRAIN_DIR)
    (X_test, y_test) = load_data(TEST_DIR)
    X_all = np.concatenate([X_train, X_test])
    y_all = np.concatenate([y_train, y_test])
    X_all = X_all.astype(np.float32) / 255
    X_all = X_all.reshape(-1, 28, 28, 1) * 2. - 1.  # scale pixels to [-1, 1]
    y_all = keras.utils.to_categorical(y_all, 10)

    dataset = tf.data.Dataset.from_tensor_slices((X_all, y_all))
    dataset = dataset.shuffle(1024)
    dataset = dataset.batch(batch_size, drop_remainder=True).prefetch(1)
    return dataset

This is the script that loads the files from the directories referenced by the TRAIN_DIR and TEST_DIR variables.
But when I call dataset = prepare_data(BATCH_SIZE), it fails with "too many values to unpack (expected 2)".
What am I doing wrong?

HoRn

1 Answer


Based on the comments, you have a function load_data like this:

def load_data(dir_path, img_size=(100,100)):
    """ Load resized images as np.arrays to workspace """
    X = []
    y = []
    i = 0
    label = dict()
    # ... (image-loading loop elided in the comments: fills X, y and label) ...
    X = np.array(X)
    y = np.array(y)
    print(f'{len(X)} images loaded from {dir_path} directory.')
    return X, y, label

which returns two NumPy arrays and one dictionary, i.e. three values in total.
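
That is exactly where "too many values to unpack (expected 2)" comes from: Python refuses to unpack a 3-tuple into two names. A minimal standalone sketch (the function returns_three below is made up for illustration, not part of your code):

def returns_three():
    return 1, 2, {'a': 0}  # three values, like load_data

a, b = returns_three()  # ValueError: too many values to unpack (expected 2)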

So I would change the beginning of the function prepare_data like so:

def prepare_data(batch_size):
    X_train, y_train, label_train = load_data(TRAIN_DIR)
    X_test, y_test, label_test = load_data(TEST_DIR)

to match the load_data return signature.
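
If you never use the label dictionaries inside prepare_data, the idiomatic way to discard them is the _ placeholder (same fix, just without the unused names):

def prepare_data(batch_size):
    X_train, y_train, _ = load_data(TRAIN_DIR)
    X_test, y_test, _ = load_data(TEST_DIR)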

Till