Is it possible in Keras to load only one batch into memory at a time? I have a 40 GB dataset of images.
If the dataset were small I could use ImageDataGenerator to generate batches, but because the dataset is large I can't load all the images into memory.
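For context, this is roughly what I do when the whole dataset fits in memory (x_train, y_train, the model, and all parameters here are just placeholders):

import numpy as np
from keras.preprocessing.image import ImageDataGenerator

# x_train / y_train are full numpy arrays, i.e. the entire dataset in memory
# `model` is a compiled Keras model defined elsewhere
datagen = ImageDataGenerator(rescale=1. / 255)
batches = datagen.flow(x_train, y_train, batch_size=32)
model.fit_generator(batches, steps_per_epoch=len(x_train) // 32, epochs=10)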
Is there any way in Keras to do something similar to the following TensorFlow code:
import tensorflow as tf
# input_paths is a list of image file name strings
reader = tf.WholeFileReader()  # reads raw file contents from the queue
path_queue = tf.train.string_input_producer(input_paths, shuffle=False)
paths, contents = reader.read(path_queue)
inputs = decode(contents)  # decode: e.g. tf.image.decode_jpeg plus preprocessing
input_batch = tf.train.batch([inputs], batch_size=2)
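The batches are then consumed in a session loop, something like this simplified sketch (num_steps and the training step are placeholders):

with tf.Session() as sess:
    sess.run(tf.global_variables_initializer())  # init model variables, if any
    coord = tf.train.Coordinator()
    threads = tf.train.start_queue_runners(sess=sess, coord=coord)
    for step in range(num_steps):
        batch = sess.run(input_batch)  # only one batch is materialized at a time
        # ... run the training step on `batch` ...
    coord.request_stop()
    coord.join(threads)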
I am using this method to serialize inputs in TensorFlow, but I don't know how to achieve the same thing in Keras.
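Is a hand-written Python generator like the following sketch (paths, labels, image size, and batch size are all made up) what I should pass to model.fit_generator, or does Keras have something closer to the queue mechanism above?

import numpy as np
from keras.preprocessing.image import load_img, img_to_array

def batch_generator(paths, labels, batch_size=32):
    """Yield batches read from disk, so only one batch lives in memory."""
    while True:
        for start in range(0, len(paths), batch_size):
            batch_paths = paths[start:start + batch_size]
            images = np.stack([img_to_array(load_img(p, target_size=(256, 256)))
                               for p in batch_paths]) / 255.0
            yield images, np.array(labels[start:start + batch_size])

# model.fit_generator(batch_generator(input_paths, labels),
#                     steps_per_epoch=len(input_paths) // 32, epochs=10)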