
I want to load 2255 image files into a NumPy array, but it stops at 100 image files. Do Colab or keras.preprocessing.image's load_img / img_to_array have a limit on the number of files that can be loaded into a NumPy array?

My code is as follows:

import os, sys
from IPython.display import display
from IPython.display import Image as _Imgdis
from PIL import Image
import numpy as np
from scipy import ndimage
from keras.preprocessing.image import img_to_array, load_img

folder = "/content/drive/MyDrive/PROJECTS/LEAF/Data/TRAINING/images"

onlyfiles = os.listdir(folder)
print("Working with {0} images".format(len(onlyfiles)))
print("Image examples: ")

for i in range(len(onlyfiles)):
    img = load_img(folder + "/" + onlyfiles[i])
    x = img_to_array(img)

print(len(x))

The code executes and the result is:

Working with 2255 images
Image examples:
100

I expect len(x) = 2255 because the loop is set to run 2255 times (the number of files in the folder). Does Colab limit load_img, img_to_array, or the loop? Please help.
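
For context, here is a minimal sketch of the pattern I was aiming for: accumulate each image array in a list, then stack the list into one NumPy array at the end. This assumes every image in the folder has the same dimensions; the folder path and filename handling are the same as in my code above.

```python
import os
import numpy as np
from keras.preprocessing.image import img_to_array, load_img

folder = "/content/drive/MyDrive/PROJECTS/LEAF/Data/TRAINING/images"
onlyfiles = os.listdir(folder)

images = []  # one array per image file
for name in onlyfiles:
    img = load_img(os.path.join(folder, name))
    images.append(img_to_array(img))

# Stack into a single (num_images, height, width, channels) array.
# np.stack requires every image to have the same height and width.
data = np.stack(images)
print(data.shape)  # expected: (2255, H, W, 3)
```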

Thanks,

Joe
  • I think the biggest limitation will be the memory Colab provides... Remember you can use iterators instead of loading everything into memory at once (see the sketch after these comments). – Celius Stingher Dec 10 '20 at 16:33
  • If you really want to load everything into memory at once, you can do some math on the data size in memory as it is done [here with lists](https://stackoverflow.com/questions/7247298/size-of-list-in-memory), knowing that Google Colab provides you with 12 GB of RAM. – ans Mar 24 '21 at 08:11
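
Following up on both comments, here is a rough sketch combining the two ideas: estimating the in-memory size before loading everything, and using a generator so only one image is held in memory at a time. The target size (224×224) and float32 element size are illustrative assumptions, not values from the question.

```python
import os
import numpy as np
from keras.preprocessing.image import img_to_array, load_img

folder = "/content/drive/MyDrive/PROJECTS/LEAF/Data/TRAINING/images"
onlyfiles = os.listdir(folder)

# Back-of-the-envelope memory estimate: 2255 images resized to
# 224x224x3 as float32 take 2255 * 224 * 224 * 3 * 4 bytes (~1.4 GB),
# comfortably under Colab's ~12 GB of RAM. Full-resolution photos
# can be much larger, so it is worth checking before stacking.
est_bytes = len(onlyfiles) * 224 * 224 * 3 * 4
print(f"Estimated size: {est_bytes / 1e9:.2f} GB")

def image_arrays(folder, filenames, target_size=(224, 224)):
    """Yield one image array at a time instead of loading all at once."""
    for name in filenames:
        img = load_img(os.path.join(folder, name), target_size=target_size)
        yield img_to_array(img)

# Consume the generator one image at a time; memory use stays flat.
for x in image_arrays(folder, onlyfiles):
    pass  # process x here, shape (224, 224, 3)
```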

0 Answers