
I have a Python 3.6 program that uses the nibabel package to analyze medical images in NIfTI format.

import glob
import nibabel as nib
health = [nib.load(pt) for pt in glob.glob(healthdir+'*.nii')] # len = 200
health_data = [h.get_data() for h in health]

The last line raised OSError: [Errno 24] Too many open files. I used the following code and found that the error occurs at the last element.

health_data = []
for i in range(len(health)):
    try:
        health_data.append(health[i].get_data())
    except OSError:
        print(i) # 199

I have tried searching related topics such as Nibabel: IOError: [Errno 24] Too many open files:, but it didn't solve the problem. Also, I'd prefer not to use ulimit. Thanks!
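For context, the limit the error refers to is the per-process file-descriptor limit. On Unix it can be inspected, and the soft limit raised up to the hard limit, from inside Python via the standard-library `resource` module, without touching the shell's `ulimit`. A minimal sketch (the 4096 cap is just an example value):

```python
import resource

# Inspect the current per-process limit on open file descriptors.
soft, hard = resource.getrlimit(resource.RLIMIT_NOFILE)

# Raise the soft limit toward the hard limit (capped at 4096 here,
# purely as an example); raising the soft limit does not require root.
new_soft = 4096 if hard == resource.RLIM_INFINITY else min(4096, hard)
resource.setrlimit(resource.RLIMIT_NOFILE, (max(soft, new_soft), hard))
```

This only papers over the symptom, though; the answers below address why the file handles pile up in the first place.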

djfire

3 Answers


Not familiar with nibabel, but try:

    health_data = []
    for filepath in glob.glob(healthdir + '*.nii'):
        health = nib.load(filepath)
        health_data.append(health.get_data())
        # drop the reference so the image (and any file handle it holds)
        # can be released before the next iteration
        del health

*Not tested.*

webbyfox

You may need to delete the object after using it.

import math
import os

import matplotlib.pyplot as plt
import nibabel as nib
import numpy as np

def show_origin_image(name, s=100, max_limit=None, min_limit=None):
    origin = name
    file_name_list = [each for each in os.listdir(origin) if not each.startswith('.')]
    file_name_list = file_name_list[min_limit:max_limit]
    dimension = 2
    width_num = 6
    height_num = math.ceil(len(file_name_list) / width_num)
    plt.figure(figsize=(15, height_num * 2.8))
    data_list = []
    for n, each in enumerate(file_name_list, 1):
        # keep_file_open=False tells nibabel not to hold the file handle open
        agent = nib.load(os.path.join(origin, each), keep_file_open=False)
        three_d_data = np.asarray(agent.dataobj)
        image = np.take(three_d_data, s, dimension)
        plt.subplot(height_num, width_num, n)
        plt.imshow(image, 'gray')
        plt.axis('off')
        data_list.append(three_d_data)
        # delete the image object so its file handle can be released
        del agent
    return data_list
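The `del agent` above leans on CPython's reference counting: when the last reference to an open file disappears, the underlying descriptor is closed immediately. A stdlib-only sketch of that behavior, using a plain file as a stand-in for a loaded image:

```python
import os
import tempfile

# Create a scratch file to stand in for a NIfTI image on disk.
fd, path = tempfile.mkstemp(suffix='.nii')
os.close(fd)

f = open(path, 'rb')   # analogous to the handle nib.load may keep open
fileno = f.fileno()
del f                  # last reference gone -> CPython closes the descriptor

# The descriptor no longer counts against the open-file limit.
os.remove(path)
```

Note this is a CPython detail; on other interpreters the close may be delayed until garbage collection, which is why an explicit close (or avoiding open handles entirely) is the safer habit.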
Weiziyoung
  • Got the same problem, and deleting the object does not help. However, my loop stopped at image 4005 (out of 10,000), so neither the last nor the 200th element. But that might come from the powerful server I am working on – Pibe_chorro Sep 06 '21 at 12:29

I had the same problem importing a number of self-generated NIfTI images.
Using nilearn instead of nibabel solved the problem for me.

from nilearn.image import smooth_img
import glob

image_paths = glob.glob(some_path + '*.nii')
images = smooth_img(image_paths, fwhm=None)  # fwhm=None loads without smoothing
image_maps = []
for img in images:
    img_data = img.get_fdata()
    image_maps.append(img_data)
    del img_data  # drop the loop-local reference

This worked with 10,000 images for me and took around 12 minutes.
smooth_img reads in the NIfTI and applies a smoothing kernel of size fwhm (full width at half maximum). I did this because it works and I need the smoothing in a different situation in the script. You can also check out nilearn.image.load_img; it should do the same.

Best

Pibe_chorro