I am facing this issue while running a model trained on 10 positive and 10 negative training images.


MemoryError: Unable to allocate 137. MiB for an array with shape (3000, 4000, 3) and data type float32


My code is this:

import cv2
import numpy as np
import matplotlib.image as mpimg

# Define a single function that can extract features using HOG sub-sampling and make predictions.
# Assumes get_hog_features() and a trained classifier `svc` are defined earlier in the notebook.
def find_cars(img, ystart, ystop, scale, svc, orient,
              pix_per_cell, cell_per_block, show_all_rectangles=False):
    
    # array of rectangles where cars were detected
    rectangles = []
    
    # This creates a full-size float32 copy of the image: 3000 x 4000 x 3 x 4 bytes
    # is ~137 MiB, which is exactly the allocation named in the MemoryError above
    img = img.astype(np.float32)/255
    
    img_tosearch = img[ystart:ystop,:,:]

    # apply color conversion if other than 'RGB'
    ctrans_tosearch = cv2.cvtColor(img_tosearch, cv2.COLOR_RGB2YUV)
    
    # rescale image if other than 1.0 scale
    imshape = ctrans_tosearch.shape
    # np.int is deprecated (removed in NumPy 1.24); use the built-in int instead
    ctrans_tosearch = cv2.resize(ctrans_tosearch, (int(imshape[1]/scale), int(imshape[0]/scale)))
    
    ch1 = ctrans_tosearch[:,:,0]
    ch2 = ctrans_tosearch[:,:,1]
    ch3 = ctrans_tosearch[:,:,2]
    
    # Define blocks and steps as above
    nxblocks = (ch1.shape[1] // pix_per_cell) + 1
    nyblocks = (ch1.shape[0] // pix_per_cell) + 1
    
    nfeat_per_block = orient*(cell_per_block**2)
    
    # 64 was the original sampling rate, with 8 cells and 8 pix per cell
    window = 64
    nblocks_per_window = (window // pix_per_cell)-1 
    cells_per_step = 2  # Instead of overlap, define how many cells to step
    nxsteps = (nxblocks - nblocks_per_window) // cells_per_step
    nysteps = (nyblocks - nblocks_per_window) // cells_per_step
    
    # Compute individual channel HOG features for the entire image
    hog1 = get_hog_features(ch1, orient, pix_per_cell, cell_per_block, feature_vec=False)   
    hog2 = get_hog_features(ch2, orient, pix_per_cell, cell_per_block, feature_vec=False)
    hog3 = get_hog_features(ch3, orient, pix_per_cell, cell_per_block, feature_vec=False)
    
    for xb in range(nxsteps):
        for yb in range(nysteps):
            ypos = yb*cells_per_step
            xpos = xb*cells_per_step
            
            # Extract HOG for this patch
            hog_feat1 = hog1[ypos:ypos+nblocks_per_window, xpos:xpos+nblocks_per_window].ravel()
            hog_feat2 = hog2[ypos:ypos+nblocks_per_window, xpos:xpos+nblocks_per_window].ravel() 
            hog_feat3 = hog3[ypos:ypos+nblocks_per_window, xpos:xpos+nblocks_per_window].ravel() 
            hog_features = np.hstack((hog_feat1, hog_feat2, hog_feat3))
            xleft = xpos*pix_per_cell
            ytop = ypos*pix_per_cell
                                  
            test_prediction = svc.predict(hog_features.reshape(1,-1))
            
            if test_prediction == 1 or show_all_rectangles:
                xbox_left = int(xleft*scale)
                ytop_draw = int(ytop*scale)
                win_draw = int(window*scale)
                rectangles.append(((xbox_left, ytop_draw+ystart),(xbox_left+win_draw,ytop_draw+win_draw+ystart)))
                
    return rectangles

test_img = mpimg.imread('C:/Users/shrey/Desktop/Mod1-IITR/DJI_0314.JPG')

ystart = 20
ystop = 50
scale = 1
orient = 11
pix_per_cell = 16
cell_per_block = 2

rectangles = find_cars(test_img, ystart, ystop, scale, svc, orient, pix_per_cell, cell_per_block)

print(len(rectangles), 'rectangles found in image')

I am working on Windows, in a Jupyter notebook. How can I fix this issue in my code?

  • Where are you confused? The error message seems clear: you're trying to allocate 137 MiB of data, and you don't have that available. – Prune Aug 04 '20 at 22:29

1 Answer


The error is exactly what it says: there is not enough memory available to allocate the array.

However, chances are your system has far more than 137 MiB of memory available; Jupyter is simply limiting how much of the system's memory your code can use, because it has to reserve some memory before you start running your code. See here for how to increase the memory limit in Jupyter.
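As a first check, you can confirm how much memory the notebook kernel can actually see. A minimal sketch, assuming psutil is installed:

import psutil

# Report system memory as seen from inside the notebook kernel
mem = psutil.virtual_memory()
print(f"{mem.available / 1024**2:.0f} MiB available of {mem.total / 1024**2:.0f} MiB total")

If plenty of memory is free, the commonly suggested fix is to raise the notebook server's buffer limit in its config file. The 4 GiB value below is only an example, not a guaranteed setting for your machine:

# Run `jupyter notebook --generate-config` first if the config file does not exist,
# then add this line to ~/.jupyter/jupyter_notebook_config.py
# (on Windows: %USERPROFILE%\.jupyter\jupyter_notebook_config.py):
c.NotebookApp.max_buffer_size = 4 * 1024 * 1024 * 1024  # bytes; example value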

  • While this link may answer the question, it is better to include the essential parts of the answer here and provide the link for reference. Link-only answers can become invalid if the linked page changes. - [From Review](/review/low-quality-posts/26859200) – Simas Joneliunas Aug 05 '20 at 02:36
  • @SimasJoneliunas I don't really see the problem; the link is not the solution, it's merely speculating at the cause of the problem. It's not a "link-only answer". As far as we know, Jupyter is out of memory; we have no additional information nor an attempt to fix the problem. We can't fix issues that aren't described. It's likely that the linked question just happens to be the root cause. If it is, the entire question is a duplicate, and instead of voting to delete the answer, you should vote for the question to be closed. – xzkxyz Aug 05 '20 at 15:58