import subprocess
import numpy as np
from matplotlib import image

frame_count = 1200
frame_dur = 12

def color_array_ext(folder_path, window):
    c_array = np.zeros((1, 3))
    for frame in range(frame_count):
        # sample only the middle frame of each 12-frame color block
        if (frame + .5 * frame_dur) % frame_dur == 0:
            cmd = 'ffmpeg -i {} -ss {} -vframes 1 C:\\Users\\Owner\\Color_Sequence\\test{}.png'.format(
                folder_path, .5 * frame / frame_dur, frame)
            subprocess.call(cmd.split())
            sample = image.imread('C:\\Users\\Owner\\Color_Sequence\\test%s.png' % frame)
            h_center = sample.shape[0] // 2  # integer division so the slice indices are ints
            w_center = sample.shape[1] // 2
            frame_window = sample[h_center - window:h_center + window,
                                  w_center - window:w_center + window]
            transformed_frame_window = frame_window.reshape(-1, 3)
            window_avg = np.mean(transformed_frame_window, axis=0)
            c_array = np.concatenate((c_array, np.array([window_avg])), axis=0)
    return c_array[1:]  # drop the zero seed row

I have a QuickTime video that just cycles through colors every 12 frames for 1200 frames. I'm trying to create a script to extract an average RGB value of the center of the frame for every color in the video and store it in a NumPy array.

I wanted to avoid storing an entire image sequence of PNGs and then iterating through them to get the RGB values, so the current code only writes to disk the frames it needs. Still, I'm sure there's a better way to do this where execution doesn't take so long: the process currently takes about 10 minutes for 125 samples on my machine, which is unacceptably slow. How can I speed this up?

Karl Knechtel
    You seem to be starting a new `ffmpeg` process for every frame. You need to do it the other way around. Start `ffmpeg` once and pipe the frames into **OpenCV** as BGR888 then read it a frame at a time, discarding ones you don't want... https://stackoverflow.com/a/64460216/2836621 – Mark Setchell Mar 24 '21 at 22:21
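Mark Setchell's suggestion can be sketched as follows: start a single `ffmpeg` process that decodes the whole video to raw RGB frames on stdout, then read the pipe one frame at a time in NumPy, discarding the frames you don't need. This is only a sketch under assumptions not stated in the question: the frame size (`width`/`height` default to 1920x1080 here) must match the actual video, and `color_array_pipe` / `center_average` are hypothetical names.

```python
import subprocess
import numpy as np

def center_average(img, window):
    # Mean RGB over a (2*window x 2*window) patch at the image center.
    h, w = img.shape[:2]
    patch = img[h // 2 - window:h // 2 + window,
                w // 2 - window:w // 2 + window]
    return patch.reshape(-1, 3).mean(axis=0)

def color_array_pipe(video_path, window, width=1920, height=1080,
                     frame_count=1200, frame_dur=12):
    # One ffmpeg process decodes the whole video to raw RGB on stdout;
    # no PNGs ever touch the disk.
    cmd = ['ffmpeg', '-i', video_path,
           '-f', 'rawvideo', '-pix_fmt', 'rgb24', 'pipe:1']
    proc = subprocess.Popen(cmd, stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL)
    frame_bytes = width * height * 3
    averages = []
    for frame in range(frame_count):
        raw = proc.stdout.read(frame_bytes)
        if len(raw) < frame_bytes:
            break  # stream ended early
        # Keep only the middle frame of each color block,
        # the same selection rule as the question's code.
        if (frame + frame_dur // 2) % frame_dur != 0:
            continue
        img = np.frombuffer(raw, np.uint8).reshape(height, width, 3)
        averages.append(center_average(img, window))
    proc.stdout.close()
    proc.wait()
    return np.array(averages)
```

Because the decoder runs once and every frame arrives pre-decoded over the pipe, this avoids both the per-frame process startup and the PNG encode/decode round trip.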
  • Try using `-preset ultrafast`; maybe it will help you. – Vivek Thummar Mar 26 '21 at 04:43

0 Answers