I am trying to use FFmpeg in Python as a subprocess to capture the screen and convert the output to numpy arrays through a pipe. This is for desktop sharing software. I have written two versions:
1st case: It doesn't work at all; only the FFmpeg command itself runs, and my script never gets any output.
Code:
import subprocess

cmd = "ffmpeg -f gdigrab -framerate 30 -i desktop -f h264 pipe:1"
pipe = subprocess.run(cmd,
                      stdout=subprocess.PIPE,
                      stderr=subprocess.PIPE,
                      bufsize=2)
print(pipe.stdout, pipe.stderr)
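As far as I understand, subprocess.run blocks until the child process exits, so with a long-running ffmpeg command it never returns and nothing after it executes. A minimal demonstration of that blocking behavior, using a throwaway child process instead of ffmpeg:

```python
import subprocess
import sys
import time

tic = time.perf_counter()
# run() waits for the child to exit before returning,
# so stdout only becomes available after the process is done.
result = subprocess.run(
    [sys.executable, "-c", "import time; time.sleep(1); print('done')"],
    stdout=subprocess.PIPE,
)
toc = time.perf_counter()
print(result.stdout)   # output arrives only after the child exits
print(toc - tic)       # roughly 1 second: run() blocked the whole time
```

For a stream that never ends, like screen capture, that means run() never returns at all, which is presumably why subprocess.Popen is needed instead.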
2nd case: It works, but I only get a numpy array every 3 to 4 seconds.
Code:
import subprocess as sp
import time

import cv2
import numpy as np

cmd = 'ffmpeg -f gdigrab -framerate 30 -i desktop -f h264 pipe:1'
size = 480 * 240 * 3
proc = sp.Popen(cmd, stdout=sp.PIPE)

while True:
    try:
        tic = time.perf_counter()
        frame = proc.stdout.read(size)
        #print(frame)
        if not frame:  # read() returns b'' at EOF, never None
            print('no input')
            break
        image = np.frombuffer(frame, dtype='uint8').reshape(240, 480, 3)
        toc = time.perf_counter()
        print(f"performed calc in {(toc - tic) * 1000:0.4f} milliseconds")
        cv2.imshow('received', image)
        cv2.waitKey(1)  # imshow needs a waitKey call to refresh the window
    except Exception as e:
        print(e)
        cv2.destroyAllWindows()
        proc.stdout.close()  # stdin was never piped, so close stdout instead
        proc.wait()
I think it is buffering the frames during those 3 seconds, but that would be terrible for a desktop sharing setup. I am trying to get the latency between capturing and streaming as low as possible.
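My current guess is that reading fixed-size chunks of a compressed h264 stream can't line up with frame boundaries, so each read only completes once enough compressed data has accumulated. A sketch of what I'm considering instead, asking FFmpeg for raw frames so that every fixed-size read is exactly one frame (the scale filter, bgr24 pixel format, and the read_frame helper are my assumptions, not tested):

```python
import subprocess

import numpy as np

# Hypothetical target resolution; gdigrab captures the full desktop,
# so I scale it down with a -vf filter (my assumption).
WIDTH, HEIGHT = 480, 240
FRAME_SIZE = WIDTH * HEIGHT * 3  # bgr24: 3 bytes per pixel

cmd = [
    "ffmpeg",
    "-f", "gdigrab",
    "-framerate", "30",
    "-i", "desktop",
    "-vf", f"scale={WIDTH}:{HEIGHT}",
    "-f", "rawvideo",     # uncompressed frames: fixed-size reads line up
    "-pix_fmt", "bgr24",  # matches OpenCV's channel order
    "pipe:1",
]

def read_frame(stdout):
    """Read exactly one raw frame from the pipe; returns None at EOF."""
    buf = b""
    while len(buf) < FRAME_SIZE:
        chunk = stdout.read(FRAME_SIZE - len(buf))
        if not chunk:  # pipe closed mid-frame or at EOF
            return None
        buf += chunk
    return np.frombuffer(buf, dtype=np.uint8).reshape(HEIGHT, WIDTH, 3)

# Intended usage (Windows only, ffmpeg must be on PATH):
# proc = subprocess.Popen(cmd, stdout=subprocess.PIPE)
# while (image := read_frame(proc.stdout)) is not None:
#     ...  # display / encode / send the frame
```

The loop in read_frame is there because a single read() on a pipe may return fewer bytes than requested, so it keeps reading until a whole frame is assembled.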
Any help would be greatly appreciated!