I have a small Python OpenCV project and would like to stream my processed frames over HTTP using ffmpeg.
For this I used the following sources: "Pipe and OpenCV to FFmpeg with audio streaming RTMP in Python" and https://github.com/kkroening/ffmpeg-python/blob/master/examples/README.md#stream-from-a-local-video-to-http-server.
To make things more readable I used the ffmpeg-python library, but as far as I understand it doesn't matter whether I open the pipe with subprocess myself or use the library.
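For comparison, this is roughly what I mean by the subprocess variant (an untested sketch; width and height would come from the OpenCV capture just like in the script below, and the options mirror the ffmpeg-python call):

import subprocess

# Rough subprocess equivalent of the ffmpeg-python pipeline below (untested sketch)
cmd = [
    "ffmpeg",
    "-f", "rawvideo", "-pix_fmt", "bgr24", "-s", "{}x{}".format(width, height),
    "-i", "pipe:",        # raw BGR frames arrive on stdin
    "-c:v", "libx264", "-pix_fmt", "yuv420p", "-preset", "ultrafast",
    "-listen", "1",       # ffmpeg itself acts as the HTTP server
    "-f", "flv",
    "http://localhost:8080",
]
process = subprocess.Popen(cmd, stdin=subprocess.PIPE)
# frames are then written with process.stdin.write(frame.tobytes())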
The problem I have is that I always get a broken pipe or "Connection refused" error when I open the stream with ffplay.
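I open the stream with the same URL the script serves on:

ffplay http://localhost:8080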
import ffmpeg
import cv2

video_format = "flv"
server_url = "http://localhost:8080"

# Capture from the second camera (index 1) and read its frame size
cap = cv2.VideoCapture(1)
width = int(cap.get(cv2.CAP_PROP_FRAME_WIDTH))
height = int(cap.get(cv2.CAP_PROP_FRAME_HEIGHT))

# ffmpeg reads raw BGR frames from stdin and serves them as FLV over HTTP
process = (
    ffmpeg
    .input('pipe:', format='rawvideo', codec='rawvideo', pix_fmt='bgr24', s='{}x{}'.format(width, height))
    .output(
        server_url,
        #codec = "copy", # use same codecs of the original video
        listen=1,  # enables HTTP server
        codec="libx264",
        pix_fmt="yuv420p",
        preset="ultrafast",
        f=video_format)
    .overwrite_output()
    .run_async(pipe_stdin=True)  # run ffmpeg in the background so we can write frames to its stdin
)

while True:
    ret, frame = cap.read()
    if not ret:
        break
    print("Sending frame")
    process.stdin.write(frame.tobytes())

# Clean up once the capture ends
cap.release()
process.stdin.close()
process.wait()
I also tried streaming with ffmpeg alone, using the FaceTime camera as input, and that works as I expected.
My operating system is macOS 12.3.
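For reference, that ffmpeg-only test looked roughly like this (from memory; the avfoundation device index may differ on other machines):

ffmpeg -f avfoundation -framerate 30 -i "0" -c:v libx264 -pix_fmt yuv420p -preset ultrafast -listen 1 -f flv http://localhost:8080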
Maybe someone knows how to fix this.
Thanks for your help
Chris