
I have a live camera video stream that I want to send from my FastAPI backend to my Angular frontend, where I want to show the livestream inside my HTML.

In my backend I capture the camera frames with cv2, encode them to bytes, and return the stream with StreamingResponse:


@app.get("/streamingtest")
async def streamingtest():
    return StreamingResponse(gen_frames())


async def gen_frames():
    cap = cv2.VideoCapture(0)

    while True:
        # for cap in caps:
        # # Capture frame-by-frame
        success, frame = cap.read()  # read the camera frame
        if not success:
            break
        else:
            ret, buffer = cv2.imencode('.jpg', frame)
            frame = buffer.tobytes()
            yield (b'--frame\r\n'b'Content-Type: image/jpeg\r\n\r\n' + frame + b'\r\n')
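To sanity-check the endpoint independently of Angular, a fetch-based probe along these lines should show whether any bytes arrive at all (the URL is the one from above; the chunk handling is purely illustrative):

// Sketch: read a few chunks from the endpoint and log their sizes.
async function probeStream(): Promise<void> {
  const response = await fetch('http://127.0.0.1:8000/streamingtest');
  if (!response.body) {
    console.error('No readable body on the response');
    return;
  }
  const reader = response.body.getReader();
  for (let i = 0; i < 5; i++) {
    const { value, done } = await reader.read();
    if (done || !value) {
      console.log('Stream ended');
      break;
    }
    console.log(`chunk ${i}: ${value.byteLength} bytes`);
  }
  // Stop pulling from the (endless) stream.
  await reader.cancel();
}

probeStream();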

I tried visualizing the stream in the HTML by calling the endpoint directly from an <img> tag:

<img src="http://127.0.0.1:8000/streamingtest" width="100%"> 

But that didn't work. I also tried making a request with Angular's HttpClient:

this.http.get(this.BASE_URL + '/streamingtest', {
      headers: new HttpHeaders({ 'Content-Type': 'application/octet-stream' })
    })

Here it seems like data is being transferred, but I never actually get to see it, because the request never finishes.
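In case it clarifies what I mean by "see the data": once I have the raw bytes of a single JPEG frame, I would expect to display it roughly like this (sketch only; frameBytes and imgElement are placeholder names):

// Sketch: show one JPEG frame in an <img> element via an object URL.
function showFrame(frameBytes: Uint8Array, imgElement: HTMLImageElement): void {
  const blob = new Blob([frameBytes], { type: 'image/jpeg' });
  const url = URL.createObjectURL(blob);
  const previous = imgElement.src;
  imgElement.src = url;
  // Release the previous object URL so repeated frames don't leak memory.
  if (previous.startsWith('blob:')) {
    URL.revokeObjectURL(previous);
  }
}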

How can I actually stream frames from my FastAPI backend to my Angular frontend and show each frame in my HTML?
