
I have a Socket.IO server that sends my frontend a base64-encoded image every 20 ms. The problem is that the frontend can't keep up with such a high rate.

At the moment I use an `<img>` element and change its `src` attribute:

// this piece of code is called every 20ms
socket.on("frame2client", (data) => {
    receivedCalls += 1;
    setDelay(`${((timer / receivedCalls) * 1000).toFixed(3)} ms`);

    if (!data) {
        return;
    }

    serverVideo.style.width = CONSTRAINTS.video.width;
    serverVideo.style.height = CONSTRAINTS.video.height;

    serverVideo.src = data;
});

It works when I lower it to 10 fps (a re-render every 100 ms), but that's not fast enough to look smooth. It also depends on the size of the image: if I try 500x500 instead of 250x250, 10 fps doesn't work anymore and I have to drop to 5 fps.
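Part of the cost here is that every `src` swap makes the browser parse a base64 `data:` URI (roughly 33% larger than the raw bytes) and then decode a full image. A sketch of one mitigation, assuming the server could be changed to emit raw JPEG bytes (`ArrayBuffer`) instead of base64 (a hypothetical change, not the current setup): paint binary frames onto a `<canvas>` via `createImageBitmap`.

```javascript
// Sketch: render binary frames to a <canvas> instead of swapping img.src.
// Assumes the server emits raw JPEG bytes on "frame2client" rather than a
// base64 data: URI -- an assumption, not the OP's current setup.

// Base64 inflates payloads: every 3 bytes become 4 characters. This helper
// recovers the decoded byte size of a data: URI's payload.
function base64DecodedSize(dataUrl) {
  const b64 = dataUrl.slice(dataUrl.indexOf(",") + 1);
  const padding = (b64.match(/=+$/) || [""])[0].length;
  return (b64.length * 3) / 4 - padding;
}

// Browser-only part, guarded so the file also loads outside a browser.
if (typeof document !== "undefined") {
  const canvas = document.getElementById("serverVideo");
  const ctx = canvas.getContext("2d");

  socket.on("frame2client", async (buffer) => {
    if (!buffer) return;
    // Decode the JPEG bytes (off the main thread where supported), draw once.
    const bitmap = await createImageBitmap(
      new Blob([buffer], { type: "image/jpeg" })
    );
    ctx.drawImage(bitmap, 0, 0, canvas.width, canvas.height);
    bitmap.close();
  });
}
```

This avoids the base64 round trip entirely, though as the comments below argue, per-frame stills remain far less efficient than a real video stream.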

EDIT: The way it works is that I get the user's webcam, capture a frame, and send it to my server over the socket; the server processes the image through OpenCV and returns the resulting frame to the client as base64.
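For context, the capture-and-send half of that pipeline might look like the sketch below. The `frame2server` event name and the reuse of `CONSTRAINTS` and `socket` are assumptions; only the question's `frame2client` side was shown.

```javascript
// Sketch of the described pipeline: grab a webcam frame on a timer,
// base64-encode it, and emit it over the socket. "frame2server",
// CONSTRAINTS, and socket are assumed names.

// Milliseconds between frames for a target frame rate.
function frameIntervalMs(fps) {
  return 1000 / fps;
}

// Browser-only part, guarded so the file also loads outside a browser.
if (typeof document !== "undefined") {
  navigator.mediaDevices
    .getUserMedia({ video: CONSTRAINTS.video })
    .then((stream) => {
      const video = document.createElement("video");
      video.srcObject = stream;
      video.play();

      const canvas = document.createElement("canvas");
      canvas.width = CONSTRAINTS.video.width;
      canvas.height = CONSTRAINTS.video.height;
      const ctx = canvas.getContext("2d");

      setInterval(() => {
        ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
        // toDataURL re-encodes and base64-wraps every single frame --
        // exactly the per-frame cost the comments object to.
        socket.emit("frame2server", canvas.toDataURL("image/jpeg", 0.7));
      }, frameIntervalMs(20)); // 20 fps => one frame every 50 ms
    });
}
```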

Thanks

  • Radically updating the screen every 20ms will give people seizures... _why are you doing this?_ - if you're trying to stream a video, this is not how you're meant to do it... Why aren't you doing it properly? https://developer.mozilla.org/en-US/docs/Web/Guide/Audio_and_video_delivery/Live_streaming_web_audio_and_video – Dai Jul 08 '22 at 17:06
  • Also, I'm _really tired_ of this trend where everyone is using `data:` URIs with Base64-encoding for images or even video files - why is everyone afraid of raw binary content all of a sudden? – Dai Jul 08 '22 at 17:07
  • I made some edits so it's clearer. – Bonsai Noodle Jul 08 '22 at 17:09
  • 1
    Have you considered using WebRTC? That's the preferred way to work with (and exchange) audio and video data from the user's own cameras etc. – Dai Jul 08 '22 at 17:09
  • I don't know what it is. Is it made for my use case (see the edits in the question)? – Bonsai Noodle Jul 08 '22 at 17:10
  • @Dai Can you embed raw binary content into a webpage without having to call an external URL? – Barry Carter Jul 08 '22 at 17:12
  • No, I have to ask the server for the frame I want to display – Bonsai Noodle Jul 08 '22 at 17:13
  • @barrycarter Ideally we'd be using `multipart/*` HTTP responses for that... [unfortunately browsers never really supported it](https://stackoverflow.com/questions/1806228/browser-support-of-multipart-responses) - however my point was more about handling images and videos in JavaScript and how so many people now take the easy-way-out with `toDataURI` instead of doing things properly with `toBlob`. I am perfectly fine with using _short_ (i.e. sub ~2KB) `data:` URIs in stylesheets and HTML though, but that's not what the OP is doing. – Dai Jul 08 '22 at 17:30
  • 1
    Don’t attempt to stream a video by sending it frame by frame through a websocket. Look into setting up a media capture recorder, streaming that to the server, running that through openCV, and streaming it back to the client. – skara9 Jul 08 '22 at 17:35
  • 2
    You can't show video by sending base64-encoded frames one at a time. That's ridiculously inefficient. If you want the client to see the equivalent of smooth video, then stream video to them directly, since video streams are optimized for what you're doing. Sending base64-encoded stills one at a time is about the worst possible way to attempt to send video. – jfriend00 Jul 08 '22 at 17:44
  • The problem is that I want multiple people to be on the website at the same time and each see only their own camera. Is it possible using what you call "media capture recorder" @Dai? Also, it needs to be real time, so when they go on the page they directly see their face. Talking about sockets, it perfectly handles 20 calls a second; the only problem is the HTML image not rendering fast enough... – Bonsai Noodle Jul 08 '22 at 18:00
  • 1
    @BonsaiNoodle _"it needs to be real time"_ - here's a hint: the "RTC" in _WebRTC_ stands for "**Real-time** Communications" so that should be enough to convince you of the right path forward - and I recommend you follow some WebRTC tutorials first before adapting your current project. – Dai Jul 08 '22 at 18:02
  • I will try! Isn't it compatible with JS frameworks like react for frontend and flask for server ? Anyways thanks!! – Bonsai Noodle Jul 08 '22 at 18:07
  • @Dai I checked on Google and it looks like WebRTC is used to stream video between people (like calls), but I want to retrieve the video from a Flask server. Here is what I found: https://stackoverflow.com/questions/58931854/how-to-stream-live-video-frames-from-client-to-flask-server-and-back-to-the-clie this guy did the same thing as I did – Bonsai Noodle Jul 08 '22 at 18:23
  • @BonsaiNoodle This is what you want: https://stackoverflow.com/questions/63549278/opencv-python-modeling-server-with-webrtc - they're also using Python, but they're using a Python WebRTC library (`from vidgear.gears.asyncio import WebGear_RTC`) to stream video in and out of Python code and OpenCV without resorting to Base64 encoding or other horrible hacks no-one should be using. – Dai Jul 08 '22 at 18:26
  • @Dai I just checked it, but they use a pre-recorded video... I want to stream each user's webcam. You go to the website and you see your face with some OpenCV stuff, and if I go on the website at the same time, I see my face with OpenCV stuff. The problem remains: how to send the user's webcam to the server so that it can run through the vidgear module, if not frame by frame... – Bonsai Noodle Jul 08 '22 at 19:56

1 Answer


@Dai's comment is the answer! I went with WebRTC to transfer the video stream from the client to the server and back to the client.
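In outline, the client side of such a setup can look like the sketch below: send the webcam track to the server over an `RTCPeerConnection` and play whatever processed track comes back. The `/offer` signaling endpoint is an assumption (it mirrors the aiortc-style Python servers linked in the comments; signaling is not standardized).

```javascript
// Sketch of the WebRTC approach the answer settles on. The "/offer" HTTP
// signaling endpoint is an assumed convention, not part of the WebRTC spec.

// Small pure helper: serialize a local SDP description for the signaling body.
function offerPayload(description) {
  return JSON.stringify({ sdp: description.sdp, type: description.type });
}

async function streamThroughServer(videoElement) {
  const pc = new RTCPeerConnection();

  // Play whatever processed track the server sends back.
  pc.ontrack = (event) => {
    videoElement.srcObject = event.streams[0];
  };

  // Send the local webcam to the server.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));

  // Standard offer/answer exchange, here carried over plain HTTP.
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  const response = await fetch("/offer", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: offerPayload(pc.localDescription),
  });
  await pc.setRemoteDescription(await response.json());
  return pc;
}
```

Because each browser opens its own peer connection, every visitor sees only their own processed camera, which addresses the multi-user concern raised in the comments.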