
I am making a web app to control a DJI Ryze Tello drone. The drone has a camera that sends a live MP4 video feed to my backend. I receive 30 frames per second in base64 on my backend, and I am using a WebSocket to send the base64 data to the frontend. The video format is MP4.

How could I display this video feed live, frame by frame, on my frontend?

Some info:

How I receive the data:

function initVideoFeed(socket) {
  // videoFeed is the stream coming from the drone (set up elsewhere)
  videoFeed.on('message', (frame) => {
    socket.emit('videoFrame', frame) // sends the base64 data to my frontend
  })
}
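For completeness, the receiving side of this relay could look like the minimal sketch below. It assumes a socket.io client and that frames arrive as base64 strings; `initFrontendFeed` and `base64ToBytes` are hypothetical names, and the actual decoding step depends on the codec.

```javascript
// Convert a base64-encoded frame back into raw bytes on the frontend.
// Assumption (hypothetical helper): the backend emits base64 strings.
function base64ToBytes(b64) {
  const bin = atob(b64)                 // base64 -> binary string
  const bytes = new Uint8Array(bin.length)
  for (let i = 0; i < bin.length; i++) bytes[i] = bin.charCodeAt(i)
  return bytes
}

// Hypothetical wiring: `socket` is a socket.io client instance, and
// `onFrame` receives each frame as a Uint8Array for decoding/display.
function initFrontendFeed(socket, onFrame) {
  socket.on('videoFrame', (b64) => onFrame(base64ToBytes(b64)))
}
```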

This is how I receive the frames on my backend: "<Buffer 65 15 dd 07 89 ab e0 ab 46 93 33 5e 06 0f 25 7d 0a f1 85 4d 8c a7 aa be 9c d8 3e 6a af 60 cc 6a b9 be 82 b2 71 55 63 b2 2a 0b 82 23 40 5b 79 0f 40 58 ... 1410 more bytes>"

I have searched online for hours, but I could only find information on how to send video files to the frontend. I, however, do not have files but a continuous feed of frames. That is why I resorted to my last option, which is to ask on Stack Overflow.

I would greatly appreciate any information about my problem!

Thank you very much!

Edit: I will post the code on GitHub once I succeed.

  • If you're able to transmit this data to your frontend in real time, then what about using Media Source Extensions (MSE) to play this MP4 data as it's received? – Louis Jul 30 '23 at 00:49
  • @louis Never heard of it, I'll look into it. Thanks! – Lars Verschoor Jul 30 '23 at 01:15
  • Did you figure out the solution yourself? Was my comment helpful? If you need more detail, I can share my example code as a template. – Louis Jul 31 '23 at 01:53
  • @LarsVerschoor **(1)** From research it seems the video format is actually H264. **(2)** If using the Chrome browser is an option for you, then you can try using its WebCodecs API to decode the frames. There are examples if you need some. **(3)** You could also use MSE, as suggested above, but since you've got raw H264 you must first "mux" the frames into a fragmented-MP4 format. You must be comfortable working with byte arrays. – VC.One Aug 02 '23 at 10:35
  • @LarsVerschoor PS: With the WebCodecs API you don't need to do any muxing. It will decode your shown H264 frames into a picture. But... if you need to support other browsers (not just Chrome), then you must do the muxing part first and then display using the MSE method. – VC.One Aug 02 '23 at 10:35
  • @louis Sorry, I am still trying to make this work. I have now figured out that what I receive from the drone is compressed into H264. I receive it in multiple NAL units. I have now figured out how to turn those NAL units into raw video frames, and am now trying to render those on the frontend with the canvas drawImage method. – Lars Verschoor Aug 02 '23 at 20:37
  • @VC.One Thank you! I'll do some research on WebCodecs for sure! – Lars Verschoor Aug 02 '23 at 20:40
  • I thought I had raw video frames. This is an example of one frame: . It says 16620 more bytes, which seems a bit small for a 1280 × 720 video. – Lars Verschoor Aug 02 '23 at 20:56
  • @LarsVerschoor A frame starting with byte **0x41** after the start code is a **P-frame**. These types of frames can be small because they only hold the minor changes from a previous **key-frame** (_eg:_ starts with byte **0x65**). Don't worry, as long as it displays fine when sent after some keyframe data which was also given to the decoder. – VC.One Aug 05 '23 at 17:11
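Putting the comments above together, a minimal sketch of the NAL-unit handling they describe might look like this. It assumes the Tello sends Annex-B framed H.264 (NAL units separated by `00 00 01` / `00 00 00 01` start codes, as the byte dumps starting with `0x65` and `0x41` suggest); `splitNalUnits` and `isKeyFrame` are hypothetical helper names.

```javascript
// Split an Annex-B H.264 byte stream on 00 00 01 start codes.
// (Sketch only: assumes NAL payloads do not themselves end in 0x00.)
function splitNalUnits(bytes) {
  const starts = []
  for (let i = 0; i + 2 < bytes.length; i++) {
    if (bytes[i] === 0 && bytes[i + 1] === 0 && bytes[i + 2] === 1) {
      starts.push(i + 3) // NAL payload begins right after the start code
      i += 2
    }
  }
  const units = []
  for (let s = 0; s < starts.length; s++) {
    let end = s + 1 < starts.length ? starts[s + 1] - 3 : bytes.length
    // a 4-byte start code (00 00 00 01) leaves one extra 0x00 behind
    if (end > starts[s] && bytes[end - 1] === 0) end--
    units.push(bytes.subarray(starts[s], end))
  }
  return units
}

// nal_unit_type is the low 5 bits of the first NAL byte:
// 5 = IDR (key) frame, e.g. 0x65; 1 = non-IDR slice, e.g. 0x41.
function isKeyFrame(nal) {
  return (nal[0] & 0x1f) === 5
}
```

From there, each unit could be wrapped in an `EncodedVideoChunk` (with `type: isKeyFrame(nal) ? 'key' : 'delta'`) and fed to a WebCodecs `VideoDecoder`, whose output `VideoFrame`s can be drawn with the canvas `drawImage` method mentioned above.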

0 Answers