
I'm looking for a way to stream the video of a Unity Camera to an Android device, so that it can be shown on screen.

On Unity's side, I've found a fork of FFmpegOut which can stream the camera view over UDP, either as a raw stream or over RTSP. It adds about 2 s of latency, but it's the only option I have found.

From Android, I'd like to connect to this stream and decode it into YUV frames. I already have an OpenGL renderer to display these frames on screen.

However I am very confused with the options that I've found:

  • FFmpeg seems fairly complicated to use in an Android app; I couldn't find a simple library that doesn't require the NDK
  • MediaCodec might also be an option (see this SO), but I'm also confused about how to decode a stream of H.264 frames to YUV
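For the MediaCodec route, one prerequisite I've read about is that a raw H.264 (Annex-B) byte stream has to be split into NAL units at the 00 00 01 / 00 00 00 01 start codes before each unit is queued into the decoder's input buffers. This is a minimal sketch of that splitting step (the `NalSplitter` name is mine, and the actual MediaCodec feeding is only described in the comments):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class NalSplitter {
    /**
     * Split an Annex-B H.264 byte stream into NAL units by scanning for
     * 00 00 01 (or 00 00 00 01) start codes. On Android, each returned
     * unit is roughly what you would copy into a MediaCodec input buffer
     * obtained via dequeueInputBuffer(), then submit with queueInputBuffer().
     */
    public static List<byte[]> split(byte[] stream) {
        List<byte[]> units = new ArrayList<>();
        int payload = -1; // start of current NAL payload; -1 until first code
        for (int i = 0; i + 2 < stream.length; i++) {
            if (stream[i] == 0 && stream[i + 1] == 0 && stream[i + 2] == 1) {
                if (payload >= 0) {
                    int end = i;
                    // A 4-byte start code contributes one extra leading zero
                    if (end > payload && stream[end - 1] == 0) end--;
                    units.add(Arrays.copyOfRange(stream, payload, end));
                }
                payload = i + 3; // payload begins right after the start code
                i += 2;
            }
        }
        if (payload >= 0) {
            units.add(Arrays.copyOfRange(stream, payload, stream.length));
        }
        return units;
    }
}
```

With the stream split like this, the SPS/PPS units would typically be handed to MediaCodec first (as codec configuration), followed by the frame NAL units; the decoder then emits YUV output buffers.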

I'm not even sure which protocol I should use to stream the data (probably UDP?). Any help or documentation would be greatly appreciated.
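If it ends up being raw data over UDP, the receiving side on Android would just be a datagram loop on a background thread, with each packet handed to the decoder. A minimal sketch (the `UdpReceiver` name and port handling are illustrative, not from any particular library):

```java
import java.net.DatagramPacket;
import java.net.DatagramSocket;
import java.util.Arrays;

public class UdpReceiver {
    /**
     * Block until one UDP datagram arrives on the given socket and return
     * its payload. In a real app this would run in a loop on a background
     * thread, passing each datagram on to the H.264 depacketizer/decoder.
     */
    public static byte[] receiveOne(DatagramSocket socket) throws Exception {
        byte[] buf = new byte[65535]; // maximum UDP payload size
        DatagramPacket packet = new DatagramPacket(buf, buf.length);
        socket.receive(packet); // blocks until a datagram arrives
        return Arrays.copyOf(packet.getData(), packet.getLength());
    }
}
```

Note that with RTSP/RTP instead of raw UDP, each packet carries an RTP header and the H.264 payload may be fragmented (FU-A), so the packets would need depacketizing before reaching the decoder.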

