
This question may sound a little complex or ambiguous, but I'll try to make it as clear as I can. I have done lots of Googling and spent lots of time, but didn't find anything relevant for Windows.

I want to play two videos on a single screen: one full screen in the background, and one on top of it in a small window (small width/height) in the right corner. Then I want an output that consists of both videos playing together on a single screen.

So basically one video overlays the other, and I want that result streamed as output so the user can play that stream later.

I am not asking you to write the whole code; just tell me what to do, how to do it, or which tool or third-party SDK I have to use to make it happen.

Update: I have tried a lot of solutions.

1. Xuggler: doesn't support Android.

2. JavaCV or JJMPEG: I was not able to find any tutorial that shows how to do it.

Now I am looking at FFmpeg. I searched for a long time but couldn't find any tutorial that shows how to do it in code; I only found the command-line way to do it. So can anyone point me to an FFmpeg tutorial that covers this, or suggest any other way?
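For the record, the command-line way I found uses ffmpeg's overlay filter, roughly like this (file names and the 10 px margin are just placeholders):

```
# Draw small.mp4 over main.mp4 in the top-right corner, 10 px from the edges.
ffmpeg -i main.mp4 -i small.mp4 \
  -filter_complex "[0:v][1:v]overlay=main_w-overlay_w-10:10" \
  output.mp4
```

What I'm looking for is how to do the equivalent of this in code.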

Anil
  • Stuck on this problem for a long time; I tried to use ffmpeg on Windows, but I don't have strong knowledge of C/C++ and also wasn't able to find the code on Google. – Vikas Gupta Mar 19 '13 at 10:45
  • I also tried FFmpeg and hit the same problem due to lack of knowledge. Is there any way we can do it in Java? – Anil Mar 19 '13 at 10:48
  • If you are able to do PIP (picture-in-picture), then I have a simple solution for you. – Varun Mar 19 '13 at 12:00
  • @Varun PIP? And what's the solution, please? – Anil Mar 19 '13 at 12:05
  • In the case of PIP, both videos should have the same duration. You merge the two videos using ffmpeg, start playing from zero in the full-screen view, and in the small view seek to the total duration divided by two. – Varun Mar 19 '13 at 12:14
  • @Varun can you point to any tutorial, or write an answer to explain it a bit more? That would be a great help. :) – Vikas Gupta Mar 21 '13 at 13:35
  • @Anil, how important is the speed of this operation? – Phil Apr 09 '13 at 15:46
  • You could also look at SMIL: http://en.wikipedia.org/wiki/Synchronized_Multimedia_Integration_Language . Not sure if there is a player for Android, but it is something worth checking out. – Menelaos Apr 09 '13 at 21:50
  • Any idea or solution please? – AnswerZhao Feb 28 '17 at 05:45

3 Answers


I would start with JavaCV. It's quite good and flexible. It should allow you to grab frames, composite them, and write them back to a file. Use the FFmpegFrameGrabber and FFmpegFrameRecorder classes. The composition can be done manually.

The rest of the answer depends on a few things:

  • do you want to read from a file/mem/url?
  • do you want to save to a file/mem/url?
  • do you need realtime processing?
  • do you need something more than simple picture-in-picture?
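To illustrate the manual composition step: here is a minimal sketch of simple picture-in-picture using only java.awt. In a real pipeline the two BufferedImages would come from two FFmpegFrameGrabber instances (via JavaCV's frame-to-BufferedImage conversion) and each composited frame would be pushed into an FFmpegFrameRecorder; the frame sizes, margin, and quarter-size scaling here are my own assumptions.

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;

public class PipComposite {

    // Draws 'small' into the bottom-right corner of 'canvas' in place,
    // scaled to a quarter of the canvas size and inset by 'margin' pixels.
    static BufferedImage composite(BufferedImage canvas, BufferedImage small, int margin) {
        int w = canvas.getWidth() / 4;
        int h = canvas.getHeight() / 4;
        Graphics2D g = canvas.createGraphics();
        g.drawImage(small, canvas.getWidth() - w - margin,
                canvas.getHeight() - h - margin, w, h, null);
        g.dispose();
        return canvas;
    }

    public static void main(String[] args) {
        // Stand-ins for two decoded video frames: a black background frame
        // and an all-white overlay frame.
        BufferedImage bg = new BufferedImage(640, 480, BufferedImage.TYPE_INT_RGB);
        BufferedImage fg = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = fg.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, 320, 240);
        g.dispose();

        composite(bg, fg, 10);

        // The inset window area is now white; the rest of the frame is still black.
        System.out.println(bg.getRGB(560, 410) == Color.WHITE.getRGB());  // true
        System.out.println(bg.getRGB(0, 0) == Color.BLACK.getRGB());      // true
    }
}
```

In the real loop you would grab one frame from each video per iteration, composite, and record until the background grabber runs out of frames.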
Zielony
  • Thanks for replying so quickly, but right now I want to read two video files, show them together, and get that stream as an output. BTW, does it work in a Windows environment? – Anil Mar 19 '13 at 10:53
  • So JavaCV is the way to go. – Zielony Mar 19 '13 at 10:55
  • I've found it on Google: http://www.quack-ware.com/2011/include-ffmpeg-in-android-and-grab-a-frame-from-a-3gp-video/ FFmpegFrameGrabber is very simple to use, so you can just read the documentation and go for it by yourself. – Zielony Mar 19 '13 at 11:04
  • @Zielony Thanks, but it does not show how to play two videos and retrieve the result as an output. – Anil Mar 19 '13 at 11:28
  • @Zielony hey, can you point me to any good tutorial or docs? The one you mentioned doesn't show how to play two different videos. Do I have to go for JNI code? I don't have any idea of that. – Vikas Gupta Mar 21 '13 at 13:33
  • What happens with the audio playback if he uses a computer vision library for this? – Menelaos Apr 10 '13 at 08:22

You could use OpenGL to do the trick. Please note, however, that you will need two render steps: one rendering the first video into an FBO, and a second rendering the second video, using the FBO as TEXTURE0 and the second video as an EXTERNAL_TEXTURE.

Blending and all the other stuff you want can be done by OpenGL.

You can check the source code here: Using SurfaceTexture in Android, and some important information here: Android OpenGL combination of SurfaceTexture (external image) and ordinary texture

The only thing I'm not sure about is what happens when two instances of MediaPlayer run in parallel. I guess it should not be a problem.
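To make the external-texture pass concrete, the fragment shader that samples a video frame from a SurfaceTexture looks roughly like this (a sketch; the uniform and varying names are made up):

```glsl
#extension GL_OES_EGL_image_external : require
precision mediump float;

uniform samplerExternalOES uVideoTex;  // fed by the SurfaceTexture of one MediaPlayer
varying vec2 vTexCoord;

void main() {
    gl_FragColor = texture2D(uVideoTex, vTexCoord);
}
```

The picture-in-picture placement comes from the geometry, not the shader: draw a full-screen quad with the background video's texture first, then a small quad in the corner with the second video's texture.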

Marco

ffmpeg is a very active project, with lots of changes and releases all the time.

You should look at the Xuggler project; it provides a Java API for what you want to do, and it has tight integration with ffmpeg.

http://www.xuggle.com/xuggler/

Should you choose to go down the Runtime.exec() path, this Red5 thread should be useful:

http://www.nabble.com/java-call-ffmpeg-ts15886850.html
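If you do go that way, the Java side mostly reduces to assembling the ffmpeg arguments and launching the process. A sketch (file names and the overlay position are placeholders, and actually running it assumes an ffmpeg binary on the PATH):

```java
import java.util.Arrays;
import java.util.List;

public class FfmpegOverlayCommand {

    // Builds an ffmpeg invocation that draws 'small' over 'main' in the
    // top-right corner, 10 px from the edges, writing the result to 'out'.
    static List<String> buildCommand(String main, String small, String out) {
        return Arrays.asList("ffmpeg", "-i", main, "-i", small,
                "-filter_complex", "[0:v][1:v]overlay=main_w-overlay_w-10:10",
                out);
    }

    public static void main(String[] args) throws Exception {
        List<String> cmd = buildCommand("background.mp4", "pip.mp4", "output.mp4");
        System.out.println(String.join(" ", cmd));
        // Requires ffmpeg on the PATH; uncomment to actually run it:
        // new ProcessBuilder(cmd).inheritIO().start().waitFor();
    }
}
```

Building the argument list explicitly (rather than one big command string) avoids shell-quoting problems with the filter expression.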

Varun