
I want to play a video (with sound) and record video from the front-facing camera at the same time. The view finder for the camera should appear as a small "picture-in-picture" in the bottom right hand corner of the screen while the movie plays full screen behind it. Is this possible? Is layering the appropriate classes on top of each other possible?

– zakdances

1 Answer


Check out the AVFoundation framework, which is used for much of the audio and video programming in iOS.

In your case you could use an AVPlayer and AVPlayerLayer to play your movie, and an AVCaptureSession, an AVCaptureVideoPreviewLayer, and an AVCaptureMovieFileOutput to record.
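A minimal sketch of wiring those classes together might look like the following (this assumes modern Swift and iOS 10+; `movieURL`, `outputURL`, and `recordingDelegate` are placeholder names, not anything from the original answer):

```swift
import AVFoundation

// Playback side: an AVPlayer drives an AVPlayerLayer.
let player = AVPlayer(url: movieURL)            // movieURL: placeholder URL of the movie to play
let playerLayer = AVPlayerLayer(player: player)

// Capture side: a session with the front camera as input
// and an AVCaptureMovieFileOutput to write the recording.
let session = AVCaptureSession()
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                        for: .video, position: .front),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}

let movieOutput = AVCaptureMovieFileOutput()
if session.canAddOutput(movieOutput) {
    session.addOutput(movieOutput)
}

// The preview layer shows the live camera feed.
let previewLayer = AVCaptureVideoPreviewLayer(session: session)

session.startRunning()
player.play()
// recordingDelegate: a placeholder object conforming to
// AVCaptureFileOutputRecordingDelegate.
movieOutput.startRecording(to: outputURL, recordingDelegate: recordingDelegate)
```

You would also want to add an audio input to the session if the recording should capture sound, and configure an AVAudioSession category that allows simultaneous playback and recording.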

If you are familiar with Core Animation, you can set the frames of the AVPlayerLayer and the AVCaptureVideoPreviewLayer and add them as sublayers of your view's layer to achieve your desired interface layout.
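For the picture-in-picture arrangement described in the question, that layer layout could be sketched like this (the sizes and insets are purely illustrative):

```swift
// Movie fills the whole view; camera preview sits in the
// bottom-right corner as a small picture-in-picture.
playerLayer.frame = view.bounds
previewLayer.frame = CGRect(x: view.bounds.maxX - 130,
                            y: view.bounds.maxY - 170,
                            width: 120, height: 160)
previewLayer.videoGravity = .resizeAspectFill  // crop to fill the small rect

view.layer.addSublayer(playerLayer)
view.layer.addSublayer(previewLayer)  // added last, so it draws on top
```

Sublayer order determines stacking, so adding the preview layer after the player layer keeps the view finder in front of the full-screen movie.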

These classes are very well documented, and the AVFoundation Programming Guide clearly explains their interaction.

Feel free to comment with any questions.

– spudwaffle
  • I know this is late, but I did this, and I get a delay using my method. When the user clicks record, I start recording the `AVCaptureSession`, and when the delegate method `-sessionDidStartRecording` fires, I call `[player play]` to play the other video. Using this as is, the video will be out of sync by about half a second. When the player reaches the end of the player item, I stop the recording. By aligning the two videos at the end of the recording instead, I get closer to sync, but not all the way. It turns out the recorded video starts recording half a second too soon, and stops 2/30 of a second too late. – Sti Feb 12 '14 at 22:07