
I am building an app with AVAssetWriter where I add an overlay to a video. Recording works great when I don't add overlays, but as soon as I do, the video looks cropped in half (as you can see in the screenshot).

Here is my addOverlayToImage function:

func addOverlayToImage(from filteredImage: UIImage) -> UIImage {
    // Render the imageView's layer (the filtered image plus any overlay
    // subviews) into a new UIImage at scale 1.0.
    UIGraphicsBeginImageContextWithOptions(self.imageView.frame.size, false, 1.0)
    self.imageView.layer.render(in: UIGraphicsGetCurrentContext()!)
    let imageWithText = UIGraphicsGetImageFromCurrentImageContext()
    UIGraphicsEndImageContext()
    return imageWithText!
}

I call the function inside captureOutput:

func captureOutput(_ captureOutput: AVCaptureOutput, didOutputSampleBuffer sampleBuffer: CMSampleBuffer!, from connection: AVCaptureConnection) {
    self.bufferVideoQueue.async {
        // Composite the overlay, convert the result to a pixel buffer,
        // and append it to the asset writer.
        let imageWithOverlay = self.addOverlayToImage(from: self.filteredImage)
        let buffer = self.imageToBuffer(from: imageWithOverlay)
        self.assetWriterPixelBufferInput?.append(buffer!, withPresentationTime: self.currentTime)
    }
}

And the imageToBuffer function:

func imageToBuffer(from image: UIImage) -> CVPixelBuffer? {
    let attrs = [
        String(kCVPixelBufferCGImageCompatibilityKey): kCFBooleanTrue,
        String(kCVPixelBufferCGBitmapContextCompatibilityKey): kCFBooleanTrue
    ] as [String: Any]

    // Create an empty pixel buffer matching the image's dimensions.
    var buffer: CVPixelBuffer?
    let status = CVPixelBufferCreate(kCFAllocatorDefault, Int(image.size.width), Int(image.size.height), kCVPixelFormatType_32ARGB, attrs as CFDictionary, &buffer)
    guard status == kCVReturnSuccess else {
        return nil
    }

    CVPixelBufferLockBaseAddress(buffer!, CVPixelBufferLockFlags(rawValue: 0))
    let pixelData = CVPixelBufferGetBaseAddress(buffer!)

    // Wrap the buffer's memory in a CGContext so we can draw into it directly.
    let rgbColorSpace = CGColorSpaceCreateDeviceRGB()
    let context = CGContext(data: pixelData, width: Int(image.size.width), height: Int(image.size.height), bitsPerComponent: 8, bytesPerRow: CVPixelBufferGetBytesPerRow(buffer!), space: rgbColorSpace, bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue)

    // Flip the coordinate system so UIKit drawing comes out right side up.
    context?.translateBy(x: 0, y: image.size.height)
    context?.scaleBy(x: 1.0, y: -1.0)

    UIGraphicsPushContext(context!)
    image.draw(in: CGRect(x: 0, y: 0, width: image.size.width, height: image.size.height))
    UIGraphicsPopContext()
    CVPixelBufferUnlockBaseAddress(buffer!, CVPixelBufferLockFlags(rawValue: 0))

    return buffer
}

And a screenshot from the video:

[Screenshot: the recorded video appears cropped in half]

hackio

2 Answers


Have you tried configuring the AVAssetWriterInput to specify that the data arrives in real time, by setting expectsMediaDataInRealTime to true? This makes a big difference when writing real-time (live camera) data, and the output can be laggy if it isn't set properly.

https://developer.apple.com/documentation/avfoundation/avassetwriterinput

https://developer.apple.com/documentation/avfoundation/avassetwriterinput/1387827-expectsmediadatainrealtime
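
A minimal sketch of that configuration, assuming an AVAssetWriter setup similar to the question's (the `videoSettings` values here are illustrative, not taken from the question):

import AVFoundation

// Illustrative H.264 output settings; match these to your capture resolution.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 1280,
    AVVideoHeightKey: 720
]

let videoInput = AVAssetWriterInput(mediaType: AVMediaTypeVideo, outputSettings: videoSettings)
// Tell the writer this input is fed from a live source (the camera),
// so it tunes its internal buffering for real-time data.
videoInput.expectsMediaDataInRealTime = true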

Tim Bull
  • Hey, I actually use expectsMediaDataInRealTime, and the video is not laggy unless I add an overlay; the addOverlayToImage function makes it laggy. – hackio Sep 08 '17 at 19:23
  • That CVPixelBuffer conversion is very slow, unfortunately. One thing you could look at is counting how many buffers are being vended and how many you are converting. If you are writing the video at a typical rate of 30 FPS, it's possible you're being vended many more CVPixelBuffers than that, in which case the conversion is excessive. Try counting the buffers and make sure you're only transforming as close to the number needed for the desired framerate as you can (a sketch of this follows these comments). – Tim Bull Sep 08 '17 at 20:57
  • Hey Tim, thanks for the detailed answer, but I couldn't figure out how to implement that check. By the way, I didn't make any changes to the FPS, so I guess it is the default value. – hackio Sep 11 '17 at 13:15
  • By the way, adding overlays as in https://stackoverflow.com/a/28907826/1898010 stops the lagging, but my approach seems more useful to me. – hackio Sep 11 '17 at 13:53
  • Remember that the rate at which the camera vends buffers and the end FPS are two separate concepts. The camera throws buffers as fast as it can; the FPS is based on the final video encoding. My guess is that when you're doing the overlay, you're doing way too much conversion. Good luck! – Tim Bull Sep 11 '17 at 18:40
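
A sketch of the buffer throttling described in the comments above, with hypothetical names (`lastProcessedTime`, `minimumFrameInterval`) that are not in the original code:

import CoreMedia

var lastProcessedTime: CMTime?
let minimumFrameInterval = CMTime(value: 1, timescale: 30) // target ~30 FPS

func shouldProcess(_ sampleBuffer: CMSampleBuffer) -> Bool {
    let presentationTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    // Always process the very first buffer.
    guard let last = lastProcessedTime else {
        lastProcessedTime = presentationTime
        return true
    }
    // Skip buffers that arrive sooner than the desired frame interval.
    let elapsed = CMTimeSubtract(presentationTime, last)
    if CMTimeCompare(elapsed, minimumFrameInterval) < 0 {
        return false
    }
    lastProcessedTime = presentationTime
    return true
}

Calling this at the top of captureOutput(...) and returning early when it is false would limit the expensive UIImage-to-CVPixelBuffer conversion to roughly the target framerate.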

I don't have a definitive answer because I haven't seen that error, but I think it might be happening because the processing you're doing on each frame takes longer than the frame time (probably 1/30th of a second).

My suggestion would be to reduce that time as much as possible. From what I can see, you're creating a UIImage from a UIView and then converting that into a CVPixelBuffer.

All of this happens every frame, yet your content doesn't seem to need to change every frame.

I would suggest you store the buffer and, in captureOutput(...), add some logic to check whether the content has changed. If it hasn't, you can reuse the stored buffer; if it has, you can recalculate it, but that will now only happen about once a minute (I'm assuming that from your screenshot), so it shouldn't affect the video.
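
A rough sketch of that caching, reusing the question's addOverlayToImage and imageToBuffer functions; the names cachedOverlayBuffer and overlayNeedsUpdate are hypothetical:

var cachedOverlayBuffer: CVPixelBuffer?
var overlayNeedsUpdate = true // set to true whenever the overlay content changes

func currentOverlayBuffer() -> CVPixelBuffer? {
    // Reuse the stored buffer unless the overlay changed since the last frame.
    if !overlayNeedsUpdate, let cached = cachedOverlayBuffer {
        return cached
    }
    // Expensive path: re-render the overlay and convert it to a pixel buffer.
    let imageWithOverlay = addOverlayToImage(from: filteredImage)
    cachedOverlayBuffer = imageToBuffer(from: imageWithOverlay)
    overlayNeedsUpdate = false
    return cachedOverlayBuffer
}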

Finally, you're executing the code asynchronously, which might be causing issues, so I would recommend removing that part and just executing the code directly in the delegate method. (NOTE: Disregard this if the docs instruct you to do it asynchronously.)

EmilioPelaez
  • Thanks for the answer. I actually plan to add overlays which might change every frame, so this doesn't seem like a solution for me... – hackio Sep 08 '17 at 17:54
  • I would suggest you try the solution with your current setup to see if that is actually the bug. If that's the case, then you will need to reduce the time it takes to render the overlay. Right now you have a `CALayer` which you render to a `CGContext` to create a `UIImage`, which you then draw in a `CGContext` to copy to a `CVPixelBuffer`. Depending on how you're creating the contents of `imageView`, you might be able to reduce all of that to a single step. – EmilioPelaez Sep 08 '17 at 20:07
  • We're facing a similar issue. What would be the single step you're referring to for creating the contents of an `imageView`? For instance, we have a text overlay that only needs to change once every 5 seconds. – Crashalot Dec 15 '17 at 01:26
  • @Crashalot Check the last block of OP's code. If you are drawing your `UIImage` with `CoreGraphics`, you can use that as a reference to go straight from `CoreGraphics` to a `CVPixelBuffer`. This will reduce the number of times the texture is copied, which was most likely causing OP's bug. – EmilioPelaez Dec 15 '17 at 15:33
  • @EmilioPelaez But how do you convert from a CVPixelBuffer to a CMSampleBuffer, which is what you need for an AVAssetWriterInput? Thanks! – Crashalot Dec 15 '17 at 19:38