
I have an SCNView with an object in the middle of the screen; the user can rotate it, scale it, and so on.

I want to record all these movements to a video and add sound in real time. I also want to record only the middle part of the SCNView (e.g. the SCNView frame is 375x812, but I want only the middle 375x375, without the top and bottom borders). And I want to show it on screen while the video is being captured.

My current variants are:

func renderer(_ renderer: SCNSceneRenderer, didRenderScene scene: SCNScene, atTime time: TimeInterval) {
    DispatchQueue.main.async {
        guard let metalLayer = self.sceneView.layer as? CAMetalLayer,
              let texture = metalLayer.currentSceneDrawable?.texture,
              let pixelBufferPool = self.pixelBufferPool else { return }

        // 1: copy the texture bytes into a pooled CVPixelBuffer
        var maybePixelBuffer: CVPixelBuffer? = nil
        let status = CVPixelBufferPoolCreatePixelBuffer(nil, pixelBufferPool, &maybePixelBuffer)
        guard status == kCVReturnSuccess, let pixelBuffer = maybePixelBuffer else { return }

        CVPixelBufferLockBaseAddress(pixelBuffer, [])

        // bytesPerRow must describe the destination pixel buffer
        let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
        let region = MTLRegionMake2D(Int(self.fieldOfView.origin.x * UIScreen.main.scale),
                                     Int(self.fieldOfView.origin.y * UIScreen.main.scale),
                                     Int(self.fieldOfView.width * UIScreen.main.scale),
                                     Int(self.fieldOfView.height * UIScreen.main.scale))

        let pixelBufferBytes = CVPixelBufferGetBaseAddress(pixelBuffer)!
        texture.getBytes(pixelBufferBytes, bytesPerRow: bytesPerRow, from: region, mipmapLevel: 0)

        let uiImage = self.image(from: pixelBuffer)

        CVPixelBufferUnlockBaseAddress(pixelBuffer, [])

        // 2: wrap the texture's IOSurface in a CVPixelBuffer (MTLTexture.iosurface is iOS 11+)
        if #available(iOS 11.0, *) {
            var unmanagedPixelBuffer: Unmanaged<CVPixelBuffer>? = nil
            CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, texture.iosurface!, nil, &unmanagedPixelBuffer)
            // "Create" rule: take the +1 retain, otherwise the buffer is over-released
            let imageBuffer = unmanagedPixelBuffer!.takeRetainedValue()
        } else {
            // Fallback on earlier versions
        }

        // 3: let Core Image render the texture into a fresh CVPixelBuffer
        var pb: CVPixelBuffer? = nil
        let result = CVPixelBufferCreate(kCFAllocatorDefault, texture.width, texture.height, kCVPixelFormatType_32BGRA, nil, &pb)
        print(result)
        let ciImage = CIImage(mtlTexture: texture, options: nil)
        let context = CIContext()
        context.render(ciImage!, to: pb!)
    }
}

The obtained CVPixelBuffer will then be appended to an AVAssetWriter.
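For context, the appending would typically go through an AVAssetWriterInputPixelBufferAdaptor. A minimal sketch, assuming a 375×375-point crop at @2x; the names `videoInput`, `adaptor`, and `append` are my own, not from the project:

```swift
import AVFoundation

// Hypothetical writer setup for a 750x750-pixel (375 pt @2x) video track.
// AVVideoCodecH264 is used instead of AVVideoCodecType.h264 for iOS 10 support.
let videoSettings: [String: Any] = [
    AVVideoCodecKey: AVVideoCodecH264,
    AVVideoWidthKey: 750,
    AVVideoHeightKey: 750
]
let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
videoInput.expectsMediaDataInRealTime = true

let adaptor = AVAssetWriterInputPixelBufferAdaptor(
    assetWriterInput: videoInput,
    sourcePixelBufferAttributes: [
        kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA,
        kCVPixelBufferWidthKey as String: 750,
        kCVPixelBufferHeightKey as String: 750
    ])

// Called from the render callback with the captured buffer and scene time.
func append(_ pixelBuffer: CVPixelBuffer, at time: TimeInterval) {
    guard videoInput.isReadyForMoreMediaData else { return }
    let pts = CMTime(seconds: time, preferredTimescale: 600)
    adaptor.append(pixelBuffer, withPresentationTime: pts)
}
```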

But all of these methods have flaws:

1) The MTLTexture has colorPixelFormat == 555 (bgra10_XR_sRGB, if I recall correctly), and I don't know how to convert it to BGRA (to append it to the assetWriter), how to change that colorPixelFormat, or how to feed bgra10_XR_sRGB to the assetWriter.
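One possible workaround for the pixel-format problem (a sketch under assumptions, not the asker's code): render the scene offscreen with SCNRenderer into a texture whose format you choose, so the layer's bgra10_XR_sRGB drawable never enters the pipeline. The scene/pointOfView parameters and the 375-point crop size are assumptions:

```swift
import SceneKit
import Metal
import UIKit

// Sketch: offscreen SCNRenderer targeting a plain BGRA texture sized to the crop.
func makeOffscreenRenderer(scene: SCNScene, pointOfView: SCNNode)
        -> (renderer: SCNRenderer, queue: MTLCommandQueue, target: MTLTexture) {
    let device = MTLCreateSystemDefaultDevice()!
    let queue = device.makeCommandQueue()!
    let renderer = SCNRenderer(device: device, options: nil)
    renderer.scene = scene
    renderer.pointOfView = pointOfView

    let side = Int(375 * UIScreen.main.scale)  // 375 pt crop, assumed
    let desc = MTLTextureDescriptor.texture2DDescriptor(
        pixelFormat: .bgra8Unorm,              // AVAssetWriter-friendly format
        width: side, height: side, mipmapped: false)
    desc.usage = [.renderTarget, .shaderRead]
    return (renderer, queue, device.makeTexture(descriptor: desc)!)
}

// Per frame: render the scene at `time` into the BGRA target texture.
func renderFrame(_ r: (renderer: SCNRenderer, queue: MTLCommandQueue, target: MTLTexture),
                 atTime time: TimeInterval) {
    let pass = MTLRenderPassDescriptor()
    pass.colorAttachments[0].texture = r.target
    pass.colorAttachments[0].loadAction = .clear
    pass.colorAttachments[0].storeAction = .store

    let commandBuffer = r.queue.makeCommandBuffer()!
    r.renderer.render(atTime: time,
                      viewport: CGRect(x: 0, y: 0,
                                       width: r.target.width, height: r.target.height),
                      commandBuffer: commandBuffer,
                      passDescriptor: pass)
    commandBuffer.commit()
}
```

This also sidesteps the cropping question, since the render target already has the crop's dimensions.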

2) How can this be implemented for iOS 10?
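A possible iOS 10-compatible inversion of variant 2 (a sketch under assumptions, not verified on device): since MTLTexture.iosurface is iOS 11+, create a Metal-compatible CVPixelBuffer first and render into a texture that shares its memory via CVMetalTextureCache; the buffer then holds the frame without any copy. The 750×750 size is an assumption:

```swift
import CoreVideo
import Metal

// Sketch: a CVPixelBuffer-backed Metal render target (works on iOS 10).
let device = MTLCreateSystemDefaultDevice()!
let width = 750, height = 750   // 375 pt crop at @2x, assumed

var textureCache: CVMetalTextureCache?
CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &textureCache)

let attrs: [String: Any] = [
    kCVPixelBufferMetalCompatibilityKey as String: true,
    kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
]
var pixelBuffer: CVPixelBuffer?
CVPixelBufferCreate(kCFAllocatorDefault, width, height,
                    kCVPixelFormatType_32BGRA, attrs as CFDictionary, &pixelBuffer)

var cvTexture: CVMetalTexture?
CVMetalTextureCacheCreateTextureFromImage(kCFAllocatorDefault, textureCache!,
                                          pixelBuffer!, nil, .bgra8Unorm,
                                          width, height, 0, &cvTexture)

// Render (or blit) each frame into this texture; `pixelBuffer` then already
// contains the pixels and can go straight to the asset writer.
let renderTarget = CVMetalTextureGetTexture(cvTexture!)!
```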

2, 3) What is the fastest way to crop an image? Using these methods I can only grab the full image instead of a cropped one, and I don't want to convert it to a UIImage because that is too slow.
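On the cropping question, the region-based copy from variant 1 can already avoid a full-frame grab: the key is that the destination pixel buffer has the crop's dimensions, so `getBytes(_:bytesPerRow:from:mipmapLevel:)` touches only the middle square and no UIImage is involved. A sketch along those lines, assuming a 4-bytes-per-pixel texture format; the helper name is hypothetical:

```swift
import CoreVideo
import Metal
import UIKit

// Sketch: copy only a subregion of the drawable texture into a crop-sized
// CVPixelBuffer, with no intermediate UIImage.
func cropPixelBuffer(from texture: MTLTexture, cropRect: CGRect) -> CVPixelBuffer? {
    let scale = UIScreen.main.scale
    let region = MTLRegionMake2D(Int(cropRect.origin.x * scale),
                                 Int(cropRect.origin.y * scale),
                                 Int(cropRect.width * scale),
                                 Int(cropRect.height * scale))

    // Destination buffer has exactly the crop's pixel dimensions.
    var maybeBuffer: CVPixelBuffer?
    CVPixelBufferCreate(kCFAllocatorDefault, region.size.width, region.size.height,
                        kCVPixelFormatType_32BGRA, nil, &maybeBuffer)
    guard let buffer = maybeBuffer else { return nil }

    CVPixelBufferLockBaseAddress(buffer, [])
    defer { CVPixelBufferUnlockBaseAddress(buffer, []) }

    // bytesPerRow must describe the destination buffer, not the full texture.
    texture.getBytes(CVPixelBufferGetBaseAddress(buffer)!,
                     bytesPerRow: CVPixelBufferGetBytesPerRow(buffer),
                     from: region, mipmapLevel: 0)
    return buffer
}
```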

P.S. My previous viewer used OpenGL ES (GLKView), and I successfully did this with the same technique (1 ms of overhead instead of 30 ms with the .screenshot method).
