
I'm having some trouble with PBJVision.

I integrated it into a Swift project, and when I try to set the frame of the preview layer, it results in an inconsistent frame (screenshot: img_1678).

I have this code:

var _previewLayer: AVCaptureVideoPreviewLayer = PBJVision.sharedInstance().previewLayer
@IBOutlet weak var previewView: UIView! // this is the white view in the back

let bounds = self.previewView.layer.bounds
_previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
_previewLayer.bounds = bounds
_previewLayer.position = CGPointMake(CGRectGetMidX(bounds), CGRectGetMidY(bounds))
previewView.layer.addSublayer(_previewLayer)
Dani Pralea
  • Where do you put this code? Are you building your interface with Auto Layout? If so, and you run this code before layout is ready, you get the wrong size in the `bounds` variable. You could try putting it inside the `viewDidLayoutSubviews` method – rkyr Feb 01 '16 at 08:36
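The comment's suggestion can be sketched roughly as follows. This is a hypothetical view controller, assuming the same `previewView` outlet and PBJVision singleton from the question; the key point is that the layer's frame is set in `viewDidLayoutSubviews`, after Auto Layout has produced the final bounds:

```swift
import UIKit
import AVFoundation

class CameraViewController: UIViewController {

    @IBOutlet weak var previewView: UIView!

    // Preview layer obtained from PBJVision, as in the question
    let _previewLayer: AVCaptureVideoPreviewLayer = PBJVision.sharedInstance().previewLayer

    override func viewDidLoad() {
        super.viewDidLoad()
        _previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewView.layer.addSublayer(_previewLayer)
    }

    override func viewDidLayoutSubviews() {
        super.viewDidLayoutSubviews()
        // Auto Layout has resolved the view's final bounds by this point,
        // so the layer picks up the correct size on every layout pass
        _previewLayer.frame = previewView.layer.bounds
    }
}
```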

2 Answers


Have you tried resizing the frame of the _previewLayer when the previewView lays out its subviews?

Subclass UIView for your previewView, add a property for your AVCaptureVideoPreviewLayer, and set its frame to match your view's bounds:

class PreviewView: UIView {

    var previewLayer: AVCaptureVideoPreviewLayer?

    override func layoutSubviews() {
        super.layoutSubviews()
        self.previewLayer?.frame = self.bounds
    }
}
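For completeness, the subclass could be wired up like this. This is a sketch under assumptions not stated in the answer: `PreviewView` is set as the custom class of the outlet in the storyboard, and the layer still comes from PBJVision as in the question:

```swift
import UIKit
import AVFoundation

class ViewController: UIViewController {

    @IBOutlet weak var previewView: PreviewView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // Hand the layer to the subclass; its layoutSubviews then keeps
        // the layer's frame in sync with the view's bounds
        previewView.previewLayer = PBJVision.sharedInstance().previewLayer
        previewView.layer.addSublayer(previewView.previewLayer!)
    }
}
```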
JAL

I had the same issue.

I tried @rkyr's recommendation (setting the frame in the `viewDidLayoutSubviews` method) with no success, and I also wanted to fix it without adding subclasses.

I was able to fix it with two changes:

  • Ensure that previewLayer is declared as AVCaptureVideoPreviewLayer (a subclass of CALayer) instead of CALayer, which was my mistake:

var previewLayer: AVCaptureVideoPreviewLayer? // not: CALayer

  • Ensure that the layer's frame is set, and the layer is added to the view, only after the capture session has been fully configured. This code works:

    // Configure the capture session with the default video device
    captureSession = AVCaptureSession()
    let captureDevice = AVCaptureDevice.default(for: AVMediaType.video)
    do {
        let input = try AVCaptureDeviceInput(device: captureDevice!)
        captureSession?.addInput(input)
    } catch {
        onErrorCloseAndReturn(error: error)
    }

    // Deliver BGRA frames to the sample-buffer delegate
    let dataOutput = AVCaptureVideoDataOutput()
    dataOutput.videoSettings = [(kCVPixelBufferPixelFormatTypeKey as String): NSNumber(value: kCVPixelFormatType_32BGRA)]
    dataOutput.alwaysDiscardsLateVideoFrames = true
    if captureSession!.canAddOutput(dataOutput) {
        captureSession?.addOutput(dataOutput)
    }
    captureSession?.commitConfiguration()

    let queue = DispatchQueue(label: "captureQueue")
    dataOutput.setSampleBufferDelegate(self, queue: queue)
    captureSession?.startRunning()

    // Only now create the preview layer, size it, and attach it
    previewLayer = AVCaptureVideoPreviewLayer(session: captureSession!)
    previewLayer?.frame = self.view.layer.bounds
    self.view.layer.addSublayer(previewLayer!)
    

I hope it helps.

eharo2