
I am currently working on a snippet of code which looks like the following:

if error == nil && (captureSession?.canAddInput(input))!
{
    captureSession?.addInput(input)

    stillImageOutput = AVCaptureStillImageOutput()
    //let settings = AVCapturePhotoSettings()
    //settings.availablePreviewPhotoPixelFormatTypes =
    stillImageOutput?.outputSettings = [AVVideoCodecKey : AVVideoCodecJPEG]

    if (captureSession?.canAddOutput(stillImageOutput))!
    {
        captureSession?.addOutput(stillImageOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer!)
        captureSession?.startRunning()
    }
}

I am aware that I should be using AVCapturePhotoOutput() instead of AVCaptureStillImageOutput() but am confused as to how I can transform the rest of this block if I make that change.

Specifically, how can I apply the same settings using the commented-out `let settings = AVCapturePhotoSettings()`?

For reference, I am using this tutorial as a guide.

Thanks

User 5842

1 Answer


The Apple documentation explains very clearly how to use AVCapturePhotoOutput.

These are the steps to capture a photo.

  • Create an AVCapturePhotoOutput object. Use its properties to determine supported capture settings and to enable certain features (for example, whether to capture Live Photos).
  • Create and configure an AVCapturePhotoSettings object to choose features and settings for a specific capture (for example, whether to enable image stabilization or flash).
  • Capture an image by passing your photo settings object to the capturePhoto(with:delegate:) method along with a delegate object implementing the AVCapturePhotoCaptureDelegate protocol. The photo capture output then calls your delegate to notify you of significant events during the capture process.
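As a rough sketch of the first step, the session-configuration part of the question's block could look like this with AVCapturePhotoOutput in place of AVCaptureStillImageOutput. This assumes the same captureSession, input, error, previewLayer and cameraView as in the question's snippet, keeps its force-unwrapping style, and omits availability checks:

let cameraOutput = AVCapturePhotoOutput()

if error == nil && (captureSession?.canAddInput(input))!
{
    captureSession?.addInput(input)

    // Unlike AVCaptureStillImageOutput, there is no outputSettings dictionary here;
    // the codec is chosen per capture via AVCapturePhotoSettings (see below).
    if (captureSession?.canAddOutput(cameraOutput))!
    {
        captureSession?.addOutput(cameraOutput)

        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer?.videoGravity = AVLayerVideoGravityResizeAspect
        previewLayer?.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer!)
        captureSession?.startRunning()
    }
}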

Have the code below in your clickCapture method, and don't forget to conform to and implement the AVCapturePhotoCaptureDelegate protocol in your class.

// Configure the settings for this capture, including a small preview image.
let settings = AVCapturePhotoSettings()
let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first!
let previewFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                     kCVPixelBufferWidthKey as String: 160,
                     kCVPixelBufferHeightKey as String: 160]
settings.previewPhotoFormat = previewFormat

// Trigger the capture; the result is delivered to the delegate.
self.cameraOutput.capturePhoto(with: settings, delegate: self)
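Since the capture is asynchronous, the result arrives in the delegate. Here is a minimal sketch of the iOS 10-era AVCapturePhotoCaptureDelegate callback; the class name ViewController and what is done with the image are placeholders:

import AVFoundation
import UIKit

extension ViewController: AVCapturePhotoCaptureDelegate {

    // Called once the photo sample buffer is ready.
    func capture(_ captureOutput: AVCapturePhotoOutput,
                 didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
                 previewPhotoSampleBuffer: CMSampleBuffer?,
                 resolvedSettings: AVCaptureResolvedPhotoSettings,
                 bracketSettings: AVCaptureBracketedStillImageSettings?,
                 error: Error?)
    {
        guard error == nil, let sampleBuffer = photoSampleBuffer else {
            print("Photo capture failed: \(String(describing: error))")
            return
        }

        // Convert the JPEG sample buffer to Data, then to a UIImage.
        if let imageData = AVCapturePhotoOutput.jpegPhotoDataRepresentation(
                forJPEGSampleBuffer: sampleBuffer,
                previewPhotoSampleBuffer: previewPhotoSampleBuffer),
            let image = UIImage(data: imageData)
        {
            // Use the captured image, e.g. show it in an image view or save it.
            print("Captured image with size \(image.size)")
        }
    }
}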

If you would like to know about the different ways of capturing a photo with AVFoundation, check out my previous SO answer.

Bluewings
  • This is the answer that I have been seeing for some time but am still not sure how to implement it. For example, my code snippet chose a codec for JPEG. Where can I do the same with this code? – User 5842 Jan 02 '17 at 16:52
  • 2
    when init for `AVCapturePhotoSettings` you have option to init with format which is a dictionary. You can set the AVVideoCodecKey in that dictionary. links [here](https://developer.apple.com/reference/avfoundation/avcapturephotosettings) [here](https://developer.apple.com/reference/avfoundation/avcapturephotosettings/1648673-init) and [here](https://developer.apple.com/reference/avfoundation/avcapturephotosettings/1648783-format). It is always best to check the apple documentations for details – Bluewings Jan 03 '17 at 11:23
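To tie that back to the original snippet's [AVVideoCodecKey : AVVideoCodecJPEG] dictionary, a minimal sketch using the format-based initializer, assuming the same cameraOutput as in the answer and the iOS 10 constants:

// The codec that used to go into stillImageOutput.outputSettings is now passed
// into the per-capture settings object.
let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecJPEG])

if let previewPixelType = settings.availablePreviewPhotoPixelFormatTypes.first {
    settings.previewPhotoFormat = [kCVPixelBufferPixelFormatTypeKey as String: previewPixelType,
                                   kCVPixelBufferWidthKey as String: 160,
                                   kCVPixelBufferHeightKey as String: 160]
}

self.cameraOutput.capturePhoto(with: settings, delegate: self)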