I'm currently creating a simple application which uses AVFoundation to stream video into a UIImageView.

To achieve this, I created an instance of AVCaptureSession() and set an AVCaptureSessionPreset:
let input = try AVCaptureDeviceInput(device: device)
print(input)

if captureSession.canAddInput(input) {
    captureSession.addInput(input)

    if captureSession.canAddOutput(sessionOutput) {
        captureSession.addOutput(sessionOutput)

        // Attach a preview layer to the view that displays the camera feed
        previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
        previewLayer.videoGravity = AVLayerVideoGravityResizeAspectFill
        previewLayer.connection.videoOrientation = AVCaptureVideoOrientation.portrait
        cameraView.layer.addSublayer(previewLayer)

        captureSession.startRunning()
    }
}
cameraView references the UIImageView outlet.
I now want to implement a way of capturing a still image from the AVCaptureSession.
Correct me if there's a more efficient way, but I plan to have an additional UIImageView to hold the still image, placed on top of the UIImageView which holds the video.
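For context, the overlay I have in mind is roughly the following (stillImageView is a hypothetical second outlet, not a name from my actual project, kept hidden until a photo is taken):

@IBOutlet weak var stillImageView: UIImageView!   // hypothetical overlay placed above cameraView

override func viewDidLoad() {
    super.viewDidLoad()
    // Hide the overlay until a still image has been captured
    stillImageView.isHidden = true
}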
I've created a button with the following action:
@IBAction func takePhoto(_ sender: Any) {
    // functionality to obtain still image
}
My issue is that I'm unsure how to actually obtain a still image from the capture session and populate the new UIImageView with it.
After looking at information/questions posted on Stack Overflow, the majority of the solutions suggest using captureStillImageAsynchronouslyFromConnection.
I'm unsure whether it's a Swift 3.0 issue, but Xcode isn't recognising this function.
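From what I can tell, that method belongs to AVCaptureStillImageOutput (I believe the Swift 3 spelling is captureStillImageAsynchronously(from:completionHandler:)), and the class itself was deprecated in iOS 10 in favour of AVCapturePhotoOutput, which might be why Xcode no longer offers it. Below is a rough sketch of what I've pieced together from the documentation, assuming sessionOutput were swapped for an AVCapturePhotoOutput, the view controller adopted AVCapturePhotoCaptureDelegate, and stillImageView were the hypothetical overlay mentioned above; I'm not confident this is the right approach:

let photoOutput = AVCapturePhotoOutput()   // would replace sessionOutput when configuring the session

@IBAction func takePhoto(_ sender: Any) {
    // Request a single photo capture; the result arrives via the delegate callback below
    photoOutput.capturePhoto(with: AVCapturePhotoSettings(), delegate: self)
}

// AVCapturePhotoCaptureDelegate callback (Swift 3 / iOS 10 naming)
func capture(_ captureOutput: AVCapturePhotoOutput,
             didFinishProcessingPhotoSampleBuffer photoSampleBuffer: CMSampleBuffer?,
             previewPhotoSampleBuffer: CMSampleBuffer?,
             resolvedSettings: AVCaptureResolvedPhotoSettings,
             bracketSettings: AVCaptureBracketedStillImageSettings?,
             error: Error?) {
    guard let buffer = photoSampleBuffer,
          let data = AVCapturePhotoOutput.jpegPhotoDataRepresentation(forJPEGSampleBuffer: buffer,
                                                                      previewPhotoSampleBuffer: previewPhotoSampleBuffer),
          let image = UIImage(data: data) else { return }

    // Show the captured frame in the overlay image view
    stillImageView.image = image
    stillImageView.isHidden = false
}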
Could someone please advise me on how to actually obtain and display a still image when the button is tapped?
Here is a link to my full code, for a better understanding of my program.
Thank you all in advance for taking the time to read my question, and please feel free to tell me if I've missed out any relevant details.