The problem I am facing is that the image taken from the camera is larger than the one shown in the live view, which I have set up with Aspect Fill.
The captured image is about 4000x3000, while the view showing the live camera feed is 375x800 (full-screen iPhone X size). How do I transform/cut out the part of the captured image that matches what is shown in the live view, so I can manipulate it further (draw over it)?
As far as I understand, the Aspect Fill setting clips the part of the image that cannot fit in the view. But that clipping does not start at x = 0, y = 0; it happens somewhere toward the middle of the image. So how do I find that x and y on the original image, so I can crop out exactly the visible part?
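For what it's worth, the aspect-fill crop can be computed directly: the layer scales the image by the larger of the two width/height ratios, and the overflow along the other axis is split evenly between both sides, which is why the crop does not start at (0, 0). A minimal sketch of that math in Swift (the function name is my own):

```swift
import Foundation

/// Returns the sub-rectangle of an image (in image pixel coordinates)
/// that stays visible when the image is shown aspect-filled inside a
/// view of the given size.
func aspectFillCropRect(imageSize: CGSize, viewSize: CGSize) -> CGRect {
    // Aspect fill scales by the larger of the two ratios so the
    // image covers the whole view.
    let scale = max(viewSize.width / imageSize.width,
                    viewSize.height / imageSize.height)

    // Size of the visible region, expressed in image coordinates.
    let visibleWidth = viewSize.width / scale
    let visibleHeight = viewSize.height / scale

    // The clipped-off part is distributed equally on both sides,
    // which is why the crop origin is not (0, 0).
    let x = (imageSize.width - visibleWidth) / 2
    let y = (imageSize.height - visibleHeight) / 2

    return CGRect(x: x, y: y, width: visibleWidth, height: visibleHeight)
}
```

For a 3024x4032 image in a 375x818 view this gives a crop of roughly (588, 0) with size 1848x4032, i.e. the full height is visible and the sides are clipped.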
I hope I explained well enough.
EDIT:
To give more context, here are some code snippets to make the issue easier to understand.
Setting up my camera with the .resizeAspectFill gravity:
cameraPreviewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
cameraPreviewLayer?.videoGravity = AVLayerVideoGravity.resizeAspectFill
cameraPreviewLayer?.connection?.videoOrientation = AVCaptureVideoOrientation.portrait
// Use bounds, not frame: a sublayer's frame must be expressed in the
// superlayer's own coordinate space.
cameraPreviewLayer?.frame = self.captureView.bounds
self.captureView.layer.addSublayer(cameraPreviewLayer!)
which is displayed in the live view (captureView), whose size is 375x818 (width: 375, height: 818).
Then, on a button tap, I capture an image from that camera, and its size is:
3024x4032 (width: 3024, height: 4032)
So what I want to do is crop the camera image so it matches exactly what the aspect-filled live view (captureView) shows.
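Rather than doing the math by hand, AVCaptureVideoPreviewLayer can report which part of the capture output it is showing: metadataOutputRectConverted(fromLayerRect:) maps a rect in the layer's coordinates to a normalized (0...1) rect in the output image. A sketch, assuming cameraPreviewLayer and the captured UIImage (called image here) are the ones from the snippets above:

```swift
import AVFoundation
import UIKit

func cropToPreview(_ image: UIImage,
                   previewLayer: AVCaptureVideoPreviewLayer) -> UIImage? {
    // Normalized (0...1) rect of the region the preview layer shows,
    // in the coordinate space of the captured output.
    let outputRect = previewLayer.metadataOutputRectConverted(
        fromLayerRect: previewLayer.bounds)

    guard let cgImage = image.cgImage else { return nil }
    let width = CGFloat(cgImage.width)
    let height = CGFloat(cgImage.height)

    // Scale the normalized rect up to pixel coordinates.
    let cropRect = CGRect(x: outputRect.origin.x * width,
                          y: outputRect.origin.y * height,
                          width: outputRect.width * width,
                          height: outputRect.height * height)

    guard let cropped = cgImage.cropping(to: cropRect) else { return nil }
    // Keep the original orientation metadata so the result displays upright.
    return UIImage(cgImage: cropped,
                   scale: image.scale,
                   orientation: image.imageOrientation)
}
```

Both the metadata-output rect and cgImage use the sensor's unrotated coordinate space, which is why the crop lines up without extra orientation math.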