I have a simple AVCaptureSession running to get a camera feed in my app and take photos. How can I implement 'pinch to zoom' functionality using a UIPinchGestureRecognizer for the camera?

-
Where are you adding pinch gesture? – Rahul Oct 01 '17 at 18:25
9 Answers
The accepted answer is outdated, and I'm not sure it will actually capture the photo of the zoomed-in image. There is a method to zoom, as bcattle's answer says. The problem with his answer is that it does not account for the fact that the user can zoom in and then start a new pinch from that zoom position, so his solution produces jumps that are not very elegant.
The easiest and most elegant way of doing this is to use the velocity of the pinch gesture.
- (void)handlePinchToZoomRecognizer:(UIPinchGestureRecognizer *)pinchRecognizer {
    const CGFloat pinchVelocityDividerFactor = 5.0f;

    if (pinchRecognizer.state == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            CGFloat desiredZoomFactor = videoDevice.videoZoomFactor + atan2f(pinchRecognizer.velocity, pinchVelocityDividerFactor);
            // Clamp desiredZoomFactor to the required range [1.0, activeFormat.videoMaxZoomFactor]
            videoDevice.videoZoomFactor = MAX(1.0, MIN(desiredZoomFactor, videoDevice.activeFormat.videoMaxZoomFactor));
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
I found that applying the arctangent function to the velocity eases the zoom-in/zoom-out effect a bit. It is not exactly perfect, but the effect is good enough for most needs. There could probably be another function to ease the zoom out as it approaches 1.
NOTE: The scale of a pinch gesture goes from 0 to infinity, with 0 to 1 being pinching in (zoom out) and 1 to infinity being pinching out (zoom in), so getting a good zoom effect from scale alone requires some math. Velocity, on the other hand, ranges from -infinity to infinity with 0 as the starting point.
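To make the scale-based math in the note concrete, here is one possible mapping that treats the gesture symmetrically by working in log space. This is only an illustrative sketch; the function name, the `sensitivity` constant, and the use of plain `Double` (instead of `CGFloat`) are my own choices, not part of any answer here:

```swift
import Foundation

// Pinch scale is 1.0 at rest, in (0, 1) when pinching in, and in
// (1, ∞) when pinching out. Taking the logarithm makes the two
// directions symmetric: a pinch to 0.5 and a pinch to 2.0 move the
// zoom by equal amounts in opposite directions. The result is then
// clamped to the device's valid range [1.0, maxZoom].
func zoomFactor(forPinchScale scale: Double,
                startingFrom currentZoom: Double,
                maxZoom: Double,
                sensitivity: Double = 1.0) -> Double {
    let proposed = currentZoom * exp(log(scale) * sensitivity)
    return max(1.0, min(proposed, maxZoom))
}
```

In a real handler you would convert the recognizer's `CGFloat` scale, call this with the zoom factor captured at `.began`, and assign the result to `videoZoomFactor` inside a lock/unlock pair.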
EDIT: Fixed crash on range exception. Thanks to @garafajon!

-
Thanks. Use this to avoid a range exception: CGFloat desiredZoom = videoDevice.videoZoomFactor + atan(pinchRecognizer.velocity / pinchZoomScaleFactor); videoDevice.videoZoomFactor = MAX(1.0, MIN(desiredZoom, videoDevice.activeFormat.videoMaxZoomFactor)); – garafajon Oct 01 '15 at 18:32
-
Be aware that the velocity can return NaN. You might want to check for that before moving on with the calculation: if (isnan(pinchRecognizer.velocity)) { return; } – Masa Mar 15 '16 at 08:54
-
Please, review my answer to get an easier way to handle camera zoom level with pinch recognizer https://stackoverflow.com/a/58704702/1705508 – Anton K Nov 05 '19 at 04:52
Swift 4
Add a pinch gesture recognizer to the front-most view and connect it to this action (pinchToZoom). captureDevice should be the instance currently providing input to the capture session; pinchToZoom provides smooth zooming for both the front and back capture devices.
@IBAction func pinchToZoom(_ pinch: UIPinchGestureRecognizer) {
    guard let device = captureDevice else { return }

    // Clamp the factor to [1.0, videoMaxZoomFactor]
    func minMaxZoom(_ factor: CGFloat) -> CGFloat {
        return min(max(factor, 1.0), device.activeFormat.videoMaxZoomFactor)
    }

    func update(scale factor: CGFloat) {
        do {
            try device.lockForConfiguration()
            defer { device.unlockForConfiguration() }
            device.videoZoomFactor = factor
        } catch {
            debugPrint(error)
        }
    }

    let newScaleFactor = minMaxZoom(pinch.scale * zoomFactor)

    switch pinch.state {
    case .began, .changed:
        update(scale: newScaleFactor)
    case .ended:
        zoomFactor = minMaxZoom(newScaleFactor)
        update(scale: zoomFactor)
    default:
        break
    }
}
It'll be useful to declare zoomFactor on your camera class or view controller; I usually put it on the same singleton that owns the AVCaptureSession. It persists the captureDevice's videoZoomFactor between pinches. Note that it must be a CGFloat so it can be multiplied by pinch.scale:
var zoomFactor: CGFloat = 1.0

-
Hi, a variable you defined here is unclear to me. As a beginner. Can you tell me what the "pinch" variable is in your code? Thanks! – Karthik Kannan Mar 21 '18 at 12:54
-
Just looking at this again, I think that pinch should be sender. Or the sender parameter can be pinch. – jnblanchard Mar 21 '18 at 15:19
Many have tried to do this by setting the transform property on the preview layer to CGAffineTransformMakeScale(gesture.scale, gesture.scale);
See here for a full fledged implementation of pinch-to-zoom.

-
Thanks for the response. But how does merely changing the scale of the video preview view actually change the zoom of the camera hardware? – The Kraken Apr 19 '12 at 03:12
-
It doesn't. That's why even apple's camera doesn't truly 'zoom'. It's just some CGAffines, and some fancy cropping. – CodaFi Apr 19 '12 at 03:17
-
Right, the camera just uses a "digital zoom". But what else must I do outside of scaling the video preview view to actually make the image being written to disk "zoomed"? – The Kraken Apr 19 '12 at 03:42
-
See [here](http://stackoverflow.com/a/8166411/945847) for that. Just know that no iOS device has hardware zoom, so this requires a bit more math than the answer I linked to. – CodaFi Apr 19 '12 at 03:49
-
Thanks a lot. I noticed the answer to the linked question said the app crashed when the zoom was maximized due to memory. Is this something I'll need to worry about under ARC? – The Kraken Apr 19 '12 at 03:54
-
Since iOS 7 there is actually a method to zoom in the image like @bcattle said. The problem with his solution is that it will jump in between the pinch. See my answer for a smoother and proper way of doing this. – Gabriel Cartier Jul 03 '15 at 19:09
Since iOS 7 you can set the zoom directly with the videoZoomFactor property of AVCaptureDevice.
Tie the scale property of the UIPinchGestureRecognizer to the videoZoomFactor with a scaling constant. This will let you vary the sensitivity to taste:
- (void)handlePinchToZoomRecognizer:(UIPinchGestureRecognizer *)pinchRecognizer {
    const CGFloat pinchZoomScaleFactor = 2.0;

    if (pinchRecognizer.state == UIGestureRecognizerStateChanged) {
        NSError *error = nil;
        if ([videoDevice lockForConfiguration:&error]) {
            videoDevice.videoZoomFactor = 1.0 + pinchRecognizer.scale * pinchZoomScaleFactor;
            [videoDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}
Note that AVCaptureDevice, along with everything else related to AVCaptureSession, is not thread safe, so you probably don't want to do this from the main queue.
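One common way to follow that advice is to funnel every device configuration change through a single serial queue. The sketch below shows the pattern; `ZoomableDevice` is an illustrative stand-in protocol so the pattern is visible without a camera, while in a real app you would call the same three members on the AVCaptureDevice itself:

```swift
import Foundation
import Dispatch

// Stand-in for the slice of AVCaptureDevice we touch when zooming.
// AVCaptureDevice has exactly these members; this protocol exists
// only so the pattern can be shown (and exercised) off-device.
protocol ZoomableDevice: AnyObject {
    var videoZoomFactor: CGFloat { get set }
    func lockForConfiguration() throws
    func unlockForConfiguration()
}

final class ZoomController {
    // One serial queue for all capture configuration, so zoom changes
    // never run on the main queue or race with other session work.
    private let sessionQueue = DispatchQueue(label: "camera.sessionQueue")
    private let device: ZoomableDevice

    init(device: ZoomableDevice) {
        self.device = device
    }

    // Safe to call from the gesture handler on the main thread; the
    // actual device mutation happens on the serial session queue.
    func setZoom(_ factor: CGFloat) {
        let device = self.device
        sessionQueue.async {
            do {
                try device.lockForConfiguration()
                defer { device.unlockForConfiguration() }
                device.videoZoomFactor = factor
            } catch {
                print("lockForConfiguration failed: \(error)")
            }
        }
    }
}
```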

In the Swift version, you can zoom in/out by simply assigning a scaled value to videoZoomFactor. The following code in a UIPinchGestureRecognizer handler will solve the issue.
do {
    try device.lockForConfiguration()
    switch gesture.state {
    case .began:
        self.pivotPinchScale = device.videoZoomFactor
    case .changed:
        var factor = self.pivotPinchScale * gesture.scale
        factor = max(1.0, min(factor, device.activeFormat.videoMaxZoomFactor))
        device.videoZoomFactor = factor
    default:
        break
    }
    device.unlockForConfiguration()
} catch {
    // handle the lockForConfiguration error
}
Here, pivotPinchScale is a CGFloat property declared somewhere in your controller.
You may also refer to the following project to see how the camera works with a UIPinchGestureRecognizer: https://github.com/DragonCherry/CameraPreviewController

I started from @Gabriel Cartier's solution (thanks). In my code I've preferred to use the smoother rampToVideoZoomFactor: and a simpler way to compute the device's scale factor.
- (IBAction)pinchForZoom:(id)sender forEvent:(UIEvent *)event {
    UIPinchGestureRecognizer *pinchRecognizer = (UIPinchGestureRecognizer *)sender;
    static CGFloat zoomFactorBegin = 0.0;

    if (UIGestureRecognizerStateBegan == pinchRecognizer.state) {
        zoomFactorBegin = self.captureDevice.videoZoomFactor;
    } else if (UIGestureRecognizerStateChanged == pinchRecognizer.state) {
        NSError *error = nil;
        if ([self.captureDevice lockForConfiguration:&error]) {
            CGFloat desiredZoomFactor = zoomFactorBegin * pinchRecognizer.scale;
            CGFloat zoomFactor = MAX(1.0, MIN(desiredZoomFactor, self.captureDevice.activeFormat.videoMaxZoomFactor));
            [self.captureDevice rampToVideoZoomFactor:zoomFactor withRate:3.0];
            [self.captureDevice unlockForConfiguration];
        } else {
            NSLog(@"error: %@", error);
        }
    }
}

There is an easier way to handle the camera zoom level with a pinch recognizer. The only thing you need to do is take cameraDevice.videoZoomFactor and assign it to the recognizer's scale in the .began state, like this:
@objc private func viewPinched(recognizer: UIPinchGestureRecognizer) {
    switch recognizer.state {
    case .began:
        // Seed the recognizer's scale with the current zoom so the
        // next pinch continues from where the previous one ended.
        recognizer.scale = cameraDevice.videoZoomFactor
    case .changed:
        let scale = recognizer.scale
        do {
            try cameraDevice.lockForConfiguration()
            cameraDevice.videoZoomFactor = max(cameraDevice.minAvailableVideoZoomFactor, min(scale, cameraDevice.maxAvailableVideoZoomFactor))
            cameraDevice.unlockForConfiguration()
        } catch {
            print(error)
        }
    default:
        break
    }
}

Based on @Gabriel Cartier's answer:
- (void)cameraZoomWithPinchVelocity:(CGFloat)velocity {
    CGFloat pinchVelocityDividerFactor = 40.0f;
    if (velocity < 0) {
        pinchVelocityDividerFactor = 5.0; // zoom in
    }

    if (_videoInput) {
        if ([[_videoInput device] position] == AVCaptureDevicePositionBack) {
            NSError *error = nil;
            if ([[_videoInput device] lockForConfiguration:&error]) {
                CGFloat desiredZoomFactor = [_videoInput device].videoZoomFactor + atan2f(velocity, pinchVelocityDividerFactor);
                // Clamp desiredZoomFactor to [1.0, min(10, activeFormat.videoMaxZoomFactor)]
                CGFloat maxFactor = MIN(10, [_videoInput device].activeFormat.videoMaxZoomFactor);
                [_videoInput device].videoZoomFactor = MAX(1.0, MIN(desiredZoomFactor, maxFactor));
                [[_videoInput device] unlockForConfiguration];
            } else {
                NSLog(@"cameraZoomWithPinchVelocity error: %@", error);
            }
        }
    }
}

I am using iOS SDK 8.3 and the AVFoundation framework, and the following method worked for me:
nameOfAVCaptureVideoPreviewLayer.affineTransform = CGAffineTransformMakeScale(scaleX, scaleY)
To save the picture with the same scale, I used the following:
nameOfAVCaptureConnection.videoScaleAndCropFactor = factorNumber;
The code below captures the image at that scale:
[stillImageOutput captureStillImageAsynchronouslyFromConnection:videoConnection completionHandler:^(CMSampleBufferRef imageDataSampleBuffer, NSError *error) {
    if (imageDataSampleBuffer != NULL) {
        NSData *imageData = [AVCaptureStillImageOutput jpegStillImageNSDataRepresentation:imageDataSampleBuffer];
        UIImage *image = [UIImage imageWithData:imageData];
    }
}];
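As the comments on the transform-based answer point out, digital zoom ultimately means cropping: the saved frame corresponds to the centred sub-rectangle whose sides are 1/zoom of the original. A small helper for computing that rectangle (the function name and signature are mine, purely for illustration):

```swift
import Foundation

// Centred crop rect implied by a digital zoom factor: the visible
// region shrinks to 1/zoom of the full size in each dimension and
// stays centred in the frame.
func cropRect(for imageSize: CGSize, zoom: CGFloat) -> CGRect {
    precondition(zoom >= 1.0, "zoom factors below 1.0 are not meaningful here")
    let width = imageSize.width / zoom
    let height = imageSize.height / zoom
    return CGRect(x: (imageSize.width - width) / 2.0,
                  y: (imageSize.height - height) / 2.0,
                  width: width,
                  height: height)
}
```

With a rect like this you could crop the captured image (e.g. via CGImage cropping) to match what the scaled preview layer showed.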