
I am using AVFoundation and getting the sample buffer from AVCaptureVideoDataOutput. I can write it directly to the video writer by using:

- (void)writeBufferFrame:(CMSampleBufferRef)sampleBuffer {
    CMTime lastSampleTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer);
    if (self.videoWriter.status != AVAssetWriterStatusWriting) {
        [self.videoWriter startWriting];
        [self.videoWriter startSessionAtSourceTime:lastSampleTime];
    }

    [self.videoWriterInput appendSampleBuffer:sampleBuffer];
}

What I want to do now is crop and scale the image inside the CMSampleBufferRef without converting it to a UIImage or CGImageRef, because that conversion slows down performance.

vodkhang

5 Answers


If you use vImage you can work directly on the buffer data without converting it to any image format.

outImg contains the cropped and scaled image data. The ratio between outWidth and cropWidth sets the scaling. (Image: vImage cropping diagram, from the old vImage Programming Guide.)

int cropX0, cropY0, cropHeight, cropWidth, outWidth, outHeight;

CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
CVPixelBufferLockBaseAddress(imageBuffer, 0);
unsigned char *baseAddress = (unsigned char *)CVPixelBufferGetBaseAddress(imageBuffer);
size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

vImage_Buffer inBuff;
inBuff.height = cropHeight;
inBuff.width = cropWidth;
inBuff.rowBytes = bytesPerRow; // full source row length, NOT 4 * cropWidth

// Byte offset of the crop rectangle's top-left corner (4 bytes per pixel)
size_t startpos = cropY0 * bytesPerRow + 4 * cropX0;
inBuff.data = baseAddress + startpos;

unsigned char *outImg = (unsigned char *)malloc(4 * outWidth * outHeight);
vImage_Buffer outBuff = {outImg, (vImagePixelCount)outHeight, (vImagePixelCount)outWidth, (size_t)(4 * outWidth)};

vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, kvImageNoFlags);
if (err != kvImageNoError) NSLog(@"vImage error %ld", err);

CVPixelBufferUnlockBaseAddress(imageBuffer, 0);
// ... use outImg, then free(outImg) when done

So setting cropX0 = 0 and cropY0 = 0 and cropWidth and cropHeight to the original size means no cropping (using the whole original image). Setting outWidth = cropWidth and outHeight = cropHeight results in no scaling. Note that inBuff.rowBytes should always be the length of the full source buffer, not the cropped length.

Sten (edited by Vukašin Manojlović)
  • Hi Sten, I searched the guide for a cropping example but couldn't find one. Can you give an example of how to crop the buffer directly? – Eyal Jun 17 '13 at 11:12
  • vImage has no cropping functionality – AlexeyVMP Mar 11 '14 at 16:58
  • It is correct that vimage doesn't have a dedicated cropping function. But by setting the size of the buffer you can apply a crop together with essentially any function, e.g. a scale. I have updated my answer with an example – Sten Mar 17 '14 at 19:23
  • Thank you so much man for this!! i spent the last day trying to figure out why i am getting a distorted cropped image. It turned out that the inBuff.rowBytes value i was setting was incorrect. I was recalculating it for the crop area rather than using the original one for the buffer – Abolfoooud Aug 08 '14 at 12:39
  • I know this question/answer is old, but how do I get the outBuff now to a CMSampleBuffer? – Nils Ziehn Mar 03 '15 at 23:57
  • Nils, the data of the cropped image is in outImg, so you can use that to create a pixel buffer, e.g. with CVPixelBufferCreateWithBytes. Then you can use that to create the CMSampleBuffer. – Sten Mar 04 '15 at 08:21
  • Hi @Sten , sorry to hijack this question but, I got a couple of questions related to this. First, if i am using the 'captureStillImageAsynchronouslyFromConnection...' call to get a picture which i want to crop and rotate would it be more efficient to use vImage or CoreImage? (The end result should be an UIImage) But the process has to be very efficient because the system is killing my app for using too much memory too fast. Also, can i rotate and crop in a single operation using vImage? Thank you. – Pochi Sep 22 '15 at 00:31
  • @Chiquis, to rotate and crop use the code above but use vImageRotate_ARGB8888 instead of vImageScale_ARGB8888. (Sorry for slow reply) – Sten Oct 25 '15 at 08:53
  • Hi, the "vImage Programming Guide" is not available in PDF format any more, and there is no "p10-12". Searching through the online version for "crop" didn't uncover anything. Has this been removed, or did I misunderstand the RTFM suggestion? Thanks – Jean-Denis Muys Mar 14 '16 at 17:42
  • I have added an image instead of the link to the old PDF guide. – Sten Mar 16 '16 at 07:55
  • @Sten the vImage operation is scrambling my image. Even when I set all the parameters so that it's simply a pass through, all inputs sizes = output sizes. can't seem to make sense of this. – Gukki5 Oct 29 '17 at 20:58
  • @Gukki5 It is difficult to say without seeing your code. Are you sure you are using the right row length and encoding when you convert outImg to an image? Also, the code above assumes the buffer is RGB, not YUV. – Sten Oct 31 '17 at 15:39
  • @Sten basically i was using it to crop 4:3 aspect ratio to 16:9. So constant height, just shrinking the width. I was calculating the correct bytes per row for the resulting vImage_Buffer... but when I tried to use that for the input vImage_Buffer as well.. it was getting scrambled. When I switched to using the actual bytes per row of the input, it outputted fine. Haha now my issue is trying to properly get it into a CVPixelBuffer. – Gukki5 Oct 31 '17 at 15:48
  • @Sten i'd thought the input bytes per row needed to correspond to the reduced image i was inputting, not the full image... since I was explicitly discarding some of it. – Gukki5 Oct 31 '17 at 15:57
  • @Gukki5 The buffer is one long list of data so it needs the length of full image to know where "row breaks" are. To get a CVPixelBuffer you can try CVPixelBufferCreateWithBytes – Sten Oct 31 '17 at 17:27
  • This code does not work/compile in iOS, the line "inBuff.data = baseAddress+startpos;" gives the compile error "arithmetic on pointer to void" and on "vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4*outWidth};" the error "non-constant expression cannot be narrowed" – James Mar 06 '18 at 14:49
  • I am getting distorted output video, can any one help me.. I am using equivalent swift code. var srcBuffer = vImage_Buffer(data: baseAddress! + startpos, height: UInt(destSize.height), width: UInt(destSize.width), rowBytes: Int(bytesPerRow)) let outImg = UnsafeMutablePointer.allocate(capacity: 4 * Int(destSize.width) * Int(destSize.height)) var destBuffer: vImage_Buffer = vImage_Buffer(data: outImg, height: vImagePixelCount(destSize.height), width: vImagePixelCount(destSize.width), rowBytes: 4 * Int(destSize.width)) – jpulikkottil Jun 25 '19 at 13:34
  • I'm trying to record a portion of the screen and I'm using the same code. However, it doesn't seem to work. Here's my question on SO https://stackoverflow.com/questions/60904676/need-help-in-screen-recording-a-part-of-the-screen-in-ios. I've uploaded full source to git hub. Can someone help? – Felix Marianayagam Mar 28 '20 at 19:55
  • I used this logic to scale the image to smaller size without cropping. I feel that the scaled image has poor quality. Did anyone else face this issue? – prabhu Jul 31 '20 at 15:55
  • The last parameter in the vImageScale call is a flag where you can set "highQualityResampling" (check out the documentation) – Sten Aug 01 '20 at 16:15
  • @Sten, I am trying to implement the same in Swift, I am a bit stuck at converting the vImageBuffer to CVPixelbuffer. Do you know how can we achieve that? – Satyen Udeshi Sep 02 '20 at 13:53

Note: I didn't notice that the original question also asked for scaling. But anyway, for those who simply need to crop a CMSampleBuffer, here's the solution.

The buffer is simply an array of pixels, so you can actually process the buffer directly without using vImage. Code is written in Swift, but I think it's easy to find the Objective-C equivalent.

First, make sure your CMSampleBuffer is in BGRA format. If it is not, the preset you are using probably delivers YUV, which changes the layout and breaks the bytes-per-row value used below.

dataOutput = AVCaptureVideoDataOutput()
dataOutput.videoSettings = [
    String(kCVPixelBufferPixelFormatTypeKey): 
    NSNumber(value: kCVPixelFormatType_32BGRA)
]

Then, when you get the sample buffer:

let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer)!

CVPixelBufferLockBaseAddress(imageBuffer, .readOnly)

let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
let bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer)
let cropWidth = 640
let cropHeight = 640
let colorSpace = CGColorSpaceCreateDeviceRGB()

// bytesPerRow stays the FULL source row length; the context just reads a
// cropWidth x cropHeight window starting at baseAddress.
let context = CGContext(data: baseAddress, width: cropWidth, height: cropHeight,
                        bitsPerComponent: 8, bytesPerRow: bytesPerRow, space: colorSpace,
                        bitmapInfo: CGImageAlphaInfo.noneSkipFirst.rawValue | CGBitmapInfo.byteOrder32Little.rawValue)

// Now the cropped image is inside the context. You can convert it back to a
// CVPixelBuffer using CVPixelBufferCreateWithBytes if you want.

// Create the image while the pixel buffer is still locked, since the
// context reads directly from the buffer's memory:
let cgImage: CGImage = context!.makeImage()!
let image = UIImage(cgImage: cgImage)

CVPixelBufferUnlockBaseAddress(imageBuffer, .readOnly)

If you want to crop from some specific position, add the following code:

// calculate start position
let bytesPerPixel = 4
let startPoint = [ "x": 10, "y": 10 ]
let startAddress = baseAddress + startPoint["y"]! * bytesPerRow + startPoint["x"]! * bytesPerPixel

and pass startAddress instead of baseAddress to CGContext(). Make sure the crop rectangle does not exceed the original image's width and height.

yuji (edited by Jeroen)
  • "...without converting it into UIImage or CGImageRef because that slows down the performance." – vkalit Nov 21 '15 at 00:44
  • I did "crop and scale the image inside the CMSampleBufferRef without converting it into UIImage or CGImageRef". I just save it as a CGImageRef for further use (e.g. showing it on screen). You can do whatever you want with the cropped context. – yuji Nov 23 '15 at 11:48
  • Hi 黃昱嘉 what is the best way to contact you? Would like to ask a quick question. Thanks! – Crashalot Feb 07 '16 at 23:57
  • Works great! I am using this code to convert a 4:3 sample buffer to 16:9 with swift 2.2. Huge thanks! – Ivan Lesko Feb 16 '16 at 00:11
  • Update for Swift 3? It looks like the `CGBitmapContextCreate` function no longer exists... – HighFlyingFantasy Jan 03 '17 at 16:32
  • @HighFlyingFantasy Sorry for the late response. I've edited my answer. – yuji Jan 12 '17 at 03:30
  • And how do you convert the CIImage back to a CMSampleBuffer quickly? – user924 Mar 02 '18 at 11:50
  • Hi, @yuji thank you for your answer! Could you please show how to recreate CVPixelBuffer? there is a lot of params I don't understand when trying to do it. And I would be super grateful if you could advice anything to read about all this magic – Leonid Silver Apr 06 '21 at 15:59

You might consider using Core Image (iOS 5.0+).

CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)
                                           options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]];
ciImage = [[ciImage imageByApplyingTransform:myScaleTransform] imageByCroppingToRect:myRect];
Cliff

Try this in Swift 3. Note that despite the name, this center-crops the buffer to destSize; it does not scale:

func resize(_ destSize: CGSize) -> CVPixelBuffer? {
    guard let imageBuffer = CMSampleBufferGetImageBuffer(self) else { return nil }
    // Lock the image buffer (and unlock on every exit path)
    CVPixelBufferLockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0))
    defer { CVPixelBufferUnlockBaseAddress(imageBuffer, CVPixelBufferLockFlags(rawValue: 0)) }
    // Get information about the image
    let baseAddress = CVPixelBufferGetBaseAddress(imageBuffer)
    let bytesPerRow = CGFloat(CVPixelBufferGetBytesPerRow(imageBuffer))
    let height = CGFloat(CVPixelBufferGetHeight(imageBuffer))
    let width = CGFloat(CVPixelBufferGetWidth(imageBuffer))
    var pixelBuffer: CVPixelBuffer?
    let options = [kCVPixelBufferCGImageCompatibilityKey: true,
                   kCVPixelBufferCGBitmapContextCompatibilityKey: true]
    // Center the crop: margins are half the size difference; the left margin
    // is converted from pixels to bytes (4 bytes per BGRA pixel)
    let topMargin = (height - destSize.height) / 2
    let leftMargin = (width - destSize.width) / 2
    let baseAddressStart = Int(bytesPerRow * topMargin + leftMargin * 4)
    let addressPoint = baseAddress!.assumingMemoryBound(to: UInt8.self)
    // Note: this wraps the existing memory; no pixel data is copied
    let status = CVPixelBufferCreateWithBytes(kCFAllocatorDefault, Int(destSize.width), Int(destSize.height), kCVPixelFormatType_32BGRA, &addressPoint[baseAddressStart], Int(bytesPerRow), nil, nil, options as CFDictionary, &pixelBuffer)
    if status != kCVReturnSuccess {
        print(status)
        return nil
    }
    return pixelBuffer
}
wu qiuhao (edited by Bluewings)

For scaling, you can have AVFoundation do it for you. See my recent post here. Setting a value for the AVVideoWidth/AVVideoHeight keys will scale images that are not at those dimensions. Take a look at the properties here. As for cropping, I am not sure AVFoundation can do that for you; you may have to resort to OpenGL or Core Image. There are a couple of good links in the top post of this SO question.

Steve McFarlin
  • I can make it automatically scale for me, but it keeps complaining that I'm running out of memory, as you can see in my newest post here: http://stackoverflow.com/questions/8561456/ios-automatically-resize-cvpixelbufferref. It seems the reason is that I keep changing the size. – vodkhang Dec 25 '11 at 16:25