
I am using an AVAssetWriter to record movie frames.

I was wondering how I would go about zooming/scaling a frame. The scale factor can be set once at the beginning of the movie; it does not have to change on a per-frame basis.

I assume I should be able to do that with the CMSampleBuffer that I append to my AVAssetWriterInput.

Pseudocode:

    CMSampleBufferRef scaledSampleBuffer = [self scale:sampleBuffer];
    [_videoWriterInput appendSampleBuffer:scaledSampleBuffer];

I have found some Stack Overflow questions on this.

This one shows how to create a vImage_Buffer, but I have no idea how to get that into my _videoWriterInput:

Stackoverflow
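
For completeness: the one route I know of for appending raw pixel buffers without rebuilding a CMSampleBuffer is an AVAssetWriterInputPixelBufferAdaptor. A sketch, assuming my existing `_videoWriterInput` (the adaptor has to be created before `startWriting` is called):

    // Setup (once, before [assetWriter startWriting]):
    NSDictionary *attrs = @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    AVAssetWriterInputPixelBufferAdaptor *adaptor =
        [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:_videoWriterInput
                                                                         sourcePixelBufferAttributes:attrs];

    // Per frame: grab a buffer from the adaptor's pool, fill it, append it.
    CVPixelBufferRef pxBuffer = NULL;
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &pxBuffer);
    // ... write the scaled pixels into pxBuffer (e.g. via vImageBuffer_CopyToCVPixelBuffer) ...
    [adaptor appendPixelBuffer:pxBuffer
          withPresentationTime:CMSampleBufferGetPresentationTimeStamp(sampleBuffer)];
    CVPixelBufferRelease(pxBuffer);
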

I tried to convert it back with something like this:

    NSInteger cropX0 = 100,
        cropY0 = 100,
        cropHeight = 100,
        cropWidth = 100,
        outWidth = 480,
        outHeight = 480;

    CVImageBufferRef imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CVPixelBufferLockBaseAddress(imageBuffer,0);
    void *baseAddress = CVPixelBufferGetBaseAddress(imageBuffer);
    size_t bytesPerRow = CVPixelBufferGetBytesPerRow(imageBuffer);

    vImage_Buffer inBuff;
    inBuff.height = cropHeight;
    inBuff.width = cropWidth;
    inBuff.rowBytes = bytesPerRow;

    NSInteger startpos = cropY0 * bytesPerRow + 4 * cropX0;
    inBuff.data = (unsigned char *)baseAddress + startpos;

    unsigned char *outImg= (unsigned char*)malloc(4*outWidth*outHeight);
    vImage_Buffer outBuff = {outImg, outHeight, outWidth, 4*outWidth};

    vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, kvImageNoFlags);
    if (err != kvImageNoError) NSLog(@"scale error %ld", err);


    CVPixelBufferUnlockBaseAddress(imageBuffer,0);

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorSystemDefault, outWidth, outHeight, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);

    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );

    vImage_CGImageFormat format = {
        .bitsPerComponent = 8,
        .bitsPerPixel = 32,
        .bitmapInfo = kCGBitmapByteOrder32Little | kCGImageAlphaNoneSkipFirst,  //BGRX8888
        .colorSpace = NULL,    //sRGB
    };

    err = vImageBuffer_CopyToCVPixelBuffer(&outBuff,
                                           &format,
                                           pixelBuffer,
                                           NULL,
                                           NULL,
                                           kvImageNoFlags);
    if (err != kvImageNoError) NSLog(@"copy error %ld", err);
    free(outImg);  // outBuff's backing store is no longer needed

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);

    CMSampleTimingInfo sampleTime = {
        .duration = CMSampleBufferGetDuration(sampleBuffer),
        .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
        .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
    };

    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);

    CMSampleBufferRef oBuf = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf);
    CFRelease(videoInfo);               // retained by oBuf
    CVPixelBufferRelease(pixelBuffer);  // retained by oBuf

But that doesn't work either; the resulting frames are corrupted.
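
One suspect is the row stride: CVPixelBufferCreate may pad each row, so the buffer's bytesPerRow can be larger than 4*outWidth, while outBuff assumes a tightly packed stride. A variant that avoids the intermediate malloc entirely is to point the destination vImage_Buffer straight at the locked pixel buffer (a sketch, reusing inBuff and the constants from above):

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorSystemDefault, outWidth, outHeight,
                        kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);

    // Scale directly into the pixel buffer, using its real (possibly padded) stride.
    vImage_Buffer outBuff = {
        .data     = CVPixelBufferGetBaseAddress(pixelBuffer),
        .height   = (vImagePixelCount)outHeight,
        .width    = (vImagePixelCount)outWidth,
        .rowBytes = CVPixelBufferGetBytesPerRow(pixelBuffer)
    };
    vImage_Error err = vImageScale_ARGB8888(&inBuff, &outBuff, NULL, kvImageNoFlags);
    if (err != kvImageNoError) NSLog(@"scale error %ld", err);

    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
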

Copied together from other Stack Overflow articles, I came up with this code, but it results in a broken video:

    CGAffineTransform transform = CGAffineTransformMakeScale(2, 2); // zoom 2x

    CIImage *ciImage = [CIImage imageWithCVPixelBuffer:CMSampleBufferGetImageBuffer(sampleBuffer)
                                               options:[NSDictionary dictionaryWithObjectsAndKeys:[NSNull null], kCIImageColorSpace, nil]];
    ciImage = [[ciImage imageByApplyingTransform:transform] imageByCroppingToRect:CGRectMake(0, 0, 480, 480)];

    CVPixelBufferRef pixelBuffer = NULL;
    CVPixelBufferCreate(kCFAllocatorSystemDefault, 480, 480, kCVPixelFormatType_32BGRA, NULL, &pixelBuffer);

    CVPixelBufferLockBaseAddress( pixelBuffer, 0 );

    CIContext * ciContext = [CIContext contextWithOptions: nil];
    [ciContext render:ciImage toCVPixelBuffer:pixelBuffer];
    CVPixelBufferUnlockBaseAddress( pixelBuffer, 0 );

    CMSampleTimingInfo sampleTime = {
        .duration = CMSampleBufferGetDuration(sampleBuffer),
        .presentationTimeStamp = CMSampleBufferGetPresentationTimeStamp(sampleBuffer),
        .decodeTimeStamp = CMSampleBufferGetDecodeTimeStamp(sampleBuffer)
    };

    CMVideoFormatDescriptionRef videoInfo = NULL;
    CMVideoFormatDescriptionCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, &videoInfo);

    CMSampleBufferRef oBuf = NULL;
    CMSampleBufferCreateForImageBuffer(kCFAllocatorDefault, pixelBuffer, true, NULL, NULL, videoInfo, &sampleTime, &oBuf);
    CFRelease(videoInfo);               // retained by oBuf
    CVPixelBufferRelease(pixelBuffer);  // retained by oBuf
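
In case the Core Image route is fixable: creating a CIContext per frame is expensive, and the bare render:toCVPixelBuffer: leaves bounds and color management implicit. A sketch that reuses one context and renders with explicit bounds and color space (assuming a `_ciContext` ivar initialized once during setup):

    // In setup, once:
    // _ciContext = [CIContext contextWithOptions:nil];

    CGColorSpaceRef cs = CGColorSpaceCreateDeviceRGB();
    [_ciContext render:ciImage
       toCVPixelBuffer:pixelBuffer
                bounds:CGRectMake(0, 0, 480, 480)
            colorSpace:cs];
    CGColorSpaceRelease(cs);
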

Any ideas?

Nils Ziehn