
I am trying to encode video with AVFoundation for a video colour app (you can check it out here). The export codec will be H.264, ProRes, or HEVC depending on the user's choice, if that is of any use in diagnosing my problem.

I have written my code based on this answer: Make movie file with picture Array and song file, using AVAsset

And this code from github: https://github.com/caferrara/img-to-video/blob/master/img-to-video/ViewController.m#L29

Here's what I've got (there are some variables and function calls from elsewhere in the app, but I think it's quite clear what's going on):

NSURL * outURL = [NSURL fileURLWithPath:[NSString stringWithFormat: @"%s", path]];

NSLog(@"%@", outURL.absoluteString);

NSError * error = nil;
AVAssetWriter * videoWriter = [[AVAssetWriter alloc] initWithURL: outURL
                              fileType:AVFileTypeQuickTimeMovie
                                                          error:&error];

NSDictionary * videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                               encoder->codec, AVVideoCodecKey,
                               [NSNumber numberWithInt:encoder->width], AVVideoWidthKey,
                               [NSNumber numberWithInt:encoder->height], AVVideoHeightKey,
                               nil ];

AVAssetWriterInput * videoWriterInput = [AVAssetWriterInput
                                        assetWriterInputWithMediaType:AVMediaTypeVideo
                                        outputSettings:videoSettings];


AVAssetWriterInputPixelBufferAdaptor * adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                 sourcePixelBufferAttributes:nil];

videoWriterInput.expectsMediaDataInRealTime = YES;
[videoWriter addInput:videoWriterInput];

//Start a session:
[videoWriter startWriting];
NSLog(@"Write Started");
[videoWriter startSessionAtSourceTime:kCMTimeZero];


//Video encoding

CVPixelBufferRef buffer = NULL;

setMlvAlwaysUseAmaze(App->videoMLV);

for(uint64_t f = 0; f < getMlvFrames(App->videoMLV); ++f)
{
    getMlvProcessedFrame16(App->videoMLV, f, encoder->data);
    CVReturn success = CVPixelBufferCreateWithBytes( kCFAllocatorDefault,
                                                     encoder->width,
                                                     encoder->height,
                                                     kCVPixelFormatType_48RGB,
                                                     encoder->data,
                                                     sizeof(uint16_t) * encoder->width * 3,
                                                     NULL,
                                                     NULL,
                                                     NULL,
                                                     &buffer );
    if (success != kCVReturnSuccess || buffer == NULL) NSLog(@"Failed to create pixel buffer.");
    NSDictionary * colour_attachment = @{(id)kCVImageBufferICCProfileKey : (id)encoder->colour_profile_data};
    CVBufferSetAttachments(buffer, (CFDictionaryRef)colour_attachment, kCVAttachmentMode_ShouldPropagate);

    BOOL append_ok = NO;
    int j = 0;
    while (!append_ok && j < 30) {
        if (adaptor.assetWriterInput.readyForMoreMediaData)  {
            //print out status:
            NSLog(@"Processing video frame (%d, attempt %d)", (int)f, j);

            CMTime frameTime = CMTimeMake(f * 5000.0, (int32_t)(encoder->fps * 1000.0));
            append_ok = [adaptor appendPixelBuffer:buffer withPresentationTime:frameTime];
            if(!append_ok){
                NSError *error = videoWriter.error;
                if(error!=nil) {
                    NSLog(@"Unresolved error %@,%@.", error, [error userInfo]);
                }
            }
            while(!adaptor.assetWriterInput.readyForMoreMediaData) [NSThread sleepForTimeInterval:0.0001];
        }
        else {
            printf("adaptor not ready %d, %d\n", (int)f, j);
            while(!adaptor.assetWriterInput.readyForMoreMediaData) [NSThread sleepForTimeInterval:0.0001];
        }
        j++;
    }
    if (!append_ok) {
        printf("error appending image %d times %d\n, with error.", (int)f, j);
    }
}

[videoWriterInput markAsFinished];
[videoWriter finishWriting];

[videoWriterInput release];
[videoWriter release];

NSLog(@"Write Ended");

However, whenever I run it, it prints the following error:

2017-11-15 20:54:40.532 MLV App[21801:3488295] Unresolved error Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12905), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x7fd446adeb70 {Error Domain=NSOSStatusErrorDomain Code=-12905 "(null)"}},{
NSLocalizedDescription = "The operation could not be completed";
NSLocalizedFailureReason = "An unknown error occurred (-12905)";
NSUnderlyingError = "Error Domain=NSOSStatusErrorDomain Code=-12905 \"(null)\""; }.

I have searched far and wide on the internet for the meaning of this error code, with no luck.

Is there something majorly wrong with my usage of AVFoundation here? I have also tried switching to 8-bit RGB frames, but that did not fix the problem.

  • Guessing: maybe `AVFoundation` likes RGBA when doing RGB, or maybe the pixel buffer attributes need to specify `kCVPixelBufferPixelFormatTypeKey`. A runnable snippet would help a lot here. – Rhythmic Fistman Nov 16 '17 at 05:18
  • https://osstatus.com/search/results?platform=all&framework=all&search=-12905 – seth Dec 10 '17 at 18:19

1 Answer


Not every pixel format that is defined in the headers is actually supported by the frameworks, and the one you've chosen, kCVPixelFormatType_48RGB, is not. Stick to the common formats like kCVPixelFormatType_32ARGB, kCVPixelFormatType_422YpCbCr8, etc.
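
For example, the same CVPixelBufferCreateWithBytes call with a supported format would look roughly like this (argbData is a placeholder, not something in your code; your 16-bit frame would first need converting to 8-bit interleaved ARGB):

// Sketch only: same call as in the question, but with kCVPixelFormatType_32ARGB.
// `argbData` is hypothetical: an 8-bit interleaved ARGB copy of the frame,
// encoder->width * encoder->height * 4 bytes in size.
CVPixelBufferRef buffer = NULL;
CVReturn success = CVPixelBufferCreateWithBytes( kCFAllocatorDefault,
                                                 encoder->width,
                                                 encoder->height,
                                                 kCVPixelFormatType_32ARGB,
                                                 argbData,
                                                 encoder->width * 4,   // bytes per row
                                                 NULL,
                                                 NULL,
                                                 NULL,
                                                 &buffer );
if (success != kCVReturnSuccess || buffer == NULL) NSLog(@"Failed to create pixel buffer.");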

The underlying -12905 is kVTPixelTransferNotSupportedErr (via https://www.osstatus.com/).

Tech Q&A regarding pixel formats supported by Core Video: https://developer.apple.com/library/content/qa/qa1501/_index.html

Note that you may want to run the code from that Q&A yourself to get the current list of supported formats on a given OS version. Also, not every format is supported by every path in every framework: VideoToolbox, which performs the pixel format conversion during encoding, may not support a format even though you can create a CVPixelBuffer in it.
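
A rough sketch of how you could list the pixel formats Core Video knows about on the running system (this shows what a CVPixelBuffer can represent, not what every encode path will accept):

#import <CoreVideo/CoreVideo.h>

CFArrayRef formats = CVPixelFormatDescriptionArrayCreateWithAllPixelFormatTypes(kCFAllocatorDefault);
for (CFIndex i = 0; i < CFArrayGetCount(formats); ++i) {
    int32_t code = 0;
    CFNumberGetValue((CFNumberRef)CFArrayGetValueAtIndex(formats, i), kCFNumberSInt32Type, &code);
    if (code > 0x28) {
        // Four-character codes, e.g. 'v210', '2vuy'
        NSLog(@"%c%c%c%c (0x%08x)", (code >> 24) & 0xFF, (code >> 16) & 0xFF,
              (code >> 8) & 0xFF, code & 0xFF, code);
    } else {
        // Small numeric codes, e.g. kCVPixelFormatType_32ARGB == 32
        NSLog(@"0x%08x", code);
    }
}
CFRelease(formats);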

In your case, try kCVPixelFormatType_4444AYpCbCr16, kCVPixelFormatType_422YpCbCr16, kCVPixelFormatType_422YpCbCr10, or kCVPixelFormatType_64ARGB, as they're at least supported on the decode side: https://developer.apple.com/documentation/avfoundation/avassetreadertrackoutput
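
If you want to keep 16 bits per channel, one option (a sketch, untested) is to pass sourcePixelBufferAttributes requesting kCVPixelFormatType_64ARGB when creating the adaptor, draw buffers from its pixelBufferPool, and copy your 48-bit RGB frame in while adding an alpha channel. Note that 64ARGB samples are 16-bit big-endian, so they need byte swapping on little-endian Macs:

NSDictionary * bufferAttributes = @{
    (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_64ARGB),
    (id)kCVPixelBufferWidthKey           : @(encoder->width),
    (id)kCVPixelBufferHeightKey          : @(encoder->height)
};
AVAssetWriterInputPixelBufferAdaptor * adaptor = [AVAssetWriterInputPixelBufferAdaptor
                                                 assetWriterInputPixelBufferAdaptorWithAssetWriterInput:videoWriterInput
                                                 sourcePixelBufferAttributes:bufferAttributes];

// Per frame: take a buffer from the adaptor's pool (only available after -startWriting)
// and fill it from the 48-bit RGB source.
CVPixelBufferRef buffer = NULL;
CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, adaptor.pixelBufferPool, &buffer);
CVPixelBufferLockBaseAddress(buffer, 0);
uint16_t * dst = CVPixelBufferGetBaseAddress(buffer);
size_t dstStride = CVPixelBufferGetBytesPerRow(buffer) / sizeof(uint16_t);
const uint16_t * src = encoder->data;   // 48-bit RGB, tightly packed
for (int y = 0; y < encoder->height; ++y) {
    for (int x = 0; x < encoder->width; ++x) {
        uint16_t * d = dst + y * dstStride + x * 4;
        const uint16_t * s = src + (y * (size_t)encoder->width + x) * 3;
        d[0] = 0xFFFF;                       // A (opaque; same value in either byte order)
        d[1] = CFSwapInt16HostToBig(s[0]);   // R
        d[2] = CFSwapInt16HostToBig(s[1]);   // G
        d[3] = CFSwapInt16HostToBig(s[2]);   // B
    }
}
CVPixelBufferUnlockBaseAddress(buffer, 0);
// ...then append `buffer` through the adaptor as before and CVPixelBufferRelease() it.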

seth