
I am trying to create an AVAssetWriter to screen-capture an OpenGL project. I have never written an AVAssetWriter or an AVAssetWriterInputPixelBufferAdaptor before, so I am not sure if I did anything correctly.

- (id) initWithOutputFileURL:(NSURL *)anOutputFileURL {
    if ((self = [super init])) {
        NSError *error = nil;
        movieWriter = [[AVAssetWriter alloc] initWithURL:anOutputFileURL fileType:AVFileTypeMPEG4 error:&error];
        NSDictionary *videoSettings = [NSDictionary dictionaryWithObjectsAndKeys:
                                       AVVideoCodecH264, AVVideoCodecKey,
                                       [NSNumber numberWithInt:640], AVVideoWidthKey,
                                       [NSNumber numberWithInt:480], AVVideoHeightKey,
                                       nil];
        writerInput = [[AVAssetWriterInput
                        assetWriterInputWithMediaType:AVMediaTypeVideo
                        outputSettings:videoSettings] retain];
        writer = [[AVAssetWriterInputPixelBufferAdaptor alloc]
                  initWithAssetWriterInput:writerInput
                  sourcePixelBufferAttributes:
                      [NSDictionary dictionaryWithObjectsAndKeys:
                          [NSNumber numberWithInt:kCVPixelFormatType_32BGRA],
                          kCVPixelBufferPixelFormatTypeKey, nil]];

        [movieWriter addInput:writerInput];
        writerInput.expectsMediaDataInRealTime = YES;
    }

    return self;
}
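One thing I already suspect: I never check error after creating the writer, and AVFileTypeMPEG4 doesn't match the .mov extension I pass in. A sketch of what I think the creation should look like (switching to AVFileTypeQuickTimeMovie to match the extension is my guess):

    NSError *error = nil;
    movieWriter = [[AVAssetWriter alloc] initWithURL:anOutputFileURL
                                            fileType:AVFileTypeQuickTimeMovie
                                               error:&error];
    if (!movieWriter) {
        // Sketch: if creation fails, error says why.
        NSLog(@"Could not create AVAssetWriter: %@", error);
    }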

Other parts of the class:

- (void)getFrame:(CVPixelBufferRef)sampleBuffer :(int64_t)frame {
    frameNumber = frame;
    [writer appendPixelBuffer:sampleBuffer withPresentationTime:CMTimeMake(frame, 24)];
}
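Since appendPixelBuffer:withPresentationTime: returns a BOOL and the input is not always ready for more data, I assume the append should really be guarded, something like:

    // Sketch: drop the frame if the input isn't ready, and surface failures.
    if (writerInput.readyForMoreMediaData) {
        if (![writer appendPixelBuffer:sampleBuffer
                  withPresentationTime:CMTimeMake(frame, 24)]) {
            NSLog(@"Append failed: %@", movieWriter.error);
        }
    }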

- (void)startRecording {
    [movieWriter startWriting];
    [movieWriter startSessionAtSourceTime:kCMTimeZero];
}
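I've also realized startWriting returns a BOOL; if the output URL or file type is bad, this is apparently where it tells you:

    if (![movieWriter startWriting]) {
        // Sketch: movieWriter.error explains why writing could not start.
        NSLog(@"startWriting failed: %@", movieWriter.error);
    } else {
        [movieWriter startSessionAtSourceTime:kCMTimeZero];
    }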

- (void)stopRecording {
    [writerInput markAsFinished];
    [movieWriter endSessionAtSourceTime:CMTimeMake(frameNumber, 24)];
    [movieWriter finishWriting];
}
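Likewise for finishWriting: when it returns NO, the writer's status and error should say why no file appeared:

    if (![movieWriter finishWriting]) {
        // Sketch: status is AVAssetWriterStatusFailed when this happens.
        NSLog(@"finishWriting failed (status %d): %@",
              (int)movieWriter.status, movieWriter.error);
    }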

The asset writer is initialized by:

    NSURL *outputFileURL = [NSURL fileURLWithPath:[NSString stringWithFormat:@"%@%@", NSTemporaryDirectory(), @"output.mov"]];
    recorder = [[GLRecorder alloc] initWithOutputFileURL:outputFileURL];
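
I've also read that AVAssetWriter won't overwrite an existing file, so I probably need to clear out any leftover from a previous run first — something like:

    // Sketch: delete any previous output before creating the recorder.
    NSString *path = [NSTemporaryDirectory() stringByAppendingPathComponent:@"output.mov"];
    [[NSFileManager defaultManager] removeItemAtPath:path error:NULL];
    NSURL *outputFileURL = [NSURL fileURLWithPath:path];
    recorder = [[GLRecorder alloc] initWithOutputFileURL:outputFileURL];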

The view is recorded this way:

    glReadPixels(0, 0, 480, 320, GL_RGBA, GL_UNSIGNED_BYTE, buffer);
    // Flip the rows: glReadPixels returns the image bottom-up,
    // but the movie frame needs to be top-down.
    for (int y = 0; y < 320; y++) {
        for (int x = 0; x < 480 * 4; x++) {
            int b2 = ((320 - 1 - y) * 480 * 4 + x);
            int b1 = (y * 4 * 480 + x);
            buffer2[b2] = buffer[b1];
        }
    }
    pixelBuffer = NULL;
    CVPixelBufferCreateWithBytes(NULL, 480, 320, kCVPixelFormatType_32BGRA,
                                 buffer2, 1920, NULL, 0, NULL, &pixelBuffer);
    [recorder getFrame:pixelBuffer :framenumber];
    framenumber++;

Note:

pixelBuffer is a CVPixelBufferRef.
framenumber is an int64_t.
buffer and buffer2 are GLubyte arrays.
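
While writing this up I also noticed that CVPixelBufferCreateWithBytes returns a +1 buffer that I never release, so I assume the per-frame code should really end like this (the pixel buffer only wraps buffer2, it does not copy it):

    pixelBuffer = NULL;
    if (CVPixelBufferCreateWithBytes(NULL, 480, 320, kCVPixelFormatType_32BGRA,
                                     buffer2, 1920, NULL, 0, NULL,
                                     &pixelBuffer) == kCVReturnSuccess) {
        [recorder getFrame:pixelBuffer :framenumber];
        framenumber++;
        // The adaptor retains the buffer if it still needs it; balance the create.
        CVPixelBufferRelease(pixelBuffer);
    }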

I get no errors, but when I finish recording there is no file. Any help or links would be greatly appreciated. The OpenGL view shows a live feed from the camera. I've been able to save the screen as a UIImage but want to get a movie of what I created.

SnowJack

1 Answer


If you're writing RGBA frames, I think you may need to use an AVAssetWriterInputPixelBufferAdaptor to write them out. This class is supposed to manage a pool of pixel buffers, but I get the impression that it actually massages your data into YUV.

If that works, then I think you'll find that your colours are all swapped, at which point you'll probably have to write a pixel shader to convert them to BGRA. Or (shudder) do it on the CPU. Up to you.
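
Roughly what I mean by letting the adaptor manage the buffers — a sketch against your variable names (note that pixelBufferPool is nil until after startWriting has been called):

    CVPixelBufferRef pb = NULL;
    CVPixelBufferPoolCreatePixelBuffer(NULL, writer.pixelBufferPool, &pb);
    CVPixelBufferLockBaseAddress(pb, 0);
    uint8_t *dest = (uint8_t *)CVPixelBufferGetBaseAddress(pb);
    size_t stride = CVPixelBufferGetBytesPerRow(pb); // may be padded beyond 480 * 4
    for (int y = 0; y < 320; y++) {
        memcpy(dest + y * stride, buffer2 + y * 480 * 4, 480 * 4);
    }
    CVPixelBufferUnlockBaseAddress(pb, 0);
    [writer appendPixelBuffer:pb withPresentationTime:CMTimeMake(framenumber, 24)];
    CVPixelBufferRelease(pb);

That way the buffers arrive pre-allocated in whatever format the writer wants, instead of you wrapping your own bytes.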

Rhythmic Fistman
  • I think I am passing RGBA data but I could be wrong. I am using an AVAssetWriterInputPixelBufferAdaptor. I think the problem is that I am not sending it a pool of pixel buffers. I wouldn't know how to go about doing that. – SnowJack Nov 11 '11 at 15:54
  • If you're using AVAssetWriterInputPixelBufferAdaptor then you should update your question. – Rhythmic Fistman Nov 11 '11 at 16:05