
I'm trying to record the user's actions on screen from a GLKView. The video file is created and has the correct length, but it shows only a black screen.

I've subclassed GLKView, added a pan gesture recogniser to it, and whenever the user does something I draw points on my view (it's more complicated than that, but you get the idea).

Here is how I initialise my video:

    NSError *error = nil;

    NSURL *url =  [NSURL fileURLWithPath:@"/Users/Dimillian/Documents/DEV/movie.mp4"];
    [[NSFileManager defaultManager]removeItemAtURL:url error:nil];
    self.assetWriter = [[AVAssetWriter alloc] initWithURL:url fileType:AVFileTypeAppleM4V error:&error];
    if (error != nil)
    {
        NSLog(@"Error: %@", error);
    }


    NSMutableDictionary * outputSettings = [[NSMutableDictionary alloc] init];
    [outputSettings setObject: AVVideoCodecH264 forKey: AVVideoCodecKey];
    [outputSettings setObject: [NSNumber numberWithInt: 954] forKey: AVVideoWidthKey];
    [outputSettings setObject: [NSNumber numberWithInt: 608] forKey: AVVideoHeightKey];


    self.assetWriterVideoInput = [AVAssetWriterInput assetWriterInputWithMediaType:AVMediaTypeVideo outputSettings:outputSettings];
    self.assetWriterVideoInput.expectsMediaDataInRealTime = YES;

    // You need to use BGRA for the video in order to get realtime encoding. I use a color-swizzling shader to line up glReadPixels' normal RGBA output with the movie input's BGRA.
    NSDictionary *sourcePixelBufferAttributesDictionary = [NSDictionary dictionaryWithObjectsAndKeys: [NSNumber numberWithInt:kCVPixelFormatType_32BGRA], kCVPixelBufferPixelFormatTypeKey,
                                                           [NSNumber numberWithInt:954], kCVPixelBufferWidthKey,
                                                           [NSNumber numberWithInt:608], kCVPixelBufferHeightKey,
                                                           nil];

    self.assetWriterPixelBufferInput = [AVAssetWriterInputPixelBufferAdaptor assetWriterInputPixelBufferAdaptorWithAssetWriterInput:
                             self.assetWriterVideoInput sourcePixelBufferAttributes:sourcePixelBufferAttributesDictionary];

    [self.assetWriter addInput:self.assetWriterVideoInput];

    self.startTime = [NSDate date];
    self.lastTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);

    [self.assetWriter startWriting];
    [self.assetWriter startSessionAtSourceTime:kCMTimeZero];
}

Now here is a short version of my recogniser:

- (void)pan:(UIPanGestureRecognizer *)p {
    // Prepare vertices to add to the screen according to user input
    [self setNeedsDisplay];
}

Now here is my drawRect: method:

- (void)drawRect:(CGRect)rect
{
    glClearColor(1, 1, 1, 1.0f);
    glClear(GL_COLOR_BUFFER_BIT);

    [effect prepareToDraw];

    //removed code about vertex drawing

    [self capturePixels];
}
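One way to narrow down whether the black frames come from the GL side or the writer side is to read back a single pixel right after drawing and log it. This is a hypothetical debug helper, not part of the original code:

```objc
// Hypothetical debug helper: call right after drawing, before the view
// presents its renderbuffer. A non-black value here means the framebuffer
// content is fine and the problem lies in the writer path; a black value
// means glReadPixels is running against an empty framebuffer.
- (void)logFirstPixel
{
    GLubyte px[4] = {0, 0, 0, 0};
    glReadPixels(0, 0, 1, 1, GL_RGBA, GL_UNSIGNED_BYTE, px);
    NSLog(@"First pixel RGBA: %u %u %u %u", px[0], px[1], px[2], px[3]);
}
```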

And finally, my capturePixels method:

- (void)capturePixels
{
    glFinish();

    CVPixelBufferRef pixel_buffer = NULL;

    CVReturn status = CVPixelBufferPoolCreatePixelBuffer (NULL, self.assetWriterPixelBufferInput.pixelBufferPool, &pixel_buffer);
    if ((pixel_buffer == NULL) || (status != kCVReturnSuccess))
    {
        NSLog(@"%d", status);
        NSLog(@"VIDEO FAILED");
        return;
    }
    else
    {
        CVPixelBufferLockBaseAddress(pixel_buffer, 0);
        glReadPixels(0, 0, 954, 608, GL_RGBA, GL_UNSIGNED_BYTE, CVPixelBufferGetBaseAddress(pixel_buffer));
    }

    // May need to add a check here, because if two consecutive times with the same value are added to the movie, it aborts recording
    CMTime currentTime = CMTimeMakeWithSeconds([[NSDate date] timeIntervalSinceDate:self.startTime],120);

    if(![self.assetWriterPixelBufferInput appendPixelBuffer:pixel_buffer withPresentationTime:currentTime])
    {
        NSLog(@"Problem appending pixel buffer at time: %lld", currentTime.value);
    }
    else
    {
        NSLog(@"%@", pixel_buffer);
        NSLog(@"Recorded pixel buffer at time: %lld", currentTime.value);
        self.lastTime = currentTime;
    }
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);

    CVPixelBufferRelease(pixel_buffer);
}
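The comment about consecutive identical times could be turned into an explicit guard. A sketch, assuming it is acceptable to drop the duplicate frame; it would go right after computing `currentTime`:

```objc
// Sketch of the guard mentioned in the comment above: appending two samples
// with the same presentation time aborts the recording, so skip the frame.
if (CMTimeCompare(currentTime, self.lastTime) == 0)
{
    CVPixelBufferUnlockBaseAddress(pixel_buffer, 0);
    CVPixelBufferRelease(pixel_buffer);
    return; // drop this frame rather than abort the whole recording
}
```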

I have another method to close the video input:

- (void)tearDownGL
{
    NSLog(@"Tear down");

    [self.assetWriterVideoInput markAsFinished];
    [self.assetWriter endSessionAtSourceTime:self.lastTime];
    [self.assetWriter finishWritingWithCompletionHandler:^{
        NSLog(@"finish video");
    }];

    [EAGLContext setCurrentContext:context];

    glDeleteBuffers(1, &vertexBuffer);
    glDeleteVertexArraysOES(1, &vertexArray);

    effect = nil;

    glFinish();

    if ([EAGLContext currentContext] == context) {
        [EAGLContext setCurrentContext:nil];
    }
    context = nil;
}

This seems to work, as I get no errors, and at the end the video has the correct length, but it's only black... I'm nowhere near an expert in OpenGL; it's only a tiny part of my iOS application. I want to learn it and I'm doing my best, and thanks to the posts from @BradLarson (OpenGL ES 2.0 to Video on iPad/iPhone) I've been able to make progress, but I'm really stuck now.

  • Do you capture the pixels before you present the render buffer? The order should be: Draw, read pixels, present. If this is not the case try pinpointing the issue a bit: Before the capturePixels is called clear the color buffer to red (or some other non black color), then read just the first pixel onto some temporary buffer and log it to see its colour (GLubyte buffer[4]). If it has a correct colour your issue is in appending pixel buffer to video. If it is black you have an issue with your GL code. – Matic Oblak Mar 26 '14 at 07:33
  • Also CVPixelBufferUnlockBaseAddress(pixel_buffer, 0) should be called right after you read the pixels. – Matic Oblak Mar 26 '14 at 07:46
  • @MaticOblak As you can see, I'm capturing the frame in the drawRect: method, which is called each time a pan gesture is recognized. The view is drawn in drawRect:, but I'm not sure when it's presented. – Dimillian Mar 26 '14 at 09:51
  • That's great, let's assume that is not the issue... So as I already said: try pinpointing the issue as described above. But first try to move that unlocking function call to where it belongs and see if the issue persists. – Matic Oblak Mar 26 '14 at 13:53
  • @MaticOblak I tried moving the unlocking function; still the same. I'll try to dig into the issue. I tried something else too: if I use [self snapshot] instead of glReadPixels to feed the buffer, it works... It's slow as hell, but it's working. Making a video from images is not the way I want to do it, though. – Dimillian Mar 26 '14 at 13:55

0 Answers