
I'm working on a video conferencing app, and the following code draws a frame to the screen successfully:

    - (int)drawFrameOnMainThread {
        if (mBitmapContext) {
            if (mDisplay) {
                CGImageRef imageRef = CGBitmapContextCreateImage(mBitmapContext);
    #if TARGET_OS_IPHONE
                UIImage *image = [UIImage imageWithCGImage:imageRef];

                [self performSelectorOnMainThread:@selector(setImage:) withObject:image waitUntilDone:YES];
    #elif TARGET_OS_MAC
                [mDisplay setCurrentImage:imageRef];
    #endif
                CGImageRelease(imageRef);
            }
        }
        return 0;
    }

I want to apply a CIFilter to the frame being drawn, so I modify the iOS section of the code like so:

    UIImage *image = [UIImage imageWithCGImage:imageRef];

    CIImage *beginImage = image.CIImage;

    CIContext *context = [CIContext contextWithOptions:nil];

    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];

    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];

    [self performSelectorOnMainThread:@selector(setImage:) withObject:newImg waitUntilDone:YES];

The result is that my video screen stays black. Can anybody see the error here? I've been at this for a few hours now and can't figure it out.

Dimitar08
    Without addressing the black screen, I'll just say that doing any kind of video using UIImage sources and Core Image is going to be incredibly slow on an actual device, unless your video is the size of a postage stamp. Going to and from a UIImage has a lot of Core Graphics overhead, and then manually redrawing that image in a view is going to add even more slowdown. Also, as I benchmarked here: http://stackoverflow.com/a/6628208/19679 Core Image isn't as fast as it should be on iOS. – Brad Larson Apr 25 '12 at 01:25
  • If you're pulling these frames of video from the camera, you'll really want to use OpenGL ES or something like my GPUImage framework to get performant video filtering and display. It might be a little harder if these are video frames pulled from a network source, but you'll still want to upload and display these frames using OpenGL ES (or OpenGL for your Mac client). – Brad Larson Apr 25 '12 at 01:28
  • I am actually pulling the video from a network source, and without any filtering, the performance is great. (I'm using [link](http://code.google.com/p/idoubs/) ) What would you suggest for filtering? Maybe do it on the server side? – Dimitar08 Apr 25 '12 at 01:43
  • Careful in using that code, it's GPL and that will mean that you'll have to release your entire application source code under the GPL as well. In fact, GPLv3 may not be compatible with an application on the App Store (at least one other has been pulled for this). – Brad Larson Apr 25 '12 at 16:29
  • What device are you testing this on? It might have okay performance on a 4S or iPad 2+, but I bet it's burning a lot of CPU cycles by constantly redrawing the image within your view. If you can get the raw BGRA bytes from your image, you'll be much better off uploading those as a texture for display using OpenGL ES. I do something like this for image sources in GPUImage, but I don't yet have a raw data input class. – Brad Larson Apr 25 '12 at 16:32
  • I'll look into openGL ES, I SHOULD be able to get to the raw data of the image I think. This is for iPad2+. I'd love to get these filters working, even with bad performance, as more of a proof of concept than anything else. In the end we will probably end up doing the video processing on the server side. And yes we've looked at the GPL issues and everything should be ok on that front. – Dimitar08 Apr 25 '12 at 22:06

1 Answer

I've fixed the problem. The issue was how the CIImage was initialized: the CIImage property of a UIImage is only non-nil when the UIImage was created from a CIImage, so for a UIImage built with imageWithCGImage: it returns nil and the filter was receiving a nil input image.

    // Wrong
    CIImage *beginImage = image.CIImage;

    // Right
    CIImage *beginImage = [CIImage imageWithCGImage:imageRef];

As Brad said, though, the performance is not acceptable: the video lags behind the audio by about 5 seconds on the iPad 2. So I'll look into other solutions, but I was still happy to see it working, more as a proof of concept than anything else :)
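For reference, here is roughly what the full corrected iOS branch looks like with the fix applied (a sketch based on the code above, not tested beyond the proof of concept; note that createCGImage:fromRect: returns a retained CGImageRef that the caller must release, which the original snippet leaked):

    CGImageRef imageRef = CGBitmapContextCreateImage(mBitmapContext);

    // Build the CIImage directly from the CGImage; image.CIImage would be nil here
    CIImage *beginImage = [CIImage imageWithCGImage:imageRef];

    CIContext *context = [CIContext contextWithOptions:nil];
    CIFilter *filter = [CIFilter filterWithName:@"CISepiaTone"
                                  keysAndValues:kCIInputImageKey, beginImage,
                                                @"inputIntensity", [NSNumber numberWithFloat:0.8], nil];
    CIImage *outputImage = [filter outputImage];

    CGImageRef cgimg = [context createCGImage:outputImage fromRect:[outputImage extent]];
    UIImage *newImg = [UIImage imageWithCGImage:cgimg];

    [self performSelectorOnMainThread:@selector(setImage:) withObject:newImg waitUntilDone:YES];

    // Both CGImageRefs are owned by this code and must be released
    CGImageRelease(cgimg);
    CGImageRelease(imageRef);

Creating the CIContext once and reusing it across frames (rather than per frame, as here) would also help performance a little, though not enough to fix the lag Brad describes.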

Dimitar08