
I am looking to invert the color of the screen in my app while using the camera. However, I do not know if (1) it is possible and (2) how to do it if it is.

Thanks!

user1553136
  • My thoughts are you could overlay the live preview view with a view which would be inverted, and then post-process the image to match. I did post an answer, but I have removed it as it was not really well thought out and therefore not helpful. But I wanted to leave the idea here. – markhunte Feb 01 '14 at 17:25
  • I found this link which may be useful: http://weblog.invasivecode.com/post/23153661857/a-quasi-real-time-video-processing-on-ios-in. Quote from the page: "we need to build a custom camera preview. If we want to process a video buffer and show the result in real-time, we cannot use the AVCaptureVideoPreviewLayer as shown in this post, because that camera preview renders the signal directly and does not offer any way to process it before the rendering. To make this possible, you need to take the video buffer, process it and then render it on a custom CALayer. Let's see how to do that." – markhunte Feb 01 '14 at 17:54
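
For reference, the buffer-processing approach that link describes might look roughly like the sketch below. This is not the blog's exact code, just a minimal illustration: frames from an AVCaptureVideoDataOutput are inverted with Core Image's CIColorInvert filter and drawn into a GLKView. The `session`, `glkView`, and `ciContext` properties are assumed to exist on the view controller.

```objc
#import <AVFoundation/AVFoundation.h>
#import <CoreImage/CoreImage.h>
#import <GLKit/GLKit.h>

// Assumes the view controller declares:
//   @property (strong) AVCaptureSession *session;
//   @property (strong) GLKView *glkView;     // added to the view hierarchy
//   @property (strong) CIContext *ciContext; // [CIContext contextWithEAGLContext:...]
// and conforms to AVCaptureVideoDataOutputSampleBufferDelegate.

- (void)startInvertedPreview
{
    self.session = [[AVCaptureSession alloc] init];
    self.session.sessionPreset = AVCaptureSessionPresetMedium;

    AVCaptureDevice *camera =
        [AVCaptureDevice defaultDeviceWithMediaType:AVMediaTypeVideo];
    AVCaptureDeviceInput *input =
        [AVCaptureDeviceInput deviceInputWithDevice:camera error:nil];
    [self.session addInput:input];

    // Deliver raw frames to the delegate instead of using AVCaptureVideoPreviewLayer.
    AVCaptureVideoDataOutput *output = [[AVCaptureVideoDataOutput alloc] init];
    output.videoSettings =
        @{ (id)kCVPixelBufferPixelFormatTypeKey : @(kCVPixelFormatType_32BGRA) };
    [output setSampleBufferDelegate:self queue:dispatch_get_main_queue()];
    [self.session addOutput:output];

    [self.session startRunning];
}

// Called for every frame: invert it, then render it ourselves.
- (void)captureOutput:(AVCaptureOutput *)captureOutput
didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer
       fromConnection:(AVCaptureConnection *)connection
{
    CVPixelBufferRef pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *frame = [CIImage imageWithCVPixelBuffer:pixelBuffer];

    CIFilter *invert = [CIFilter filterWithName:@"CIColorInvert"];
    [invert setValue:frame forKey:kCIInputImageKey];
    CIImage *inverted = invert.outputImage;

    [self.glkView bindDrawable];
    [self.ciContext drawImage:inverted
                       inRect:CGRectMake(0, 0, self.glkView.drawableWidth,
                                         self.glkView.drawableHeight)
                     fromRect:inverted.extent];
    [self.glkView display];
}
```

In production code you would run the delegate on a serial background queue and handle device rotation, but this shows the basic take-the-buffer, process it, render-it-yourself flow the quoted post is describing.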

2 Answers


You can use Core Image to do this, or you can use Brad Larson's great framework GPUImage to do a lot of video effects easily. The framework includes a lot of examples; take a look.

Eyal
  • I would go for GPUImage in this case as it seems to be significantly faster and it's easier to use with an AVCaptureSession. – Garoal Feb 01 '14 at 17:34
  • In particular, you want the GPUImageColorInvertFilter here. If you feed a GPUImageVideoCamera into that, then send the result to a GPUImageView, you'll get the right result here on live video. See the FilterShowcase example to observe this in action. – Brad Larson Feb 01 '14 at 19:34
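
Following Brad Larson's comment, a minimal sketch of that camera → invert filter → view pipeline might look like this (the property and variable names here are placeholders, not from the question):

```objc
#import <GPUImage/GPUImage.h>

// Live camera feed, inverted in real time.
// Keep videoCamera in a strong property on your view controller;
// if it is deallocated, capture stops.
GPUImageVideoCamera *videoCamera =
    [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480
                                        cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

GPUImageColorInvertFilter *invertFilter = [[GPUImageColorInvertFilter alloc] init];
[videoCamera addTarget:invertFilter];

// filterView is a GPUImageView placed in the view hierarchy
// (e.g. set as the view controller's view in Interface Builder).
GPUImageView *filterView = (GPUImageView *)self.view;
[invertFilter addTarget:filterView];

[videoCamera startCameraCapture];
```

GPUImage does its processing in OpenGL ES shaders, which is why it tends to handle live video comfortably on device.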

Well, you can try to use Core Image, which has a filter called CIColorInvert. I'm not sure that it will work with the camera, but it is worth a try. Here are some links that you might find helpful:

Besides that, have a look at this answer.
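
For a photo that has already been taken, applying the filter is straightforward. A minimal sketch, assuming a UIImage called `photo` (a hypothetical name):

```objc
#import <CoreImage/CoreImage.h>
#import <UIKit/UIKit.h>

// Invert the colors of an existing UIImage with CIColorInvert.
CIImage *inputImage = [CIImage imageWithCGImage:photo.CGImage];

CIFilter *invert = [CIFilter filterWithName:@"CIColorInvert"];
[invert setValue:inputImage forKey:kCIInputImageKey];
CIImage *outputImage = invert.outputImage;

// Render the filtered CIImage back into a UIImage.
CIContext *context = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [context createCGImage:outputImage
                                   fromRect:outputImage.extent];
UIImage *inverted = [UIImage imageWithCGImage:cgImage];
CGImageRelease(cgImage);
```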

Hope this helps!

etolstoy
  • Thanks! It would have been great to apply this once a photo has been taken. However, for the needs of the app, I need it to be done live while using the camera, and I do not see how to apply Core Image in that case, but thanks! – user1553136 Feb 01 '14 at 16:20
  • I've added another link to my answer; it seems to be quite useful. – etolstoy Feb 01 '14 at 16:22
  • Some filters can be attached to a video feed. It looks like CIColorInvert is one of them. (If you look in the docs for that filter it says it is a member of CICategoryVideo, or video filters) – Duncan C Feb 01 '14 at 16:48
  • Sadly `NSArray *filters = [[NSArray alloc] initWithObjects:[CIFilter filterWithName:@"CIColorInvert"], nil]; [captureLayer setFilters:filters];` has no effect. Does that mean I should use the GPUImage framework, or is there something I'm forgetting? – user1553136 Feb 01 '14 at 17:45