
I'm trying to add a black-and-white filter to the camera image of an ARSCNView and then render colored AR objects over it.

[Screenshot: colored AR objects rendered over a black-and-white camera image]

I'm almost there with the following code, added at the beginning of `- (void)renderer:(id<SCNSceneRenderer>)aRenderer updateAtTime:(NSTimeInterval)time`:

CVPixelBufferRef bg = self.sceneView.session.currentFrame.capturedImage;

if (bg) {
    CVPixelBufferLockBaseAddress(bg, 0);
    // Plane 1 of ARKit's bi-planar YCbCr buffer holds the interleaved CbCr (chroma) samples.
    uint8_t *chroma = CVPixelBufferGetBaseAddressOfPlane(bg, 1);
    if (chroma) {
        size_t width  = CVPixelBufferGetWidthOfPlane(bg, 1);
        size_t height = CVPixelBufferGetHeightOfPlane(bg, 1);
        // 128 is the neutral chroma value; clearing both Cb and Cr (2 bytes per pixel)
        // leaves only luma, i.e. a grayscale image. (Assumes no row padding; otherwise
        // use CVPixelBufferGetBytesPerRowOfPlane.)
        memset(chroma, 128, width * height * 2);
    }
    CVPixelBufferUnlockBaseAddress(bg, 0);
}

This runs really fast on the device, but here's the problem: sometimes a colored frame is displayed. I've verified that my filtering code executes, but I assume it runs too late and SceneKit's pipeline has already processed the camera input.

Calling the code earlier would help, but `updateAtTime` is the earliest point at which one can add custom frame-by-frame code.

Getting notified of frame captures might help, but it looks like the underlying AVCaptureSession is inaccessible.

The Metal ARKit example shows how to convert the camera image to RGB, and that is where I would do the filtering, but that shader is hidden when using SceneKit.
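For reference, what that hidden shader computes is a per-pixel YCbCr-to-RGB matrix multiply. A scalar sketch of one common variant (full-range BT.601; illustrative only, not the actual Metal code):

// Illustrative scalar version of the YCbCr -> RGB conversion the Metal
// example's fragment shader performs per pixel (full-range BT.601).
static inline void YCbCrToRGB(uint8_t y, uint8_t cb, uint8_t cr,
                              float *r, float *g, float *b) {
    float Y  = y;
    float Cb = cb - 128.0f;
    float Cr = cr - 128.0f;
    *r = (Y + 1.402f * Cr) / 255.0f;
    *g = (Y - 0.344f * Cb - 0.714f * Cr) / 255.0f;
    *b = (Y + 1.772f * Cb) / 255.0f;
}

Note that with Cb = Cr = 128 all three channels collapse to Y, which is exactly why the memset above produces a grayscale image.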

I've tried this possible answer, but it's way too slow.

So how can I avoid the missed frames and reliably convert the camera feed to black and white?

diviaki
  • Just an idea I have not tested myself: could you use a [shaderModifier](https://developer.apple.com/documentation/scenekit/scnshadable#1654834) to convert the background image to black and white by adding a fragment shader? – jlsiewert Aug 29 '17 at 21:29
  • @orangenkopf definitely a way to go if the CVPixelBuffer method cannot be fixed. What would you attach the shaderModifier to? – diviaki Aug 30 '17 at 07:02
  • According to the [`ARSCNView`](https://developer.apple.com/documentation/arkit/arscnview) docs, the frame gets set as the scene's [`background`](https://developer.apple.com/documentation/scenekit/scnscene/1523665-background), which is a material property where you can attach a shader modifier. According to the comments [here](https://stackoverflow.com/questions/45321275/how-to-place-3d-model-using-arkit-ios-11-with-some-custom-background-view-withou/45324663#45324663) it might not be that easy though – jlsiewert Aug 30 '17 at 09:46
  • @orangenkopf wish that was doable; in fact, material properties like `background` do not have shader modifiers, only materials and geometries do – diviaki Aug 30 '17 at 11:13
  • I implemented this in Swift and it works well. How can I darken the image? – mehulmpt Mar 21 '19 at 14:01
  • @mehulmpt my assumption is that the brightness data is on plane 0: just copy the code, switch the plane index from 1 to 0, and start experimenting with lowering the values in that plane. A simple memset will not do; you'll need to do calculations when altering the data, so the Accelerate framework might be handy. – diviaki Mar 22 '19 at 10:35
  • @diviaki I've been studying the YCbCr model since yesterday, and yeah, I experimented with plane 0 as well. I was able to get individual pixel access using `var pixels = pixelBufferAddressOfPlane.assumingMemoryBound(to: UInt8.self)` on plane 0, so `pixels` then points to that data. However, I'm clueless about how to manipulate this data correctly. If I set it to a single number (say 0 or 255), it seems like I'm getting only chroma data back, with a funny image. However, if I try to reduce the values by a factor of 2 (by dividing each pixel value by 2), it either crashes or is very slow – mehulmpt Mar 22 '19 at 14:05
  • .. and the divide-by-2 effect doesn't work either (I see the regular live feed, it's just very jittery). How do I go about that? Would really appreciate any help – mehulmpt Mar 22 '19 at 14:06
  • @mehulmpt Grayscaling is easy, as it simply clears the color channels. If you want to darken the grayscale image, that should be simple as well by dividing the Y values with Accelerate. However, darkening the color image requires modifying all channels with non-trivial calculations. Accelerate could help, but I would consider the alternative that's easier to understand: using a shader to manipulate the RGB data. (A sketch of the Accelerate approach follows these comments.) – diviaki Mar 25 '19 at 10:23
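To illustrate the Accelerate suggestion from the comments above, here is a minimal, untested sketch that darkens the luma (Y) plane in place using a vImage lookup table. The helper name `DarkenLumaPlane` and the `factor` parameter are hypothetical, not from the original discussion:

#import <Accelerate/Accelerate.h>

// Hypothetical helper: darken the luma (Y) plane of a captured ARKit frame in place.
// factor should be between 0.0 (black) and 1.0 (unchanged).
static void DarkenLumaPlane(CVPixelBufferRef pixelBuffer, float factor) {
    CVPixelBufferLockBaseAddress(pixelBuffer, 0);
    vImage_Buffer luma = {
        .data     = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, 0),
        .height   = CVPixelBufferGetHeightOfPlane(pixelBuffer, 0),
        .width    = CVPixelBufferGetWidthOfPlane(pixelBuffer, 0),
        .rowBytes = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, 0)
    };
    // Precompute a 256-entry table mapping each possible Y value to its darkened
    // value, then let vImage apply it to the whole plane in one vectorized pass.
    Pixel_8 table[256];
    for (int i = 0; i < 256; i++) {
        table[i] = (Pixel_8)(i * factor);
    }
    vImageTableLookUp_Planar8(&luma, &luma, table, kvImageNoFlags);
    CVPixelBufferUnlockBaseAddress(pixelBuffer, 0);
}

Combined with the chroma memset from the question, calling this on frame.capturedImage with a factor like 0.5 should yield a darker grayscale feed.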

1 Answer


Here's the key to this problem:

session:didUpdateFrame:

Provides a newly captured camera image and accompanying AR information to the delegate.

So I just moved the CVPixelBufferRef manipulation (the image filtering code) from

- (void)renderer:(id<SCNSceneRenderer>)aRenderer updateAtTime:(NSTimeInterval)time

to

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame

I made sure to set `self.sceneView.session.delegate = self` so that this delegate method is called.
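For clarity, a minimal sketch of the relocated code (the same chroma-clearing approach from the question, with the buffer locked while it is modified):

- (void)session:(ARSession *)session didUpdateFrame:(ARFrame *)frame {
    CVPixelBufferRef bg = frame.capturedImage;
    if (bg) {
        CVPixelBufferLockBaseAddress(bg, 0);
        // Neutralize the interleaved CbCr plane so the frame is already
        // grayscale by the time SceneKit uses it as the scene background.
        uint8_t *chroma = CVPixelBufferGetBaseAddressOfPlane(bg, 1);
        if (chroma) {
            size_t width  = CVPixelBufferGetWidthOfPlane(bg, 1);
            size_t height = CVPixelBufferGetHeightOfPlane(bg, 1);
            memset(chroma, 128, width * height * 2);
        }
        CVPixelBufferUnlockBaseAddress(bg, 0);
    }
}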

diviaki