
My goal is to present 2D animated characters in the real environment using ARKit. The animated characters are part of a video, as presented in the following snapshot from the video:

Snapshot from the video

Displaying the video itself was achieved with no problem at all using the code:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)
    let item = AVPlayerItem(asset: asset)
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The result of this code is shown in the screenshot from the app below, as expected:

App screenshot #1

But as you can see, the background of the characters isn't very nice, so I need to make it vanish in order to create the illusion of the characters actually standing on the horizontal plane surface. I'm trying to achieve this by applying a chroma-key effect to the video.

  • For those who are not familiar with chroma-key, this is the name of the "green screen" effect sometimes seen on TV, which makes a single color transparent.

My approach to the chroma-key effect is to create a custom filter based on the "CIColorCube" CIFilter, and then apply the filter to the video using an AVVideoComposition.

First, here is the code for creating the filter:

// Convert RGB to HSV using UIColor. Note: getHue returns h, s, and v
// normalized to the range 0...1 (so a hue of 120° comes back as 0.333).
func RGBtoHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    var h: CGFloat = 0
    var s: CGFloat = 0
    var v: CGFloat = 0
    let col = UIColor(red: CGFloat(r), green: CGFloat(g), blue: CGFloat(b), alpha: 1.0)
    col.getHue(&h, saturation: &s, brightness: &v, alpha: nil)
    return (Float(h), Float(s), Float(v))
}
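As a side note, `UIColor.getHue` ties this helper to UIKit. For unit-testing the cube generation off-device, an equivalent UIKit-free conversion (my own sketch, not part of the original code; the function name is mine) could look like:

```swift
import Foundation

// Standard RGB-to-HSV conversion without UIColor. All inputs and outputs
// are in the range 0...1, matching what UIColor.getHue reports on iOS.
func rgbToHSV(r: Float, g: Float, b: Float) -> (h: Float, s: Float, v: Float) {
    let maxC = max(r, g, b)
    let minC = min(r, g, b)
    let delta = maxC - minC
    let v = maxC
    let s = maxC == 0 ? 0 : delta / maxC
    var h: Float = 0
    if delta != 0 {
        switch maxC {
        case r:  h = (g - b) / delta        // between yellow and magenta
        case g:  h = (b - r) / delta + 2    // between cyan and yellow
        default: h = (r - g) / delta + 4    // between magenta and cyan
        }
        h /= 6
        if h < 0 { h += 1 }
    }
    return (h, s, v)
}

let green = rgbToHSV(r: 0, g: 1, b: 0)
// green.h ≈ 0.333, i.e. 120° / 360°
```

For pure green this returns h ≈ 1/3, s = 1, v = 1, which is the normalized form of the 120° hue used later in the answer.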

func colorCubeFilterForChromaKey(hueAngle: Float) -> CIFilter {

    let hueRange: Float = 20 // width, in degrees, of the hue slice we want to key out
    let minHueAngle: Float = (hueAngle - hueRange / 2.0) / 360 // normalized to 0...1, to match getHue
    let maxHueAngle: Float = (hueAngle + hueRange / 2.0) / 360

    let size = 64
    var cubeData = [Float](repeating: 0, count: size * size * size * 4)
    var rgb: [Float] = [0, 0, 0]
    var hsv: (h : Float, s : Float, v : Float)
    var offset = 0

    for z in 0 ..< size {
        rgb[2] = Float(z) / Float(size) // blue value
        for y in 0 ..< size {
            rgb[1] = Float(y) / Float(size) // green value
            for x in 0 ..< size {

                rgb[0] = Float(x) / Float(size) // red value
                hsv = RGBtoHSV(r: rgb[0], g: rgb[1], b: rgb[2])
                // TODO: Check if hsv.s > 0.5 is really necessary
                let alpha: Float = (hsv.h > minHueAngle && hsv.h < maxHueAngle && hsv.s > 0.5) ? 0 : 1.0

                cubeData[offset] = rgb[0] * alpha
                cubeData[offset + 1] = rgb[1] * alpha
                cubeData[offset + 2] = rgb[2] * alpha
                cubeData[offset + 3] = alpha
                offset += 4
            }
        }
    }
    let data = cubeData.withUnsafeBufferPointer { Data(buffer: $0) }

    let colorCube = CIFilter(name: "CIColorCube", withInputParameters: [
        "inputCubeDimension": size,
        "inputCubeData": data as NSData
        ])
    return colorCube!
}

And here is the code that applies the filter to the video, modifying the func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? function I wrote earlier:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    guard let urlString = Bundle.main.path(forResource: "resourceName", ofType: "mp4") else { return nil }

    let url = URL(fileURLWithPath: urlString)
    let asset = AVAsset(url: url)

    let filter = colorCubeFilterForChromaKey(hueAngle: 38)
    let composition = AVVideoComposition(asset: asset, applyingCIFiltersWithHandler: { request in
        let source = request.sourceImage
        filter.setValue(source, forKey: kCIInputImageKey)
        if let output = filter.outputImage {
            request.finish(with: output, context: nil)
        } else {
            request.finish(with: NSError(domain: "ChromaKey", code: -1, userInfo: nil))
        }
    })

    let item = AVPlayerItem(asset: asset)
    item.videoComposition = composition
    let player = AVPlayer(playerItem: item)

    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.size = CGSize(width: 200.0, height: 150.0)
    videoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    return videoNode
}

The code is supposed to set alpha = 0.0 for every pixel of each video frame whose color matches the hue range of the background. But instead of getting transparent pixels, I get black pixels, as can be seen in the image below:

App screenshot #2

Now, even though this is not the wanted effect, it does not surprise me: I knew this is how iOS displays videos with an alpha channel. But here is the real problem. When displaying a normal video in an AVPlayer, it is possible to add an AVPlayerLayer to the view and set its pixelBufferAttributes to tell the player layer that we are using a transparent pixel buffer, like so:

let playerLayer = AVPlayerLayer(player: player)
playerLayer.bounds = view.bounds
playerLayer.position = view.center
playerLayer.pixelBufferAttributes = [(kCVPixelBufferPixelFormatTypeKey as String): kCVPixelFormatType_32BGRA]
view.layer.addSublayer(playerLayer)

This code gives us a video with transparent background (GOOD!) but a fixed size and position (NOT GOOD...), as you can see in this screenshot:

App screenshot #3

I want to achieve the same effect on an SKVideoNode, not on an AVPlayerLayer. However, I can't find any way to set pixelBufferAttributes on an SKVideoNode, and using a player layer defeats the purpose of ARKit, since the layer stays fixed in position instead of being anchored in the scene.

Is there any solution to my problem, or maybe is there another technique to achieve the same desired effect?

  • Not sure it’ll work, but you could try doing it in 3D with SceneKit (`ARSCNView`) and using `AVPlayer` as the material contents for a plane. – rickster May 04 '18 at 22:06
  • Check section 1 'Transparent Videos in SpriteKit' of this Medium post: https://medium.com/@quentinfasquel/ios-transparent-video-in-spritekit-then-scenekit-2fc66b8706a6 – KNV May 17 '18 at 09:41
  • @KNV Quite funny, I talked with Quentin Fasquel who wrote this article on medium. He gave me a hint to use the SKEffectNode, but he is using a shader and I used a filter in my final solution. – 2shy May 20 '18 at 21:06
  • @rickster This might work but I rather not use 3D in this specific case. – 2shy May 20 '18 at 21:08
  • @2shy hehe nice! Maybe you can help me a little bit because I'm using your colorCubeFilterForChromaKey and RGBtoHSV code. I'm adding the effectNode as a child of SKScene since I want the video to play in a SCNPlane. But when I do this, the filtering is not working... – KNV May 22 '18 at 09:09
  • You guys might be interested in a green screen for iOS blog post that I wrote up, linked from the SO answer about green screen vs alpha channel question: https://stackoverflow.com/a/35004399/763355 – MoDJ Jul 02 '18 at 20:45

2 Answers


The solution is quite simple! All that needs to be done is to add the video node as a child of an SKEffectNode and apply the filter to the SKEffectNode instead of to the video itself (the AVVideoComposition is not necessary). Here is the code I used:

func view(_ view: ARSKView, nodeFor anchor: ARAnchor) -> SKNode? {
    // Create and configure a node for the anchor added to the view's session.
    let bialikVideoNode = videoNodeWith(resourceName: "Tsina_05", ofType: "mp4")
    bialikVideoNode.size = CGSize(width: kDizengofVideoWidth, height: kDizengofVideoHeight)
    bialikVideoNode.anchorPoint = CGPoint(x: 0.5, y: 0.0)

    // Make the video background transparent using an SKEffectNode, since chroma-key doesn't work on video
    let effectNode = SKEffectNode()
    effectNode.addChild(bialikVideoNode)
    effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 120)

    return effectNode
}
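The helper `videoNodeWith(resourceName:ofType:)` isn't shown in the answer. A minimal sketch of it, mirroring the video-loading code from the question (the name and behavior here are assumptions, not the answer's actual implementation), might look like:

```swift
import SpriteKit
import AVFoundation

// Hypothetical helper: loads a bundled video into an SKVideoNode.
// This mirrors the loading code from the question above.
func videoNodeWith(resourceName: String, ofType type: String) -> SKVideoNode {
    guard let path = Bundle.main.path(forResource: resourceName, ofType: type) else {
        fatalError("Missing bundled video: \(resourceName).\(type)")
    }
    let player = AVPlayer(url: URL(fileURLWithPath: path))
    let videoNode = SKVideoNode(avPlayer: player)
    videoNode.play() // SKVideoNode does not start playback automatically
    return videoNode
}
```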

And here is the result, as needed:

  • is it possible to remove only one color from the video – Mashhadi Jan 20 '19 at 05:42
  • @Mashhadi You can reduce the value of hueRange in the colorCubeFilterForChromaKey(hueAngle:) function to remove a small range of hues (almost a single color). – 2shy Jan 20 '19 at 16:09
  • I have a video having simple black color so changed my hue to effectNode.filter = colorCubeFilterForChromaKey(hueAngle: 1) but still don't works – Mashhadi Jan 21 '19 at 04:43
  • @Mashhadi This only works with colors other than black and white as black and white aren't considered as hues... Sorry my friend. – 2shy Jan 22 '19 at 13:34
  • This is great.. how do you define a different color than that yellow, for example pure red, as the background color? As much as I try it only seems to make blacks transparent for some reason. – Jacobo Koenig Apr 14 '19 at 23:01
  • @JacoboKoenig The value for hueAngle determines the color of the background. In my example code the value is 120 which is green. Pure red is either 0 or 360. – 2shy Apr 15 '19 at 09:00

Thank you! I had the same problem, plus mixing [AR/Scene/Sprite]Kit. But I would recommend using this algorithm instead. It gives a better result:

...
let r = removeChromaKeyColor(r: rgb[0], g: rgb[1], b: rgb[2])
cubeData[offset]     = r[0]
cubeData[offset + 1] = r[1]
cubeData[offset + 2] = r[2]
cubeData[offset + 3] = r[3]
offset += 4
...

func removeChromaKeyColor(r: Float, g: Float, b: Float) -> [Float] {
    let threshold: Float = 0.1
    let refColor: [Float] = [0, 1.0, 0, 1.0]    // chroma-key color (pure green)

    // http://www.shaderslab.com/demo-40---video-in-video-with-green-chromakey.html
    // A pixel is keyed out when its green channel exceeds both the red and
    // the blue channel by more than the threshold.
    let val = ceil(saturate(g - r - threshold)) * ceil(saturate(g - b - threshold))
    var result = lerp(a: [r, g, b, 0.0], b: refColor, w: val)
    result[3] = abs(1.0 - result[3])

    return result
}

func saturate(_ x: Float) -> Float {
    return max(0, min(1, x))
}

func ceil(_ v: Float) -> Float {
    return -floor(-v)
}

func lerp(a: [Float], b: [Float], w: Float) -> [Float] {
    return [a[0] + w * (b[0] - a[0]),
            a[1] + w * (b[1] - a[1]),
            a[2] + w * (b[2] - a[2]),
            a[3] + w * (b[3] - a[3])]
}
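As a quick sanity check (my addition, not from the answer; it assumes the three functions above are in scope), pure key-green is fully keyed out while pure red is left opaque:

```swift
// Assumes removeChromaKeyColor(r:g:b:), saturate, ceil, and lerp from above.
let keyed = removeChromaKeyColor(r: 0, g: 1, b: 0) // pure key green
// keyed[3] == 0.0 -> fully transparent
let kept = removeChromaKeyColor(r: 1, g: 0, b: 0)  // pure red
// kept[3] == 1.0 -> fully opaque
```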
myz
  • Thank you! This filters the green colors better than the accepted answer. – Mikkel Cortnum Dec 14 '20 at 16:31
  • Hey @myz how do i change the color that is being made transparent. It seems from your code that it should be the refColor, however changing this doesnt make a difference. I would really appreciate the help :) – Mikkel Cortnum Feb 11 '21 at 14:18
  • Sorry, I didn't have access to this old code anymore, but yes, refColor should be the transparent color: (red, green, blue, alpha=1.0) [0.0-1.0]. – myz Feb 12 '21 at 15:52
  • Yeah that's what I thought. It doesn't seem to work though. No matter what I change the refColor to it filters out the green. – Mikkel Cortnum Apr 30 '21 at 14:00