
ARKit 2.0 added a new class named AREnvironmentProbeAnchor. Reading its documentation, it seems that ARKit can automatically collect an environment texture (a cube map?). I believe we can now create virtual objects that reflect the real environment.

But I am still not clear on how this works, particularly how the environment texture is generated. Does anyone have simple sample code demonstrating this cool feature?

redpearl

2 Answers


AREnvironmentProbeAnchor (works in iOS 12.0+) is an anchor for image-based lighting. With it, a model's PBR shader can reflect light from its surroundings. The principle is simple: six square images captured from one point go into the environment/reflectivity channel of a shading material. The six sources (a camera rig) point along +x/-x, +y/-y and +z/-z. The image below illustrates the six directions of the rig:

[Image: the six camera directions of the cube-map rig]

The adjacent zFar planes form a cube, don't they?

[Image: the six adjacent zFar planes joined into a cube]

Texture patches become available only for the areas your camera has already scanned. ARKit then uses machine-learning algorithms to fill in the gaps and cover the cube with a complete 360-degree texture.
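This is the same mechanism you could always wire up by hand in SceneKit. A minimal sketch, assuming six square images named px/nx/py/ny/pz/nz exist in your asset catalog (hypothetical placeholders), shows where the six faces end up:

import SceneKit
import UIKit

// Six square faces in +x, -x, +y, -y, +z, -z order form one cube map
let faces = ["px", "nx", "py", "ny", "pz", "nz"].compactMap { UIImage(named: $0) }

let scene = SCNScene()

// Feed the cube map to the scene's image-based-lighting environment...
scene.lightingEnvironment.contents = faces

// ...or straight into a material's reflective channel
let mirrorMaterial = SCNMaterial()
mirrorMaterial.lightingModel = .physicallyBased
mirrorMaterial.reflective.contents = faces

AREnvironmentProbeAnchor spares you from capturing and assigning those six images yourself.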

AREnvironmentProbeAnchor positions this photo rig at a specific point in the scene. All you have to do is enable environment texture-map generation in the AR session. There are two options for this:

ARWorldTrackingConfiguration.EnvironmentTexturing.manual 

With manual environment texturing, you identify the points in the scene for which you want light-probe texture maps by creating AREnvironmentProbeAnchor objects and adding them to the session (a short sketch of this route follows below).

ARWorldTrackingConfiguration.EnvironmentTexturing.automatic 

With automatic environment texturing, ARKit automatically creates, positions, and adds AREnvironmentProbeAnchor objects to the session.

In both cases, ARKit automatically generates environment textures as the session collects camera imagery. Use a delegate method such as session(_:didUpdate:) to find out when a texture is available, and access it from the anchor's environmentTexture property.
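For the manual option, a minimal sketch might look like this. The identity transform (which puts the probe at the world origin) and the 2 × 2 × 2 m extent are arbitrary example values, and sceneView is assumed to be your ARSCNView:

let config = ARWorldTrackingConfiguration()
if #available(iOS 12.0, *) {
    config.environmentTexturing = .manual
}
sceneView.session.run(config)

if #available(iOS 12.0, *) {
    // Place one light probe at the world origin covering a 2 x 2 x 2 m volume
    let probeAnchor = AREnvironmentProbeAnchor(transform: matrix_identity_float4x4,
                                               extent: simd_float3(2, 2, 2))
    sceneView.session.add(anchor: probeAnchor)
}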

If you display AR content using ARSCNView and the automaticallyUpdatesLighting option, SceneKit automatically retrieves AREnvironmentProbeAnchor texture maps and uses them to light the scene.

Here's what your code in ViewController.swift might look like:

sceneView.automaticallyUpdatesLighting = true

let torusNode = SCNNode(geometry: SCNTorus(ringRadius: 2, pipeRadius: 1.5))
sceneView.scene.rootNode.addChildNode(torusNode)
    
// A fully metallic, mirror-smooth PBR material picks up the generated cube map
let reflectiveMaterial = SCNMaterial()
reflectiveMaterial.lightingModel = .physicallyBased
reflectiveMaterial.metalness.contents = 1.0
reflectiveMaterial.roughness.contents = 0
reflectiveMaterial.diffuse.contents = UIImage(named: "brushedMetal.png")
torusNode.geometry?.firstMaterial = reflectiveMaterial    // firstMaterial is a single SCNMaterial, not an array

let config = ARWorldTrackingConfiguration()
if #available(iOS 12.0, *) {
    config.environmentTexturing = .automatic    // magic happens here
}
sceneView.session.run(config)

Then implement the ARSessionDelegate instance method session(_:didUpdate:):

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {

    // The updated anchors may also contain planes etc., so pick out the probe anchors
    for case let envProbeAnchor as AREnvironmentProbeAnchor in anchors {
        print(envProbeAnchor.environmentTexture as Any)
        print(envProbeAnchor.extent)
    }
}
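With ARSCNView and automaticallyUpdatesLighting you don't have to touch that texture yourself. But since environmentTexture is an ordinary MTLTexture (a cube map), you could, for instance, hand it to SceneKit's lighting environment or to a custom renderer. A hedged variation of the method above:

func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {

    for case let envProbeAnchor as AREnvironmentProbeAnchor in anchors {
        guard let cubeTexture = envProbeAnchor.environmentTexture else { continue }
        // SCNMaterialProperty accepts an MTLTexture as its contents
        sceneView.scene.lightingEnvironment.contents = cubeTexture
    }
}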
Andy Jazz
  • Hi Andy, thanks for your reply! I tried setting ARWorldTrackingConfiguration.EnvironmentTexturing and ARSCNView.automaticallyUpdatesLighting, but did not see any differences in the image. Is there a sample demonstrating this new feature? Even a screenshot would help a lot. – redpearl Jun 11 '18 at 20:52
  • I understand how a cubemap works in general. But my question is how it works in ARKit with this new feature. Particularly, how the cubemap is generated and how the app will behave. For example, if I put a virtual sphere on the table, its top part should reflect the ceiling. If I have never pointed my phone up, how can ARKit get the image of the ceiling? Or does it require me to point my phone in all directions before the texture is available? If so, how will the app behave? Will the sphere be empty (white?) before the texture is available, and suddenly become reflective after the cubemap is generated? – redpearl Jun 11 '18 at 20:56
  • Yes, I am using iOS 12 beta and Xcode 10 – redpearl Jun 11 '18 at 20:56
  • Yes, it DOES require you to point your phone in all directions before the texture is available! – Andy Jazz Jun 11 '18 at 21:01
  • Do you have sample code or a video showing what the whole process looks like? – redpearl Jun 11 '18 at 21:07
  • @redpearl so ugh... How did you get your hands on iOS 12 beta? I feel really stupid now thinking that it's not publicly available – Isaaс Weisberg Jun 19 '18 at 08:36
  • @IsaacCarolWeisberg Isaak, iOS 12 beta (as well as any beta) is available for registered developers (those who pay $99 per year). – Andy Jazz Jun 19 '18 at 08:43
  • @redpearl I updated the code in my answer. I tried it several times and it works fine every time. – Andy Jazz Oct 27 '18 at 09:32
  • Does the cubemap rotate with the camera? I've plugged the cubemap into my own engine, and if I sample along the -Z direction I always see what's in front of me... I was expecting the cubemap to be in world coordinates, but it's as if it's in view coordinates. It keeps changing, so I can't tell for sure. – endavid Feb 25 '19 at 23:09

It's pretty simple to implement environmentTexturing in your AR project.

  1. Set the environmentTexturing property on your tracking configuration to automatic. (ARKit takes the video feed from your camera to automatically create a texture map. The more you move the camera around, the more accurate the texture map becomes. Machine learning is used to fill in the blanks.)

    configuration.environmentTexturing = .automatic
    
  2. Environment texturing requires physically based materials to work. Create a simple shiny sphere to test out the reflections:

    let sphere = SCNSphere(radius: 0.1)
    sphere.firstMaterial?.diffuse.contents = UIColor.white
    sphere.firstMaterial?.lightingModel = .physicallyBased
    sphere.firstMaterial?.metalness.contents = 1.0   // fully metallic
    sphere.firstMaterial?.roughness.contents = 0.0   // mirror-smooth
    let sphereNode = SCNNode(geometry: sphere)
    sceneView.scene.rootNode.addChildNode(sphereNode)
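
  3. Run the session with the configuration from step 1. (This assumes configuration is an ARWorldTrackingConfiguration and sceneView is your ARSCNView, as in the other answer.)

    let configuration = ARWorldTrackingConfiguration()
    if #available(iOS 12.0, *) {
        configuration.environmentTexturing = .automatic
    }
    sceneView.session.run(configuration)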
    
SilentK