
Is it possible to import a virtual lamp object into the AR scene that projects a light cone, illuminating the surrounding space in the room and the real objects in it, e.g. a table, the floor, the walls?

For ARKit, I found this SO post.

For ARCore, there is an example of a relighting technique, along with this source code.

I have also been suggested that post-processing can be used to brighten the whole scene.

However, these examples are from a while ago, and perhaps there is a newer or more straightforward solution to this problem?

Yaro

  • @andy-jazz I have been reading your post on Medium on a similar matter. How can I contact you? – Yaro Dec 15 '21 at 17:40
  • Hi @Yaro! What type of light do you actually need – 1) virtual light for virtual objects (i.e. rendered by the RealityKit engine), 2) virtual light for real-world objects, or 3) real-world light for virtual objects (i.e. the Light Estimation feature)? – Andy Jazz Dec 15 '21 at 20:11
  • Number 2 – virtual light for real-world objects. The idea is to import a virtual lamp in AR and simulate how this lamp would illuminate the real room and the real objects around it. (I have also updated the question with some findings.) – Yaro Dec 16 '21 at 12:09

2 Answers


At a low level, RealityKit is only responsible for rendering virtual objects and overlaying them on top of the camera frame. If you want to illuminate the real scene, you need to post-process the camera frame.

Here are some tutorials on how to do post-processing: Tutorial 1, Tutorial 2.
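For context, RealityKit 2 (iOS 15+) exposes this hook directly on ARView through its render callbacks. Here is a minimal sketch of wiring it up; the effect itself comes later, and the pass-through blit just proves the plumbing works:

```swift
import UIKit
import RealityKit

final class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()
        // RealityKit hands us the fully rendered frame (camera image plus
        // virtual content) right before it reaches the screen.
        arView.renderCallbacks.postProcess = { [weak self] context in
            self?.applyEffect(context)
        }
    }

    private func applyEffect(_ context: ARView.PostProcessContext) {
        // Placeholder effect: copy the source texture to the target unchanged.
        // A real effect writes a modified image into targetColorTexture.
        guard let blit = context.commandBuffer.makeBlitCommandEncoder() else { return }
        blit.copy(from: context.sourceColorTexture, to: context.targetColorTexture)
        blit.endEncoding()
    }
}
```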


If all you need is an effect like this, then it's enough to add a CGImage-based post-processing effect for the virtual objects (the lights).

More specifically, add a bloom filter to the rendered image (you can also approximate a bloom filter with a Gaussian blur).

This way, the code revolves entirely around UIImage and CGImage, so it's pretty simple.
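As a rough sketch of that idea, here is the bloom effect written with Core Image inside the post-process callback, which avoids a per-frame UIImage round-trip. CIFilter.bloom is a built-in filter; the intensity and radius values are assumptions you'd tune:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins
import RealityKit

let ciContext = CIContext()

func bloomEffect(_ context: ARView.PostProcessContext) {
    guard let source = CIImage(mtlTexture: context.sourceColorTexture) else { return }

    // Bloom lets bright pixels (the virtual lamp) bleed into their
    // neighbours; compositing a Gaussian blur over the original gives
    // a similar look.
    let bloom = CIFilter.bloom()
    bloom.inputImage = source
    bloom.intensity = 1.0   // illustrative value
    bloom.radius = 15.0     // illustrative value
    guard let output = bloom.outputImage?.cropped(to: source.extent) else { return }

    // Render straight into RealityKit's output texture, reusing the
    // command buffer the engine is already committing this frame.
    let destination = CIRenderDestination(mtlTexture: context.targetColorTexture,
                                          commandBuffer: context.commandBuffer)
    _ = try? ciContext.startTask(toRender: output, to: destination)
}
```

Assign it with `arView.renderCallbacks.postProcess = bloomEffect` and the whole composited frame, real background included, gets the glow.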

If you want it to be more realistic, consider using the depth map provided by the LiDAR scanner to calculate which areas the lamp can actually illuminate, for a more physically plausible brightness.
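Sketching that refinement: ARKit's sceneDepth (available on LiDAR devices) gives per-pixel distances in metres, which can be turned into a mask so the bloom only survives nearby. The 2 m cut-off and filter parameters below are assumptions, the ARFrame would come from arView.session.currentFrame, and a real implementation would measure distance from the virtual lamp rather than from the camera, but the plumbing is the same:

```swift
import ARKit
import CoreImage
import CoreImage.CIFilterBuiltins
import RealityKit

func depthAwareBloom(_ context: ARView.PostProcessContext,
                     frame: ARFrame,
                     ciContext: CIContext) {
    guard let color = CIImage(mtlTexture: context.sourceColorTexture),
          let depthMap = frame.sceneDepth?.depthMap else { return }

    // Scale the low-resolution LiDAR depth map up to the frame size.
    var depth = CIImage(cvPixelBuffer: depthMap)
    depth = depth.transformed(by: CGAffineTransform(
        scaleX: color.extent.width / depth.extent.width,
        y: color.extent.height / depth.extent.height))

    // Map metres to a grayscale mask: 0 m -> white (lit), 2 m -> black.
    let mask = depth.applyingFilter("CIColorMatrix", parameters: [
        "inputRVector": CIVector(x: -0.5, y: 0, z: 0, w: 0),
        "inputGVector": CIVector(x: -0.5, y: 0, z: 0, w: 0),
        "inputBVector": CIVector(x: -0.5, y: 0, z: 0, w: 0),
        "inputBiasVector": CIVector(x: 1, y: 1, z: 1, w: 0)
    ]).applyingFilter("CIColorClamp")

    let bloom = CIFilter.bloom()
    bloom.inputImage = color
    bloom.intensity = 1.0
    bloom.radius = 15.0
    guard let lit = bloom.outputImage?.cropped(to: color.extent) else { return }

    // Keep the bloomed image where the mask is white, the original elsewhere.
    let blend = CIFilter.blendWithMask()
    blend.inputImage = lit
    blend.backgroundImage = color
    blend.maskImage = mask
    guard let output = blend.outputImage else { return }

    let destination = CIRenderDestination(mtlTexture: context.targetColorTexture,
                                          commandBuffer: context.commandBuffer)
    _ = try? ciContext.startTask(toRender: output, to: destination)
}
```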


Or, if you're a true explorer, you can use Metal to build a real-time digital-twin point cloud of the real world and simulate how the light is occluded.
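As a starting point for that route, scene reconstruction already hands you a world mesh, so a point cloud can be pulled out of the ARMeshAnchor geometry. A hedged sketch (assuming a LiDAR device and config.sceneReconstruction = .mesh) of extracting world-space vertices, which you could then test against the lamp's light cone:

```swift
import ARKit
import simd

// Collect world-space vertices from every reconstructed mesh anchor.
func worldVertices(in session: ARSession) -> [SIMD3<Float>] {
    guard let anchors = session.currentFrame?.anchors else { return [] }
    var points: [SIMD3<Float>] = []
    for case let mesh as ARMeshAnchor in anchors {
        let vertices = mesh.geometry.vertices   // packed float3 per vertex
        for i in 0..<vertices.count {
            // Read one vertex out of the Metal buffer...
            let offset = vertices.offset + i * vertices.stride
            let raw = vertices.buffer.contents().advanced(by: offset)
            let f = raw.assumingMemoryBound(to: Float.self)
            // ...and move it into world space via the anchor's transform.
            let world = mesh.transform * SIMD4<Float>(f[0], f[1], f[2], 1)
            points.append(SIMD3<Float>(world.x, world.y, world.z))
        }
    }
    return points
}
```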

闪电狮
  • Thank you for the tip that the scene can be post-processed. However, I am still curious whether AR SDKs provide functionality for manipulating the scene directly. – Yaro Dec 14 '21 at 22:33
  • 1
    In RealityKit, the scene is just an invisible mesh (without textures) that handles occlusion and physics. So even if you can manipulate the scene, you only modify the mesh. In AR, the real world is just an image, and to illuminate the real world you have to go through post-processing. – 闪电狮 Dec 16 '21 at 17:16

There's nothing new in relighting techniques based on 3D compositing principles in 2021. At the moment, when you're working with RealityKit or SceneKit, you have to implement the relighting functionality yourself, with the help of two additional render passes besides the RGB pass (which is always needed): a Normals pass and a Point Position pass. Both AOVs must be 32-bit.
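To make the role of those AOVs concrete: with a world-space normal and a world-space point position per pixel, a virtual point light reduces to a Lambert term with inverse-square falloff. A sketch of the per-pixel math (in practice you'd evaluate this in a Metal kernel; lightPos and intensity are the virtual lamp's assumed parameters):

```swift
import simd

// Per-pixel relighting gain from the two extra 32-bit AOVs:
// a world-space normal and a world-space point position.
func relitGain(normal: SIMD3<Float>,
               position: SIMD3<Float>,
               lightPos: SIMD3<Float>,
               intensity: Float) -> Float {
    let toLight = lightPos - position
    let distance = simd_length(toLight)
    let l = toLight / max(distance, 1e-5)                     // direction to the lamp
    let lambert = max(simd_dot(simd_normalize(normal), l), 0) // N·L diffuse term
    return intensity * lambert / (distance * distance)        // inverse-square falloff
}

// Relit pixel ≈ cameraRGB * (ambient + relitGain(normal, position, lightPos, intensity))
```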


However, in the near future, when Apple engineers finally implement texture capturing in Scene Reconstruction, even an inexperienced AR developer will be able to apply a relighting procedure.


Watch this Vimeo video to find out how relighting can be achieved in The Foundry NUKE.


A crucial point here, when implementing the relighting effect, is the presence of a LiDAR scanner (or an iToF sensor, if you're using ARCore). In other words, today's relighting solution for iOS is Metal + RealityKit.

Andy Jazz