
Task

I would like to capture a real-world texture and apply it to a reconstructed mesh produced with the help of a LiDAR scanner. I suppose that Projection-View-Model matrices should be used for that. The texture must be made from a fixed point of view, for example, from the center of a room. However, an ideal solution would be to apply the environmentTexturing data, collected as a cube-map texture in the scene.


Take a look at the 3D Scanner App. It's a reference app that allows us to export a model with its texture.

I need to capture the texture in a single pass; I do not need to update it in real time. I realize that changing the point of view leads to a wrong perception of the texture, in other words, to texture distortion. I also realize that RealityKit has dynamic tessellation and automatic texture mipmapping (the texture's resolution depends on the distance it was captured from).

import RealityKit
import ARKit
import Metal
import ModelIO

class ViewController: UIViewController, ARSessionDelegate {
    
    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.session.delegate = self
        arView.debugOptions.insert(.showSceneUnderstanding)

        let config = ARWorldTrackingConfiguration()
        config.sceneReconstruction = .mesh           // LiDAR mesh reconstruction
        config.environmentTexturing = .automatic     // collect cube-map environment probes
        arView.session.run(config)
    }
}
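
Since the session delegate is already set and environmentTexturing is .automatic, I assume the generated cube maps could be inspected from the same ViewController with a standard ARSessionDelegate callback, roughly like this (just a sketch; the print is a placeholder):

// Rough sketch: reading back the cube maps produced by .automatic environment texturing.
// AREnvironmentProbeAnchor.environmentTexture is a Metal cube texture.
func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
    for anchor in anchors {
        guard let probe = anchor as? AREnvironmentProbeAnchor,
              let cubeMap = probe.environmentTexture else { continue }
        print("Cube map \(cubeMap.width)x\(cubeMap.height), \(cubeMap.pixelFormat)")
    }
}

And the Projection-View-Model step I have in mind, as a rough sketch (the uvCoordinates helper is hypothetical, the orientation is hard-coded to .portrait, and imageSize is assumed to be the captured image's resolution):

import ARKit

// Hypothetical helper: project every vertex of a reconstructed mesh anchor
// into a camera image taken from a fixed point of view, producing
// normalized UV coordinates for that single shot.
func uvCoordinates(for meshAnchor: ARMeshAnchor,
                   camera: ARCamera,
                   imageSize: CGSize) -> [SIMD2<Float>] {
    let vertices = meshAnchor.geometry.vertices
    var uvs: [SIMD2<Float>] = []
    uvs.reserveCapacity(vertices.count)

    for index in 0 ..< vertices.count {
        // Read a vertex (three packed floats) in the anchor's local space
        let pointer = vertices.buffer.contents()
            .advanced(by: vertices.offset + vertices.stride * index)
        let v = pointer.assumingMemoryBound(to: (Float, Float, Float).self).pointee

        // Transform it to world space
        let world = meshAnchor.transform * SIMD4<Float>(v.0, v.1, v.2, 1)

        // Project with the camera's view and projection matrices
        let point = camera.projectPoint(SIMD3<Float>(world.x, world.y, world.z),
                                        orientation: .portrait,
                                        viewportSize: imageSize)

        // Normalize to [0, 1] texture coordinates
        uvs.append(SIMD2<Float>(Float(point.x / imageSize.width),
                                Float(point.y / imageSize.height)))
    }
    return uvs
}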

Question

  • How to capture and apply a real-world texture to a reconstructed 3D mesh?



2 Answers


Scene Reconstruction

It's a pity, but I am still unable to capture a model's texture in real time using the LiDAR scanning process. Neither at WWDC20 nor at WWDC23 did Apple announce a native API for that, so texture capturing is currently only possible using third-party APIs (don't ask me which ones).

However, there's good news – a long-awaited methodology has emerged at last. It allows developers to create textured models from a series of shots.

Photogrammetry

The Object Capture API, announced at WWDC 2021, provides developers with the long-awaited photogrammetry tool. At the output we get a USDZ model with a UV-mapped, high-resolution texture. To implement the Object Capture API you need macOS 12+ and Xcode 13+.


To create a USDZ model from a series of shots, submit all captured images to RealityKit's PhotogrammetrySession.

Here's a code snippet that sheds some light on this process:

import RealityKit
import Combine

let pathToImages = URL(fileURLWithPath: "/path/to/my/images/")

let url = URL(fileURLWithPath: "model.usdz")

var request = PhotogrammetrySession.Request.modelFile(url: url, 
                                                   detail: .medium)

var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOverlap = .normal
configuration.sampleOrdering = .unordered
configuration.featureSensitivity = .normal
configuration.isObjectMaskingEnabled = false

guard let session = try? PhotogrammetrySession(input: pathToImages, 
                                      configuration: configuration)
else { return }

var subscriptions = Set<AnyCancellable>()

session.output.receive(on: DispatchQueue.global())
              .sink(receiveCompletion: { _ in
                  // handle errors and completion here
              }, receiveValue: { _ in
                  // handle progress and results here
              })
              .store(in: &subscriptions)

try? session.process(requests: [request])
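
Inside the receiveValue closure you get PhotogrammetrySession.Output messages. Here's a minimal sketch of one way to interpret them (the handle function is a hypothetical helper you would call as receiveValue: { handle($0) }, only a few of the available cases are covered, and the prints are placeholders):

func handle(_ output: PhotogrammetrySession.Output) {
    switch output {
    case .requestProgress(_, let fractionComplete):
        print("Progress: \(Int(fractionComplete * 100))%")      // reconstruction progress
    case .requestComplete(_, .modelFile(let url)):
        print("Model written to \(url)")                        // the finished USDZ file
    case .requestError(_, let error):
        print("Request failed: \(error)")
    case .processingComplete:
        print("All requests are finished")
    default:
        break
    }
}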

You can reconstruct USD and OBJ models with their corresponding UV-mapped textures.

  • I am really intrigued with where you are going with this, as the idea of converting the `ARFrame` to a `MTLTexture` seemed like the most sensible approach. I struggled immensely in trying to figure out how to properly apply the texture coordinates to the model, so that the texture would appear as expected. Did you ever succeed with this? Going off of [this](https://stackoverflow.com/questions/61538799/ipad-pro-lidar-export-geometry-texture?rq=1) question, I can convert the ARMeshGeometry to the SCNGeometry, it looks proper without texture, but I never got the coordinates right. – ZbadhabitZ Oct 08 '20 at 00:37
  • Hi @AndyFedoroff Do you know whether it is possible to export the mesh model using RealityKit? – Randi Jan 20 '21 at 18:56
  • Hi @Randi, here's 3 answers about it including mine – https://stackoverflow.com/questions/61063571/arkit-3-5-how-to-export-obj-from-new-ipad-pro-with-lidar/61104855#61104855 – Andy Jazz Jan 20 '21 at 19:24
  • I'm very new to Swift language. I don't know how to export models with textures, any samples please. @AndyFedoroff – Ramgg Jan 21 '21 at 10:26
  • Hi @Ramgg, alas, there's no native solution for texture export at the moment. Try Unity with ARMeshManager... – Andy Jazz Jan 21 '21 at 11:00

How it can be done in Unity

I'd like to share some interesting info about how Unity's AR Foundation works with a mesh coming from LiDAR. At the moment – November 01, 2020 – there's an absurd situation: native ARKit developers cannot capture the texture of a scanned object using standard high-level RealityKit tools, yet Unity's AR Foundation users (creating ARKit apps) can do this using the ARMeshManager script. I don't know whether this script was developed by the AR Foundation team or just by developers of a small creative startup (and subsequently acquired), but the fact remains.

To use ARKit meshing with AR Foundation, you just need to add the ARMeshManager component to your scene. As you can see in the picture below, it exposes such features as Texture Coordinates, Color and Mesh Density.

(Screenshot: the ARMeshManager component in Unity's Inspector)

If anyone has more detailed information on how this should be configured or scripted in Unity, please post about it in this thread.

  • AFAIK texture coordinates are not supported by the ARKit subsystem in ARFoundation (even if there is a checkbox, it probably won't do anything). If anyone knows more please let us know! :) – Robert Dec 31 '20 at 12:27
  • @Andy Fedoroff Thanks for your detailed answer, so in the end are you doing this with Unity, or is there any update with ARKit? I have a basic app on the store called PointCloudKit and I'm doing quite some work with point clouds but there might be easier ways... Mainly I'm reconstructing objects from points but I might be able to use the mesh – Acammm Mar 24 '21 at 07:52
  • @Andy Hi Andy, wondered if in the end you implemented your solution using Unity, or if anything changed in ARKit since then and allowed users to capture the texture of scanned entities without AR Foundation – Acammm Mar 25 '21 at 01:52
  • Alexandre, neither one nor the other. Waiting for RealityKit's update. I'm 100% sure it brings the desired feature) – Andy Jazz Mar 25 '21 at 06:42
  • It could be introduced in the upcoming WWDC, but you never know with Apple. Has someone managed to get it working like in the Polycam or Forge apps? It'd be valuable for many developers if they shared. – przemekblasiak May 12 '21 at 16:35
  • Can anyone confirm if texture capture would actually work using Unity? – Sabaat Ahmad Feb 02 '22 at 21:08