
In an iOS ARKit app, I've been trying to save the ARFaceGeometry data to an OBJ file. I followed the explanation here: How to make a 3D model from AVDepthData?. However, the OBJ isn't created correctly. Here's what I have:

    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        currentFaceAnchor = faceAnchor

        // If this is the first time with this anchor, get the controller to create content.
        // Otherwise (switching content), will change content when setting `selectedVirtualContent`.
        if node.childNodes.isEmpty, let contentNode = selectedContentController.renderer(renderer, nodeFor: faceAnchor) {
            node.addChildNode(contentNode)
        }

        // https://stackoverflow.com/questions/52953590/how-to-make-a-3d-model-from-avdepthdata
        let geometry = faceAnchor.geometry
        let allocator = MDLMeshBufferDataAllocator()
        let vertices = allocator.newBuffer(with: Data(fromArray: geometry.vertices), type: .vertex)
        let textureCoordinates = allocator.newBuffer(with: Data(fromArray: geometry.textureCoordinates), type: .vertex)
        let triangleIndices = allocator.newBuffer(with: Data(fromArray: geometry.triangleIndices), type: .index)
        let submesh = MDLSubmesh(indexBuffer: triangleIndices,
                                 indexCount: geometry.triangleIndices.count,
                                 indexType: .uInt16,
                                 geometryType: .triangles,
                                 material: MDLMaterial(name: "mat1",
                                                       scatteringFunction: MDLPhysicallyPlausibleScatteringFunction()))

        let vertexDescriptor = MDLVertexDescriptor()
        // Attributes
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributePosition, format: .float3, offset: 0, bufferIndex: 0))
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributeNormal, format: .float3, offset: MemoryLayout<float3>.stride, bufferIndex: 0))
        vertexDescriptor.addOrReplaceAttribute(MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate, format: .float2, offset: MemoryLayout<float3>.stride + MemoryLayout<float3>.stride, bufferIndex: 0))
        // Layouts
        vertexDescriptor.layouts.add(MDLVertexBufferLayout(stride: MemoryLayout<float3>.stride + MemoryLayout<float3>.stride + MemoryLayout<float2>.stride))

        let mdlMesh = MDLMesh(vertexBuffers: [vertices, textureCoordinates], vertexCount: geometry.vertices.count, descriptor: vertexDescriptor, submeshes: [submesh])
        mdlMesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)
        let asset = MDLAsset(bufferAllocator: allocator)
        asset.add(mdlMesh)

        let documentsPath = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
        let exportUrl = documentsPath.appendingPathComponent("face.obj")
        try! asset.export(to: exportUrl)
    }

The resulting OBJ file looks like this:

# Apple ModelIO OBJ File: face
mtllib face.mtl
g 
v -0.000128156 -0.0277879 0.0575149
vn 0 0 0
vt -9.36008e-05 -0.0242016
usemtl material_1
f 1/1/1 1/1/1 1/1/1
f 1/1/1 1/1/1 1/1/1
f 1/1/1 1/1/1 1/1/1
... and many more lines

I would expect many more vertices, and the index values look wrong.

Daniel McLean
  • Were you able to export the face geometry data to an OBJ file? I'm getting the submesh as uninitialized, so it crashes. Can you please guide me on how to export this data? – Chaitu Jan 24 '20 at 11:16
  • @Daniel any luck? – Mayank Jain Sep 14 '20 at 14:29
  • In case you're wondering where `Data(fromArray:)` came from: https://github.com/opentok/ARFrameMetadata/blob/master/ARFrameMetadata/Data%2BfromArray.swift – Balazs Banyai Jul 29 '23 at 06:51
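For completeness, here is a minimal sketch of such a `Data(fromArray:)` helper (the exact implementation in the linked gist may differ; this version assumes the array elements are fixed-size, trivially-copyable values like `simd_float3` or `Int16`):

```swift
import Foundation

extension Data {
    /// Copies the raw bytes of an array of fixed-size elements into a Data value.
    init<T>(fromArray values: [T]) {
        self = values.withUnsafeBytes { Data($0) }
    }

    /// Reinterprets the bytes as an array of T.
    /// Assumes the byte count is a multiple of T's stride.
    func toArray<T>(type: T.Type) -> [T] {
        withUnsafeBytes { Array($0.bindMemory(to: T.self)) }
    }
}
```

Note that because this copies the array's in-memory representation verbatim, any padding in the element type (e.g. `simd_float3` occupying 16 bytes) is carried into the `Data`, which is why the vertex buffer layouts below must use `MemoryLayout<T>.stride` rather than packed sizes.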

1 Answer


The core issue is that your vertex data isn't described correctly. When you provide a vertex descriptor to Model I/O while constructing a mesh, it represents the layout the data actually has, not your desired layout. You're supplying two vertex buffers, but your vertex descriptor describes an interleaved data layout with only one vertex buffer.

The easiest way to remedy this is to fix the vertex descriptor to reflect the data you're providing:

let vertexDescriptor = MDLVertexDescriptor()
// Attributes
vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                    format: .float3,
                                                    offset: 0,
                                                    bufferIndex: 0)
vertexDescriptor.attributes[1] = MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                                                    format: .float2,
                                                    offset: 0,
                                                    bufferIndex: 1)
// Layouts
vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: MemoryLayout<float3>.stride)
vertexDescriptor.layouts[1] = MDLVertexBufferLayout(stride: MemoryLayout<float2>.stride)

When you later call addNormals(...), Model I/O will allocate the necessary space and update the vertex descriptor to reflect the new data. Since you're not rendering from the data and are instead immediately exporting it, the internal layout it chooses for the normals isn't important.

[Image: a correctly exported ARKit face mesh]

warrenm
  • Can we also capture a series of facial expressions, export them to files, and later load those files onto a face mesh to replay the same expressions? If so, could you please guide me on how to achieve this? – Chaitu Jan 28 '20 at 08:53
  • 1
    Sure. Just implement `session:didUpdateAnchors:` in your session delegate, grab the face anchor, and use the above technique to write one model file per frame. You might want to use a more space-efficient format like USDC or Alembic, as OBJ is very verbose. – warrenm Jan 28 '20 at 18:34
  • Thank you so much for your suggestion. I am able to transform the series of face anchors into USDC format. Can we group this series into a USDZ and play it as an animation? – Chaitu Jan 31 '20 at 07:53
  • I'm not sure what support the USD format has for animating a sequence of meshes, but it definitely seems like you could combine a number of usdc files into a single USD scene stored in a USDZ asset, then animate by cycling through the meshes in turn, "flipbook" style. My only concern would be how to solve the tradeoff between loading all meshes at once (high memory usage) and loading them on-demand (potentially stuttering while reading from disk). – warrenm Jan 31 '20 at 17:39
  • I want to export the texture too. How can I do that – Rahul Dasgupta Jun 19 '20 at 06:16
  • `ARFaceGeometry` doesn't have any texture or material properties. Do you mean you want to export the texture(s) belonging to the material properties of, say, an instance of `ARSCNFaceGeometry`? Please consider asking a separate question with more detail. – warrenm Jun 19 '20 at 20:07
  • Is it possible to change the layout with a new MDLVertexDescriptor while creating a new MDLMesh / MTKMesh from the original MDLMesh? – iaomw Aug 11 '20 at 13:19
  • You can set the `vertexDescriptor` property on an `MDLMesh` to alter its attributes and layout. This can be a costly operation, since it may allocate new buffers internally, but it is an easy way to "reshape" the vertex data as your application requires. – warrenm Sep 11 '20 at 06:20
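The per-frame capture suggested in the comments above could be sketched roughly as follows. This is illustrative only: `makeAsset(from:)` is a hypothetical helper standing in for the `MDLMesh`/`MDLAsset` construction shown in the answer, and the file-naming scheme is arbitrary:

```swift
import ARKit
import ModelIO

class FaceCaptureDelegate: NSObject, ARSessionDelegate {
    private var frameIndex = 0
    private let documentsURL = FileManager.default.urls(for: .documentDirectory,
                                                        in: .userDomainMask).first!

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            frameIndex += 1
            // USDC is far more compact than OBJ for a long sequence of meshes.
            let url = documentsURL.appendingPathComponent("face-\(frameIndex).usdc")
            // `makeAsset(from:)` is assumed to wrap the MDLAsset construction
            // from the accepted answer (buffers, descriptor, submesh, normals).
            let asset = makeAsset(from: faceAnchor.geometry)
            try? asset.export(to: url)
        }
    }
}
```

Writing a file every frame (60 fps) is I/O-heavy, so in practice you would likely throttle the capture rate or buffer geometry in memory and export in batches.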