The GARAugmentedFace class exposes a mesh that holds the 3D vertex positions. For example, in the SCNSceneRendererDelegate method:
if let face = frame.face {
    print(face.mesh.vertices[467])
}
https://developers.google.com/ar/reference/ios/interface_g_a_r_augmented_face_mesh
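For context, here is a minimal sketch of that delegate callback, assuming the setup from the ARCore Augmented Faces sample app, where faceSession is a GARAugmentedFaceSession held by the view controller:
// Sketch only: faceSession is assumed to come from the sample app's session setup.
func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
    guard let frame = faceSession?.currentFrame else { return }
    if let face = frame.face {
        // An arbitrary vertex; valid indices run from 0 to face.mesh.vertexCount - 1.
        print(face.mesh.vertices[467])
    }
}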
All of the vertex positions are the default ones, i.e. those of the canonical_face_mesh. If you want the vertex positions of the individual face seen by the camera, you need to get that information from the SceneKit geometry, which is conveniently stored in the faceTextureNode. It is somewhat less convenient to extract the data.
faceTextureNode.geometry = faceMeshConverter.geometryFromFace(face)
faceVertices = vertexPositionFromSCNGeometry(geometry: faceTextureNode.geometry!)
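As a quick (hypothetical) sanity check, you can compare the canonical position of a vertex with the position recovered from the geometry:
let canonical = face.mesh.vertices[467]  // canonical position (per the note above)
let tracked = faceVertices[467]          // position from the tracked face's geometry
print("canonical: \(canonical), tracked: \(tracked)")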
private func vertexPositionFromSCNGeometry(geometry: SCNGeometry) -> [SCNVector3] {
    let geometrySources = geometry.sources(for: SCNGeometrySource.Semantic.vertex)
    var geometryVertices = [SCNVector3]()
    if let geometrySource = geometrySources.first {
        let stride = geometrySource.dataStride
        ...
            let vector = SCNVector3Make(buffer[0], buffer[1], buffer[2])
            return vector
        })
        geometryVertices = vertices
    }
    return geometryVertices
}
The rest of the function can be filled in from the detailed answer here:
https://stackoverflow.com/a/66748865/14437826
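For reference, here is a sketch of what the completed function might look like, following that pattern; the exact buffer-reading details below are assumptions, so defer to the linked answer if they differ:

import SceneKit

private func vertexPositionFromSCNGeometry(geometry: SCNGeometry) -> [SCNVector3] {
    let geometrySources = geometry.sources(for: SCNGeometrySource.Semantic.vertex)
    var geometryVertices = [SCNVector3]()
    if let geometrySource = geometrySources.first {
        let stride = geometrySource.dataStride
        let offset = geometrySource.dataOffset
        let componentsPerVector = geometrySource.componentsPerVector
        let bytesPerVector = componentsPerVector * geometrySource.bytesPerComponent
        let vertices = (0..<geometrySource.vectorCount).map({ index -> SCNVector3 in
            // Slice out this vertex's bytes and read them as three floats (x, y, z).
            let start = index * stride + offset
            let end = start + bytesPerVector
            return geometrySource.data.subdata(in: start..<end).withUnsafeBytes { rawBuffer -> SCNVector3 in
                let buffer = rawBuffer.bindMemory(to: Float.self)
                let vector = SCNVector3Make(buffer[0], buffer[1], buffer[2])
                return vector
            }
        })
        geometryVertices = vertices
    }
    return geometryVertices
}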
The vertex positions are relative to the face's center transform. If you want the raw locations, you have to compute them yourself using the center transform (face.centerTransform).
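A sketch of that calculation, assuming face is the GARAugmentedFace and faceVertices holds the positions extracted above:
// Sketch: apply the center transform to each face-local vertex.
let centerTransform = face.centerTransform  // simd_float4x4
let rawVertices = faceVertices.map { vertex -> simd_float3 in
    let local = simd_float4(Float(vertex.x), Float(vertex.y), Float(vertex.z), 1)
    let transformed = centerTransform * local
    return simd_float3(transformed.x, transformed.y, transformed.z)
}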