Save ARFaceGeometry to OBJ file

The core issue is that your vertex data isn't described correctly. When you provide a vertex descriptor to Model I/O while constructing a mesh, it represents the layout the data actually has, not your desired layout. You're supplying two vertex buffers, but your vertex descriptor describes an interleaved data layout with only one vertex buffer.

The easiest way to remedy this is to fix the vertex descriptor to reflect the data you're providing:

let vertexDescriptor = MDLVertexDescriptor()

// Attributes: positions in buffer 0, texture coordinates in buffer 1
vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                    format: .float3,
                                                    offset: 0,
                                                    bufferIndex: 0)
vertexDescriptor.attributes[1] = MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                                                    format: .float2,
                                                    offset: 0,
                                                    bufferIndex: 1)

// Layouts: one tightly packed layout per buffer
vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: MemoryLayout<float3>.stride)
vertexDescriptor.layouts[1] = MDLVertexBufferLayout(stride: MemoryLayout<float2>.stride)

When you later call addNormals(...), Model I/O will allocate the necessary space and update the vertex descriptor to reflect the new data. Since you're not rendering from the data and are instead immediately exporting it, the internal layout it chooses for the normals isn't important.
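For context, a minimal end-to-end sketch of the export path might look like the following. The function name, the crease threshold, and the output URL handling are my own choices rather than part of the original code; the Model I/O calls themselves are the standard ones.

import ARKit
import ModelIO

func exportFaceGeometry(_ faceGeometry: ARFaceGeometry, to url: URL) throws {
    let allocator = MDLMeshBufferDataAllocator()

    // Wrap ARKit's two separate arrays as Model I/O vertex buffers
    let positionData = Data(bytes: faceGeometry.vertices,
                            count: faceGeometry.vertices.count * MemoryLayout<SIMD3<Float>>.stride)
    let texcoordData = Data(bytes: faceGeometry.textureCoordinates,
                            count: faceGeometry.textureCoordinates.count * MemoryLayout<SIMD2<Float>>.stride)
    let vertexBuffers = [allocator.newBuffer(with: positionData, type: .vertex),
                         allocator.newBuffer(with: texcoordData, type: .vertex)]

    // One triangle submesh over the shared index list
    let indexData = Data(bytes: faceGeometry.triangleIndices,
                         count: faceGeometry.triangleIndices.count * MemoryLayout<Int16>.stride)
    let submesh = MDLSubmesh(indexBuffer: allocator.newBuffer(with: indexData, type: .index),
                             indexCount: faceGeometry.triangleIndices.count,
                             indexType: .uInt16,
                             geometryType: .triangles,
                             material: nil)

    // The two-buffer vertex descriptor shown above
    let vertexDescriptor = MDLVertexDescriptor()
    vertexDescriptor.attributes[0] = MDLVertexAttribute(name: MDLVertexAttributePosition,
                                                        format: .float3, offset: 0, bufferIndex: 0)
    vertexDescriptor.attributes[1] = MDLVertexAttribute(name: MDLVertexAttributeTextureCoordinate,
                                                        format: .float2, offset: 0, bufferIndex: 1)
    vertexDescriptor.layouts[0] = MDLVertexBufferLayout(stride: MemoryLayout<SIMD3<Float>>.stride)
    vertexDescriptor.layouts[1] = MDLVertexBufferLayout(stride: MemoryLayout<SIMD2<Float>>.stride)

    let mesh = MDLMesh(vertexBuffers: vertexBuffers,
                       vertexCount: faceGeometry.vertices.count,
                       descriptor: vertexDescriptor,
                       submeshes: [submesh])

    // Model I/O allocates space for the normals and updates the descriptor
    mesh.addNormals(withAttributeNamed: MDLVertexAttributeNormal, creaseThreshold: 0.5)

    let asset = MDLAsset()
    asset.add(mesh)
    try asset.export(to: url)  // the .obj format is chosen by the file extension
}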

(Image: a correctly exported ARKit face mesh)

ARKit – How to convert a PNG file into an OBJ file?

There's no way to convert a 2D .png raster file into a 3D .obj geometry file. The two formats are as different as a green apple and a small button: one stores pixels, the other stores geometry.

However, the simplest way to see your image in ARKit's or SceneKit's 3D environment is to assign it to a 3D plane as a texture. Here's how you can do it:

@IBOutlet var sceneView: ARSCNView!

override func viewDidLoad() {
    super.viewDidLoad()

    sceneView.scene = SCNScene()

    // Run a plain world-tracking session
    let config = ARWorldTrackingConfiguration()
    sceneView.session.run(config)

    // Use the PNG as the diffuse texture of a 1 x 1 m plane
    let node = SCNNode()
    node.geometry = SCNPlane(width: 1.0, height: 1.0)
    node.geometry?.firstMaterial?.isDoubleSided = true
    node.geometry?.firstMaterial?.diffuse.contents = UIImage(named: "dir/image.png")
    sceneView.scene.rootNode.addChildNode(node)
}
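The plane will sit at the world origin, which is where the session's camera starts, so you'll likely want to push it out in front of the camera, and you may want to match its proportions to the image. A small, optional addition before the addChildNode call (the values here are arbitrary):

if let image = node.geometry?.firstMaterial?.diffuse.contents as? UIImage,
   let plane = node.geometry as? SCNPlane {
    // Keep the image's aspect ratio: width stays fixed, height follows the image
    plane.height = plane.width * image.size.height / image.size.width
}
// Place the plane half a meter in front of the camera's starting pose
node.position = SCNVector3(0, 0, -0.5)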

Efficiently updating ARSCNFaceGeometry from a set of blend shapes

I could not track down the ARFaceGeometry performance problems. To work around this, I decided to build my own model from ARFaceGeometry instead.

First I generated a model file from ARFaceGeometry. This file includes the base geometry as well as the geometry with each individual blend shape applied:

let LIST_OF_BLEND_SHAPES: [ARFaceAnchor.BlendShapeLocation] = [
    .eyeBlinkLeft,
    .eyeLookDownLeft,
    // ... fill in rest
]

func printFaceModelJson() {
    // Get the geometry without any blend shapes applied
    let base = ARFaceGeometry(blendShapes: [:])!

    // First print out a single copy of the triangle indices.
    // These are shared between all of the models.
    let indexList = base.triangleIndices.map({ "\($0)" }).joined(separator: ",")
    print("indexList: [\(indexList)]")

    // Then print the starting geometry (i.e. no blend shapes applied)
    printFaceNodeJson(blendShape: nil)

    // And print the model with each blend shape applied at full weight
    for blend in LIST_OF_BLEND_SHAPES {
        printFaceNodeJson(blendShape: blend)
    }
}

func printFaceNodeJson(blendShape: ARFaceAnchor.BlendShapeLocation?) {
    let geometry = ARFaceGeometry(blendShapes: blendShape != nil ? [blendShape!: 1.0] : [:])!

    // Flatten each vertex into x, y, z components
    let vertices = geometry.vertices.flatMap({ v in [v[0], v[1], v[2]] })
    let vertexList = vertices.map({ "\($0)" }).joined(separator: ",")
    print("{ \"blendShape\": \(blendShape != nil ? "\"" + blendShape!.rawValue + "\"" : "null"), \"vertices\": [\(vertexList)] }")
}

I ran this code offline to generate the model file (quickly converting the output by hand into proper JSON). You could also use a proper 3D model file format, which would likely produce a smaller model file.
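For reference, the hand-assembled file might look roughly like this; the wrapper keys ("indices", "blends") are whatever you choose during that conversion, and the loader sketch further down assumes exactly these:

{
  "indices": [0, 1, 2, ...],
  "blends": [
    { "blendShape": null, "vertices": [0.013, -0.028, 0.051, ...] },
    { "blendShape": "eyeBlinkLeft", "vertices": [0.013, -0.028, 0.049, ...] }
  ]
}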

Then, in my app, I reconstruct the model from the JSON model file:

class ARMaskFaceModel {

    let node: SCNNode

    init() {
        let data = loadMaskJsonDataFromFile() // see the sketch below

        let elements = [SCNGeometryElement(indices: data.indices, primitiveType: .triangles)]

        // Create the base (neutral) geometry
        let baseGeometryData = data.blends[0]
        let geometry = SCNGeometry(sources: [
            SCNGeometrySource(vertices: baseGeometryData.vertices)
        ], elements: elements)

        node = SCNNode(geometry: geometry)

        // Then load each of the blend shape geometries into a morpher
        let morpher = SCNMorpher()
        morpher.targets = data.blends.dropFirst().map({ x in
            SCNGeometry(sources: [
                SCNGeometrySource(vertices: x.vertices)
            ], elements: elements)
        })
        node.morpher = morpher
    }

    /// Apply the given blend shape weights to the model
    func update(blendShapes: [ARFaceAnchor.BlendShapeLocation: NSNumber]) {
        var i = 0
        for blendShape in LIST_OF_BLEND_SHAPES {
            // Guard against a mismatch between the list and the loaded targets
            if i >= node.morpher?.targets.count ?? 0 {
                return
            }
            node.morpher?.setWeight(CGFloat(truncating: blendShapes[blendShape] ?? 0.0), forTargetAt: i)
            i += 1
        }
    }
}
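The loading function was left unimplemented above; here's a minimal sketch of one way to write it, assuming the JSON layout shown earlier and a bundled file named faceModel.json (both the file name and the wrapper keys are my assumptions):

import SceneKit

// JSON-side types matching the printed output
struct FaceBlendJson: Codable {
    let blendShape: String?   // null for the base (neutral) geometry
    let vertices: [Float]     // flattened x, y, z triples
}

struct FaceModelJson: Codable {
    let indices: [Int16]      // the shared triangle index list
    let blends: [FaceBlendJson]
}

// In-memory types used by ARMaskFaceModel
struct FaceBlendData {
    let blendShape: String?
    let vertices: [SCNVector3]
}

struct FaceModelData {
    let indices: [Int16]
    let blends: [FaceBlendData]
}

func loadMaskJsonDataFromFile() -> FaceModelData {
    // Force-unwrapping here treats a missing or malformed model file
    // as a programmer error rather than a recoverable condition
    let url = Bundle.main.url(forResource: "faceModel", withExtension: "json")!
    let json = try! JSONDecoder().decode(FaceModelJson.self, from: Data(contentsOf: url))

    // Re-group the flattened floats into SCNVector3s for SceneKit
    let blends = json.blends.map { blend in
        FaceBlendData(
            blendShape: blend.blendShape,
            vertices: stride(from: 0, to: blend.vertices.count, by: 3).map {
                SCNVector3(blend.vertices[$0], blend.vertices[$0 + 1], blend.vertices[$0 + 2])
            })
    }
    return FaceModelData(indices: json.indices, blends: blends)
}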

Not ideal, but it works OK and performs much better, even without any optimization. We're now pretty consistently at 60 fps. Plus it works on older phones too! (Although printFaceModelJson must be run on a phone that supports real-time face tracking.)
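For completeness, here's a sketch of how the model can be driven from ARKit's face tracking callbacks; the view controller wiring and names are illustrative, not part of the original code:

import ARKit
import SceneKit

class FaceViewController: UIViewController, ARSCNViewDelegate {
    @IBOutlet var sceneView: ARSCNView!
    let maskModel = ARMaskFaceModel()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Attach the custom face node once ARKit detects the face
    func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
        guard anchor is ARFaceAnchor else { return }
        node.addChildNode(maskModel.node)
    }

    // Re-apply the tracked blend shape weights on every frame update
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor else { return }
        maskModel.update(blendShapes: faceAnchor.blendShapes)
    }
}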


