iOS - Could not open OBJ file when converting MDLAsset to MDLMesh

I fixed my issue. The problem was that I converted the file before the download had finished: the local path was created, but the data was empty because the download was still in progress.

To solve it, I wait asynchronously for the download to finish and only then convert the file.

let destination: DownloadRequest.DownloadFileDestination = { _, _ in
    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let fileURL = documentsURL.appendingPathComponent("myVase.obj")
    return (fileURL, [.removePreviousFile, .createIntermediateDirectories])
}

Alamofire.download(urlString, to: destination).response { response in
    // Only touch the file once the download has completed successfully
    if response.error == nil, let filePath = response.destinationURL?.path {
        print(filePath)
        let myUrl = URL(fileURLWithPath: filePath)

        let asset = MDLAsset(url: myUrl)
        guard let object = asset.object(at: 0) as? MDLMesh else {
            fatalError("Failed to get mesh from asset.")
        }
        ...
    }
}
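If the goal is to display the loaded mesh in SceneKit afterwards, a minimal follow-up could look like this (my sketch, not part of the original snippet; sceneView is an assumed SCNView):

import SceneKit.ModelIO

// Wrap the Model I/O mesh in a SceneKit node and show it
let node = SCNNode(mdlObject: object)          // `object` is the MDLMesh obtained above
sceneView.scene?.rootNode.addChildNode(node)   // `sceneView` is an assumed SCNView outlet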

Could not open OBJ file in SceneKit

Your issue is that you open the file before the download has finished.
To solve it, download the file asynchronously and only open it once the download completes.

You can see my answer here: iOS - Could not open OBJ file when converting MDLAsset to MDLMesh
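If you prefer not to depend on Alamofire, a rough equivalent with URLSession looks like this (a sketch; the remote URL and file name are placeholders):

import Foundation
import ModelIO

let remoteURL = URL(string: "https://example.com/model.obj")!   // placeholder URL

let task = URLSession.shared.downloadTask(with: remoteURL) { tempURL, _, error in
    guard error == nil, let tempURL = tempURL else { return }

    let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask)[0]
    let destinationURL = documentsURL.appendingPathComponent("model.obj")   // placeholder file name

    // Move the finished download into place, then open it
    try? FileManager.default.removeItem(at: destinationURL)
    try? FileManager.default.moveItem(at: tempURL, to: destinationURL)

    let asset = MDLAsset(url: destinationURL)
    print(asset)
}
task.resume()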

Import 3D model in SceneKit on iOS

I think you are exceeding some internal iOS limit with your OBJ file. Please file a report at https://bugreport.apple.com.

This slightly modified version of your code works perfectly in a macOS playground (Xcode 8.0). But in an iOS playground, I see the same "Could not open OBJ file" in the console.

import SceneKit
import ModelIO
import SceneKit.ModelIO

if let url = URL(string: "https://cloud.box.com/shared/static/ock9d81kakj91dz1x4ea.obj") {
    let asset = MDLAsset(url: url)
    print(asset)
    let object = asset.object(at: 0)
    print(object)
    let node = SCNNode(mdlObject: object)
    print(node)
}

I was able to download and open the OBJ file with Xcode. Then, within the scene editor, I converted it to SCN format. That gave me a .scn file that could be embedded in the iOS project and opened with SCNScene (like the famous spinning spaceship). So if you can live with embedding a static file in your iOS app, that's a way to get your model in. But if you need dynamically loaded models, it won't work.
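Loading that embedded .scn file is then the standard SCNScene route; a minimal sketch (assuming the converted file is named vase.scn and sceneView is your SCNView):

import SceneKit

// Load the pre-converted scene that ships inside the app bundle
if let scene = SCNScene(named: "art.scnassets/vase.scn") {
    sceneView.scene = scene
}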

Can't feed an MDLMesh container with a 3D model as SCNGeometry

This should work:

var scene: SCNScene!

if let filePath = Bundle.main.path(forResource: "Helicopter",
                                   ofType: "usdz",
                                   inDirectory: "art.scnassets") {

    let refURL = URL(fileURLWithPath: filePath)
    let mdlAsset = MDLAsset(url: refURL)
    scene = SCNScene(mdlAsset: mdlAsset)
}

SCNReferenceNode only works for .scn files. You can then get the geometry from a child node of the rootNode of the scene.

let helicopterNode = scene.rootNode.childNode(withName: "helicopter", recursively: true)
let geometry = helicopterNode!.geometry!

Edit

Using one of the files from the AR Quick Look Gallery, I managed to get this code to work. The main problem I had was with the name of the specific child node: there was one called "RetroTV", but it did not have any geometry attached to it; it was just the parent of both "RetroTVBody" and "RetroTVScreen". The only remaining problem is that it isn't loading the textures for the geometry.

var scene: SCNScene!

if let filePath = Bundle.main.path(forResource: "retrotv",
                                   ofType: "usdz",
                                   inDirectory: "art.scnassets") {

    let refURL = URL(fileURLWithPath: filePath)
    let mdlAsset = MDLAsset(url: refURL)
    scene = SCNScene(mdlAsset: mdlAsset)

    let tvNode = scene.rootNode.childNode(withName: "RetroTVBody", recursively: true)
    let geometry = tvNode!.geometry!

} else {

    print("invalid path!")

}

The above code also works with the tvNode and geometry declarations outside of the if let statement.
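For example, that variant might look like this (a sketch of the same code with the declarations hoisted out):

var scene: SCNScene!
var tvNode: SCNNode?
var geometry: SCNGeometry?

if let filePath = Bundle.main.path(forResource: "retrotv",
                                   ofType: "usdz",
                                   inDirectory: "art.scnassets") {

    let mdlAsset = MDLAsset(url: URL(fileURLWithPath: filePath))
    scene = SCNScene(mdlAsset: mdlAsset)

    tvNode = scene.rootNode.childNode(withName: "RetroTVBody", recursively: true)
    geometry = tvNode?.geometry

} else {

    print("invalid path!")

}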

ModelIO Framework Not Working

I've also been experiencing similar issues. Although I'm a rookie in Metal, I figured a few things out.

I was trying to import the Melitta teapot, but I was getting an "explosion" of faces instead of the iconic tea-brewing device. The solution came to me after reading the documentation for MDLVertexBufferLayout, which reads:

A mesh may store vertex data in either a structure of arrays model,
where data for each vertex attribute (such as vertex position or
surface normal) lies in a separate vertex buffer, or in an array of
structures model, where multiple vertex attributes share the same
buffer.

  • In a structure of arrays, the mesh’s vertexBuffers array contains
    several MDLMeshBuffer objects, and the mesh’s vertexDescriptor object
    contains a separate MDLVertexBufferLayout object for each buffer.

  • In an array of structures, the mesh contains a single vertex buffer,
    and its descriptor contains a single vertex buffer layout object. To
    identify which bytes in the buffer refer to which vertices and vertex
    attributes, use the layout’s stride together with the format and
    offset properties of the descriptor’s vertex attributes.

Looking at the .layouts and .attributes properties of the default MDLVertexDescriptor, it creates one buffer per attribute type (the first case in the quote above), whereas I wanted to use the interleaved mode.

I manually set up the .layouts and .attributes with my own arrays and then, voilà, I got... half a Melitta pot?

Half-baked implementation... get it!?
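For reference, the interleaved Vertex struct that the descriptor below assumes would look roughly like this (the original struct isn't shown in the question, so the layout is my guess):

import simd

// One interleaved vertex: position followed by normal,
// matching the two float4 attributes set up below
struct Vertex
{
    var position: float4
    var normal: float4
}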

class func setup(meshWithDevice device: MTLDevice) -> MTKMesh
{
    // Allocator
    let allocator = MTKMeshBufferAllocator(device: device)

    // Vertex descriptor, tells the MDLAsset how to lay out the buffers
    let vertexDescriptor = MDLVertexDescriptor()

    // Vertex buffer layout, tells how many buffers will be used and the stride of their structs
    // (the init(stride: Int) crashes in the Beta)
    let vertexLayout = MDLVertexBufferLayout()
    vertexLayout.stride = MemoryLayout<Vertex>.size

    // Apply the layouts
    vertexDescriptor.layouts = [vertexLayout]

    // Apply the attributes, in my case position and normal (float4 x2)
    vertexDescriptor.attributes =
    [
        MDLVertexAttribute(name: MDLVertexAttributePosition, format: MDLVertexFormat.float4, offset: 0, bufferIndex: 0),
        MDLVertexAttribute(name: MDLVertexAttributeNormal, format: MDLVertexFormat.float4, offset: MemoryLayout<float4>.size, bufferIndex: 0)
    ]

    var error: NSError? = nil

    // Load the teapot
    let asset = MDLAsset(url: Bundle.main.url(forResource: "teapot", withExtension: "obj")!, vertexDescriptor: vertexDescriptor, bufferAllocator: allocator, preserveTopology: true, error: &error)

    if let error = error
    {
        print(error)
    }

    // Obtain the teapot mesh
    let teapotModel = asset.object(at: 0) as! MDLMesh

    // Convert into a MetalKit mesh, instead of Model I/O
    let teapot = try! MTKMesh(mesh: teapotModel, device: device)

    return teapot
}

(Swift 3.0 in Xcode 8 Beta 6)
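A minimal usage sketch (MeshLoader is just a placeholder name for whatever type holds this method):

import MetalKit

let device = MTLCreateSystemDefaultDevice()!                // requires a Metal-capable device
let teapotMesh = MeshLoader.setup(meshWithDevice: device)   // MeshLoader is a hypothetical enclosing class
print(teapotMesh.vertexCount)                               // sanity check that the mesh loaded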

I'll update my post if I manage to render the whole thing.

Edit: Works Now

Holy shi---

Welp, the bug was on my end; I got the index count wrong:

//// Buffers
renderPass.setVertexBuffer(mesh.vertexBuffers[0].buffer, offset: 0, at: 0)
renderPass.setVertexBuffer(uniformBuffer, offset: 0, at: 1)

let submesh = mesh.submeshes[0]
let indexSize = submesh.indexType == .uInt32 ? 4 : 2

//// Draw Indices
renderPass.drawIndexedPrimitives(type: submesh.primitiveType,
                                 indexCount: submesh.indexBuffer.length / indexSize,
                                 indexType: submesh.indexType,
                                 indexBuffer: submesh.indexBuffer.buffer,
                                 indexBufferOffset: 0)

The problem was with let indexSize = submesh.indexType == .uInt32 ? 4 : 2; before, I had 32 : 16 on the right-hand side, but the .length property is in bytes, not bits. For example, a submesh with 3,000 UInt32 indices has an indexBuffer.length of 12,000 bytes, so the index count is 12,000 / 4 = 3,000. So dumb.

Anyway, I managed to load an OBJ file with Metal, so your problem is either the per-attribute buffering I mentioned above or an entirely different issue in your code.


