Which file format is better to use for 3D models in SceneKit/ARKit?

DAE (Digital Asset Exchange, aka Collada) is a vendor-neutral format for 3D assets. It supports a wide range of features that exist in multiple 3D authoring and presentation tools, but not every possible feature in SceneKit. Historically, it was the only asset format for early versions of SceneKit.

SCN format is a serialization of the SceneKit object graph. (There are convenience methods for reading/writing it on SCNScene, but really it's the same thing you get by passing an SCNScene to NSKeyedArchiver/NSKeyedUnarchiver.) Thus, it by definition supports all features of SceneKit, including physics, constraints, actions, physically based cameras, and shader modifiers.
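
This equivalence is easy to see in code. A minimal sketch, assuming a hypothetical asset named `ship.scn` in the app bundle, showing both the convenience API and the archiver route:

```swift
import SceneKit

// "art.scnassets/ship.scn" is a hypothetical asset name.
let scene = SCNScene(named: "art.scnassets/ship.scn")!

// Convenience API: serialize the scene graph out as an .scn file.
let url = FileManager.default.temporaryDirectory.appendingPathComponent("ship.scn")
scene.write(to: url, options: nil, delegate: nil, progressHandler: nil)

// Equivalent route: the .scn format is just NSKeyedArchiver output.
let data = try NSKeyedArchiver.archivedData(withRootObject: scene,
                                            requiringSecureCoding: false)
let restored = try NSKeyedUnarchiver.unarchivedObject(ofClass: SCNScene.self,
                                                      from: data)
```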

If you're using DAE assets, deploying to iOS (or tvOS or watchOS), and not seeing any difference vs using SCN assets, there are two possible reasons:

  • Your assets use only those SceneKit features that are available in DAE format.
  • When deploying to iOS/tvOS/watchOS, Xcode (via scntool) automatically converts all 3D asset resources to SCN format. (And applies other transformations, like interleaving geometry buffers, for optimal rendering performance on iOS/tvOS/watchOS devices.) The filename in the built app's Resources directory still has a .dae extension, but the file contents are the same as SCN format.

    (SceneKit running in iOS/tvOS/watchOS actually can't read DAE, so it relies on this preprocessing by Xcode.)
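
If you want to run that conversion by hand, Xcode ships the scntool binary. The invocation below is a sketch of the commonly cited usage (the exact flags may vary between Xcode versions, and the file names are hypothetical):

```shell
# Manually perform the same DAE-to-SCN conversion Xcode runs at build time
xcrun scntool --convert ship.dae --format scn --output ship.scn
```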

What 3D model formats are supported by ARKit?

DAE and OBJ/MTL are supported out of the box, in the sense that you can just drop the files into the .scnassets folder and Xcode will handle them for you. Personally, I had fewer issues with OBJ/MTL, but I'm not well versed in 3D.
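
In code, both formats load through the same API once they're in a .scnassets folder. A sketch, with hypothetical asset names and assuming `sceneView` is your ARSCNView:

```swift
import SceneKit

// Hypothetical asset names — DAE and OBJ load identically.
let daeScene = SCNScene(named: "art.scnassets/chair.dae")
let objScene = SCNScene(named: "art.scnassets/chair.obj")

// Attach the loaded model to the AR view's scene
// (sceneView is assumed to be an ARSCNView).
if let node = objScene?.rootNode.childNodes.first {
    sceneView.scene.rootNode.addChildNode(node)
}
```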

The documentation for Model I/O states that you can import 3D assets from the following file formats:

The set of supported formats includes Alembic (.abc), Wavefront Object
(.obj), Polygon (.ply), and Standard Tessellation Language (.stl).
Additional formats may be supported as well.

I've not worked with this framework though, so I can't tell you how well it works with ARKit.
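
For what it's worth, Model I/O assets can be bridged into SceneKit. A sketch, assuming a hypothetical `scan.ply` file bundled with the app:

```swift
import ModelIO
import SceneKit
import SceneKit.ModelIO

// "scan.ply" is a hypothetical bundled asset.
let url = Bundle.main.url(forResource: "scan", withExtension: "ply")!
let asset = MDLAsset(url: url)

// Bridge the Model I/O asset into a SceneKit scene.
let scene = SCNScene(mdlAsset: asset)
```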

You may also want to have a look at AssimpKit, which can convert several formats into native .scn SceneKit scenes.

3D model formats in ARKit / ARCore development

Updated: May 12, 2022.


SceneKit

Apple's SceneKit framework handles 3D models for ARKit and VR apps. SceneKit supports the following 3D assets with corresponding material files:

  • .dae (with or without animations)
  • .obj (single-frame) with its texture and .mtl file
  • .abc (only single-frame supported)
  • .usdz (with or without animations)
  • .scn (SceneKit's native scene format)


RealityKit

Apple's RealityKit framework also handles 3D models for ARKit, AR, and VR apps. You can prototype content for RealityKit in a standalone app called Reality Composer. RealityKit supports the following 3D assets:

  • .usdz (with or without animations)
  • .reality (with or without animations and dynamics) – optimized for much faster loading
  • .rcproject (with or without animations and dynamics)
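
Loading these in code looks roughly like this. A sketch, where "robot.usdz" is a hypothetical bundled asset, "Experience"/"Box" are placeholder Reality Composer names, and `arView` is assumed to be an ARView:

```swift
import RealityKit

// Load a .usdz model bundled with the app ("robot.usdz" is hypothetical).
let robot = try Entity.loadModel(named: "robot")

// .rcproject scenes come in through Xcode-generated Swift code; a project
// named "Experience" with a scene "Box" would generate Experience.loadBox().
let boxAnchor = try Experience.loadBox()
arView.scene.anchors.append(boxAnchor)
```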

Additionally, you can use the usdzconvert command-line tool in Terminal to produce .usdz files from the following formats:

  • .obj
  • .glTF
  • .fbx
  • .abc
  • .usda
  • .usdc
  • .usd
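
A typical invocation looks like this (a sketch assuming Apple's USD Python tools are installed; the file names and texture are hypothetical):

```shell
# FBX in, .usdz out, wiring a hypothetical albedo map into the material
usdzconvert chair.fbx chair.usdz -diffuseColor chair_albedo.jpg

# glTF works the same way; the output name defaults to chair.usdz
usdzconvert chair.gltf
```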

And, of course, you can use the Reality Converter app with its simple GUI.


Sceneform

Sadly, since June 2020 Sceneform has been archived and is no longer maintained by Google.

Google's Sceneform handles 3D models for the ARCore SDK. Sceneform supports the following 3D assets with their material dependencies:

  • .obj (with its .mtl dependency)
  • .glTF (animations not supported)
  • .fbx (with or without animations)
  • .sfa (ASCII asset definition, deprecated in Sceneform 1.16)
  • .sfb (binary asset definition, deprecated in Sceneform 1.16)

SceneKit, RealityKit, Sceneform and Reality Composer support Physically Based Rendering.


ARKit and ARCore

But what's the role of ARKit and ARCore then?

These two AR modules don't handle importing or rendering 3D geometry themselves. They are only responsible for tracking (world, image, face, geo, etc.) and scene understanding (i.e. plane detection, hit-testing and raycasting, depth perception, light estimation, and geometry reconstruction).

What are the best settings for ARKit 3D models?

I have found that if your object is, for instance, 50 cm tall in Cinema 4D, you need to export it as .dae (version 1.4, important!) and then select 0.001 meters as the scale.

Also, make sure that you position the object at 0, 0, 0 (even if you want it positioned somewhere else in the end, centering it will make editing & repositioning in Xcode easier).

After importing, click the .dae file inside Xcode. Select the object in the scene, open the inspector panel on the right, and navigate to the identity inspector.

Then check "is movable". This will convert the file to .scn, which is better for editing. ("Is movable" should still be unchecked after the conversion; if it isn't, uncheck it, or leave it as is.)

Then open the new .scn file and click your object. You'll see its scale on the right in the identity inspector. If it's still not right, you can change the scale of the bounding box there and also position the object in the scene.
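
If you'd rather fix the scale in code than in the editor, here's a sketch (assuming a hypothetical node named "model" in a hypothetical "model.scn"):

```swift
import SceneKit

let scene = SCNScene(named: "art.scnassets/model.scn")!
if let node = scene.rootNode.childNode(withName: "model", recursively: true) {
    node.scale = SCNVector3(0.001, 0.001, 0.001) // centimeters -> meters
    node.position = SCNVector3(0, 0, 0)          // keep it centered at the origin
}
```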

Loading huge animated 3D model in SceneKit causes memory issues

High-poly models with huge texture maps are not suitable for a robust AR experience. Augmented reality frameworks such as ARKit and ARCore are already very processor-intensive, so there's no need to put an additional burden on the CPU, GPU, and memory.

Why are ARKit apps so CPU-intensive?

Your ARKit app uses four sensors to track the surrounding environment at 60 fps, simultaneously renders (with the help of SceneKit or RealityKit) your animated 3D model with all its textures, lights, and shadows, and then composites, in real time, a 2D render of your model (in an RGBAZ pattern) over the high-res RGB video from the rear camera. That's a lot for your device, isn't it?

Hence, high-poly models with huge textures not only cause memory and CPU/GPU issues but also drain your battery very quickly. And please take into consideration that the iPhone X has only 3 GB of RAM, of which iOS uses more than 1 GB, so memory issues are quite possible in your particular case.

So, my recommendations for creating a 3D model for a robust AR experience are the following:

  • Low-poly geometry (around 10,000 polygons per model is usually fine)
  • UV-mapped texture resolution of no more than 1024 × 1024 px
  • Preferably, pre-baked UV-mapped shadows for static elements
  • JPEG format with 0% compression for textures (PNG files are larger)
  • Not too many PBR shaders (with a metalness property) in your scene
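
A lightweight material following these guidelines might look like this (a sketch; the texture file names are hypothetical):

```swift
import SceneKit

let material = SCNMaterial()
material.lightingModel = .blinn                  // cheaper than .physicallyBased
material.diffuse.contents = "wood_1024.jpg"      // 1024 × 1024 JPEG texture
material.multiply.contents = "shadows_baked.jpg" // pre-baked shadows/AO layer
```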

