Render a 3D Model (Hair) with Semi-Transparent Texture in Scenekit

Add texture to some part of 3D model

If each part of your model is a separate 3D object, you can easily change its texture in Xcode. But if you have a mono-model (a so-called one-piece model), you can change the texture of a specific part of it only via UV-mapping (which you can do in 3ds Max, Maya, Cinema 4D, Blender, etc.).

Here is how UV-mapped textures look:

Sample Image

Sample Image

Note that Xcode's Scene graph editor isn't suitable for UV-mapping at all. In Xcode you can only apply pre-made UV-mapped textures.
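Once the hair part of the model has its own material (via UV-mapping or a separate sub-mesh), a semi-transparent hair texture can be set up in code. This is a minimal sketch, assuming a hypothetical `hair_diffuse.png` texture whose alpha channel carries the transparency, and a `hairNode` that refers to the hair geometry:

```swift
import SceneKit

// Sketch: a material for hair strands with alpha transparency in the texture.
let hairMaterial = SCNMaterial()
hairMaterial.diffuse.contents = "hair_diffuse.png"  // hypothetical UV-mapped texture
hairMaterial.transparencyMode = .dualLayer          // render back faces, then front faces
hairMaterial.blendMode = .alpha                     // standard alpha blending
hairMaterial.isDoubleSided = true                   // hair cards are visible from both sides
hairMaterial.writesToDepthBuffer = false            // reduces hard depth-sorting artifacts

// Apply it to the node whose geometry uses the hair UVs (name is illustrative):
// hairNode.geometry?.firstMaterial = hairMaterial
```

`.dualLayer` makes SceneKit render the geometry in two passes (back faces first), which usually gives the most plausible result for thin, overlapping semi-transparent surfaces like hair.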

Importing .obj files with textures in SceneKit

Did you check the file path to the .mtl file in the .obj file?
If this path is incorrect then the materials will not load.
OBJ and MTL files are text files so you can use a text editor to open them.
The path to the .mtl file should be at the top of the .obj file:

mtllib mymtlfile.mtl

If they are in the same folder you can just strip the path.

If this path is okay then you should check the paths to the textures in the .mtl file. Look for lines starting with map_. For instance:

map_Kd mydiffusetexture.png
map_Ka /path/to/myambienttexture.tga
map_bump mybumptexture.jpg

If you strip all the paths then the file import should work.
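With the paths fixed, the import itself can be done through Model I/O, which resolves the `mtllib` reference relative to the `.obj` file's own folder. A sketch, assuming a hypothetical `model.obj` (with its `.mtl` alongside) in the app bundle:

```swift
import SceneKit
import SceneKit.ModelIO
import ModelIO

// Sketch: load an .obj plus its .mtl materials via Model I/O.
guard let url = Bundle.main.url(forResource: "model", withExtension: "obj") else {
    fatalError("model.obj not found in bundle")
}
let asset = MDLAsset(url: url)
asset.loadTextures()                 // resolve the map_* texture references
let scene = SCNScene(mdlAsset: asset)
```

Because Model I/O looks for the `.mtl` and texture files next to the `.obj`, stripping absolute paths inside both files (as described above) is usually all that's needed.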

SceneKit – Stretched texture on a Custom Geometry

Texture stretching happens due to wrong texture mapping in UV space. You have to use the m41 (translate X) and m42 (translate Y) elements of SCNMatrix4, located in its fourth row. Here's how the stretching looks when the m41 element equals zero:

material.diffuse.contentsTransform = .init(
    m11: 0.04, m12: 0,    m13: 0, m14: 0,
    m21: 0,    m22: 0.04, m23: 0, m24: 0,
    m31: 0,    m32: 0,    m33: 1, m34: 0,
    m41: 0,    m42: 0,    m43: 0, m44: 1)

Sample Image


Offsetting a texture

Everything changes when you shift the texture along the X axis:

material.diffuse.contentsTransform = .init(
    m11: 0.04, m12: 0,    m13: 0, m14: 0,
    m21: 0,    m22: 0.04, m23: 0, m24: 0,
    m31: 0,    m32: 0,    m33: 1, m34: 0,
    m41: 0.5,  m42: 0,    m43: 0, m44: 1)

Sample Image
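Instead of typing the full matrix, the same transform can be composed with SceneKit's matrix helpers. A sketch (the `material` here stands in for whichever SCNMaterial you are adjusting):

```swift
import SceneKit

let material = SCNMaterial()

// Scale the UVs by 0.04, then offset them by 0.5 along X —
// this produces the same matrix as above (m11 = m22 = 0.04, m41 = 0.5).
let scale = SCNMatrix4MakeScale(0.04, 0.04, 1)
material.diffuse.contentsTransform = SCNMatrix4Translate(scale, 0.5, 0, 0)

// Without a repeat wrap mode, UVs outside 0...1 clamp to the texture's
// edge pixels, which itself looks like stretching.
material.diffuse.wrapS = .repeat
material.diffuse.wrapT = .repeat
```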

Loading huge animated 3D model in SceneKit causes memory issues

High-poly models with huge texture maps are not suitable for a robust AR experience. Augmented Reality frameworks (such as ARKit or ARCore) are very processor-intensive, so there's no need to put an additional burden on the CPU, GPU and memory.

Why are ARKit apps so CPU-intensive?

Your ARKit app uses four sensors to track the surrounding environment at 60 fps, simultaneously renders (with the help of SceneKit or RealityKit) your animated 3D model with all its textures, lights and shadows, and then composites, in real time, a 2D render of your model (in an RGBAZ pattern) over the high-res RGB video from the rear camera. That's a lot for your device, isn't it?

Hence, high-poly models with huge textures not only cause memory and CPU/GPU issues, but also drain your battery very quickly. And please take into consideration that the iPhone X has only 3 GB of RAM, of which iOS uses more than 1 GB, so memory issues are quite possible in your particular case.

So, my recommendations for creating a 3D model for robust AR experience are the following:

  • Low-poly geometry (usually 10,000 polygons per model are fine)
  • UV-mapped Texture resolution – not more than 1024 x 1024 pix
  • Preferably, pre-baked UV-mapped shadows for static elements
  • Use of JPEG format with 0% compression for texture (PNG is larger)
  • Don't use too many PBR shaders (with metalness property) in your scene
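The polygon budget in the first recommendation is easy to check in code. A sketch of a helper (the function name is illustrative) that sums the primitive counts over all geometries in a loaded scene:

```swift
import SceneKit

// Sketch: count the polygons (primitives) in a scene to check it
// against the low-poly budget recommended above.
func polygonCount(of scene: SCNScene) -> Int {
    var total = 0
    scene.rootNode.enumerateHierarchy { node, _ in
        if let geometry = node.geometry {
            total += geometry.elements.reduce(0) { $0 + $1.primitiveCount }
        }
    }
    return total
}

// Usage: warn if the model exceeds the suggested ~10,000-polygon budget.
// if polygonCount(of: scene) > 10_000 { print("Consider decimating this model") }
```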

