RealityKit – How to Set a ModelEntity's Transparency

RealityKit – Material's Alpha transparency


RealityKit 1.0

.tintColor is a multiplier for .baseColor

If you have a .png file with premultiplied alpha (RGB * A), all you need to do is additionally set the tintColor instance property with an alpha value of 0.9999:

material.tintColor = UIColor(white: 1.0, alpha: 0.9999)


Here's how it looks in real code:

fileprivate func material() -> UnlitMaterial {

    var material = UnlitMaterial()
    material.baseColor = try! .texture(.load(named: "transparent.png"))
    material.tintColor = UIColor(white: 1.0, alpha: 0.9999)
    return material
}

override func viewDidLoad() {
    super.viewDidLoad()

    let sphere: MeshResource = .generateSphere(radius: 0.5)

    let entity = ModelEntity(mesh: sphere,
                             materials: [material()])

    let anchor = AnchorEntity()
    anchor.orientation = simd_quatf(angle: .pi, axis: [0, 1, 0])

    anchor.addChild(entity)
    arView.scene.anchors.append(anchor)
}

P.S.

To me, this looks like a bug in RealityKit 1.0. I have no clue why the .load(named: "file.png") method doesn't work as expected.


RealityKit 2.0

The same story applies to partially transparent textures in RealityKit 2.0:

var material = SimpleMaterial()

material.color = try! .init(tint: .white.withAlphaComponent(0.9999),
                            texture: .init(.load(named: "semi.png", in: nil)))

The tint parameter is a multiplier for the texture as well.
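If you need explicit control over opacity in RealityKit 2.0 (iOS 15+), PhysicallyBasedMaterial also exposes a blending property. A minimal sketch, not taken from the answer above:

```swift
import RealityKit

// A sketch (iOS 15+): PhysicallyBasedMaterial lets you set opacity
// explicitly through its .blending property instead of the alpha trick.
func translucentMaterial() -> PhysicallyBasedMaterial {
    var material = PhysicallyBasedMaterial()
    material.baseColor = .init(tint: .white)
    // .transparent blending with a 50% opacity scale
    material.blending = .transparent(opacity: 0.5)
    return material
}
```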

RealityKit – How to access the property in a Scene programmatically?

Of course, you first need to find the required ModelEntity deep in the model's hierarchy.

Use this SwiftUI solution:


struct ARViewContainer: UIViewRepresentable {

    func makeUIView(context: Context) -> ARView {

        let arView = ARView(frame: .zero)
        let pictureScene = try! Experience.loadPicture()
        pictureScene.children[0].scale *= 4

        print(pictureScene)

        let edgingModel = pictureScene.masterpiece?.children[0] as! ModelEntity

        edgingModel.model?.materials = [SimpleMaterial(color: .brown,
                                                       isMetallic: true)]

        var mat = SimpleMaterial()

        // Here's the old approach for assigning a texture in iOS 14.5
        mat.baseColor = try! .texture(.load(named: "MonaLisa", in: nil))

        let imageModel = pictureScene.masterpiece?.children[0]
                                                  .children[0] as! ModelEntity
        imageModel.model?.materials = [mat]

        arView.scene.anchors.append(pictureScene)
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) { }
}

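Index-based access like children[0].children[0] breaks as soon as the hierarchy changes. If the entity was given a name in Reality Composer, a more robust alternative is Entity's findEntity(named:) method, which searches the hierarchy recursively. A sketch, assuming the entity's name in the scene is "masterpiece":

```swift
// A sketch: findEntity(named:) performs a recursive search of the
// entity tree. The name "masterpiece" is assumed to match the name
// assigned to the entity in Reality Composer.
if let model = pictureScene.findEntity(named: "masterpiece") as? ModelEntity {
    model.model?.materials = [SimpleMaterial(color: .brown,
                                             isMetallic: true)]
}
```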

How to make RealityKit show only CollisionComponents?

You can extend the standard functionality of RealityKit's ARView with a simple Swift extension:

import RealityKit
import ARKit

fileprivate extension ARView.DebugOptions {

    func showCollisions() -> ModelEntity {

        print("Code for visualizing collision objects goes here...")

        let vc = ViewController()

        let box = MeshResource.generateBox(size: 0.04)
        let color = UIColor(white: 1.0, alpha: 0.15)
        let colliderMaterial = UnlitMaterial(color: color)

        vc.visualCollider = ModelEntity(mesh: box,
                                        materials: [colliderMaterial])
        return vc.visualCollider
    }
}

...and then call this method in ViewController when the user taps the screen:

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    let anchor = AnchorEntity()
    var ballEntity = ModelEntity()
    var visualCollider = ModelEntity()
    var sphere: MeshResource?

    @IBAction func onTap(_ sender: UITapGestureRecognizer) {

        sphere = MeshResource.generateSphere(radius: 0.02)

        let material = SimpleMaterial(color: .systemPink,
                                      isMetallic: false)

        ballEntity = ModelEntity(mesh: sphere!,
                                 materials: [material])

        let point: CGPoint = sender.location(in: arView)

        guard let query = arView.makeRaycastQuery(from: point,
                                                  allowing: .estimatedPlane,
                                                  alignment: .any)
        else { return }

        let result = arView.session.raycast(query)

        guard let raycastResult = result.first
        else { return }

        let anchor = AnchorEntity(raycastResult: raycastResult)
        anchor.addChild(ballEntity)
        arView.scene.anchors.append(anchor)

        let showCollisions = arView.debugOptions.showCollisions()  // here it is
        ballEntity.addChild(showCollisions)

        ballEntity.generateCollisionShapes(recursive: true)
    }
}

Please note that this is only an approximate visualization; the code just shows you one possible approach.
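For a built-in alternative in recent RealityKit versions, ARView's debug options include .showPhysics, which draws the actual collision shapes of entities that have a CollisionComponent. A minimal sketch:

```swift
// A minimal sketch: RealityKit's built-in physics visualization.
// .showPhysics outlines the collision shapes of all entities
// that carry a CollisionComponent.
arView.debugOptions.insert(.showPhysics)
```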


Apply a custom texture to Plane entity using RealityKit 2.0

You have to use the brand-new parameters instead of the deprecated arguments:

func createBoard() {

    let planeMesh = MeshResource.generatePlane(width: 1,
                                               height: 1,
                                               cornerRadius: 0.05)

    var material = SimpleMaterial()

    material.color = try! .init(tint: .white,
                                texture: .init(.load(named: "img", in: nil)))
    material.metallic = .init(floatLiteral: 1.0)
    material.roughness = .init(floatLiteral: 0.5)

    let modelEntity = ModelEntity(mesh: planeMesh,
                                  materials: [material])

    let anchor = AnchorEntity()
    anchor.addChild(modelEntity)
    arView.scene.anchors.append(anchor)
}

Also, you can use the following syntax:

var material = SimpleMaterial()
material.color.texture = .init(try! .load(named: "img", in: nil))

If you need more details, read this post.

How to show image from gallery in RealityKit?

Try this. Take into consideration that a tint color is multiplied by the image; so if the tint's RGBA = [1, 1, 1, 1], the result of the multiplication will be the image itself (without tinting)...

import ARKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var anchor: AnchorEntity!

    override func viewDidLoad() {
        super.viewDidLoad()

        self.anchor = AnchorEntity(world: [0, 0, -1])

        let ball: MeshResource = .generateSphere(radius: 0.25)

        var material = UnlitMaterial()

        if #available(iOS 15.0, *) {
            material.color = try! .init(tint: .white,
                                        texture: .init(.load(named: "img",
                                                             in: nil)))
        }

        let ballEntity = ModelEntity(mesh: ball, materials: [material])

        self.anchor.addChild(ballEntity)

        self.arView.scene.anchors.append(self.anchor)
    }
}
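The snippet above loads a texture from the app bundle; for an image actually picked from the photo library (a UIImage from an image picker), you can create the texture from its CGImage instead. A sketch, assuming pickedImage is the UIImage you received:

```swift
// A sketch: building a TextureResource from a UIImage picked from the
// gallery. `pickedImage` is assumed to come from an image picker.
func material(from pickedImage: UIImage) throws -> UnlitMaterial {
    guard let cgImage = pickedImage.cgImage else {
        throw NSError(domain: "NoCGImage", code: -1)
    }
    let texture = try TextureResource.generate(from: cgImage,
                                               options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(tint: .white, texture: .init(texture))
    return material
}
```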

Transparency artifacts appearing on iOS

So I figured it out. The solution was to disable "Use Depth Test" and "Write to Depth.." for all overlapping materials. After changing this, everything worked as expected.
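If those checkboxes come from Xcode's SceneKit material inspector, the equivalent SCNMaterial properties can also be set in code. A sketch, assuming the overlapping materials are SCNMaterials:

```swift
import SceneKit

// A sketch: disabling depth-buffer reads/writes for overlapping
// transparent surfaces on a SceneKit material.
func disableDepth(for material: SCNMaterial) {
    material.readsFromDepthBuffer = false   // "Use Depth Test" off
    material.writesToDepthBuffer = false    // "Write to Depth Buffer" off
}
```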



