Add UIImage as texture to a Plane in RealityKit

Currently, you cannot use a UIImage or CIImage directly as a shader's texture in RealityKit 2.0. In both versions of RealityKit, a texture must be loaded via the String parameter of the load(named:) method.

RealityKit 2.0

To assign a texture to a shader in RealityKit 2.0 use the following approach:

let mesh: MeshResource = .generatePlane(width: 0.45, depth: 0.45)

var material = SimpleMaterial()
material.color = .init(tint: .white.withAlphaComponent(0.999),
                       texture: .init(try! .load(named: "texture.png")))
material.metallic = .float(1.0)
material.roughness = .float(0.0)

let model = ModelEntity(mesh: mesh, materials: [material])

RealityKit 1.0

To assign a texture to a shader in RealityKit 1.0 use this approach:

let scene = try! Experience.loadMyScene()

var material = SimpleMaterial()
material.baseColor = try! .texture(.load(named: "texture.png"))
material.metallic = MaterialScalarParameter(floatLiteral: 1.0)
material.roughness = MaterialScalarParameter(floatLiteral: 0.0)
material.tintColor = UIColor.white

let mesh: MeshResource = .generatePlane(width: 0.45, depth: 0.45)
let component = ModelComponent(mesh: mesh, materials: [material])

scene.myFavoriteScene?.children[0].components.set(component)
arView.scene.anchors.append(scene)

CGImage

Nonetheless, you can create a texture resource from an in-memory Core Graphics image:

static func generate(from image: CGImage,
                     withName resourceName: String? = nil,
                     options: TextureResource.CreateOptions) throws -> TextureResource
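Since a UIImage is usually backed by a CGImage, this method gives you an indirect route from UIImage to a texture. Here is a minimal sketch; the helper name `textureResource(from:)` is hypothetical, and it assumes the UIImage is CGImage-backed:

```swift
import UIKit
import RealityKit

// Hypothetical helper: builds a TextureResource from a UIImage
// by extracting its underlying CGImage.
func textureResource(from image: UIImage) throws -> TextureResource {
    guard let cgImage = image.cgImage else {
        throw NSError(domain: "TextureError", code: -1)  // not CGImage-backed
    }
    // `.color` tells RealityKit to treat the data as color (sRGB) content.
    return try TextureResource.generate(from: cgImage,
                                        options: .init(semantic: .color))
}
```

The resulting resource can then be assigned the usual way, e.g. `material.color.texture = .init(resource)`.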

Also, you can use a URL parameter:

material.color.texture = .init(try! .load(contentsOf: url))  // RealityKit 2.0

Apply a custom texture to Plane entity using RealityKit 2.0

Use the brand-new parameters instead of the deprecated arguments:

func createBoard() {

    let planeMesh = MeshResource.generatePlane(width: 1,
                                               height: 1,
                                               cornerRadius: 0.05)

    var material = SimpleMaterial()

    material.color = try! .init(tint: .white,
                                texture: .init(.load(named: "img", in: nil)))
    material.metallic = .init(floatLiteral: 1.0)
    material.roughness = .init(floatLiteral: 0.5)

    let modelEntity = ModelEntity(mesh: planeMesh,
                                  materials: [material])

    let anchor = AnchorEntity()
    anchor.addChild(modelEntity)
    arView.scene.anchors.append(anchor)
}

Also, you can use the following syntax:

var material = SimpleMaterial()
material.color.texture = .init(try! .load(named: "img", in: nil))

If you need more details, read this post.

How to show image from gallery in RealityKit?

Try this. Take into consideration that the tint color is multiplied by the image – so if the tint's RGBA = [1, 1, 1, 1], the result of the multiplication is the image itself, without tinting...

import ARKit
import RealityKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!
    var anchor: AnchorEntity!

    override func viewDidLoad() {
        super.viewDidLoad()

        self.anchor = AnchorEntity(world: [0, 0, -1])

        let ball: MeshResource = .generateSphere(radius: 0.25)

        var material = UnlitMaterial()

        if #available(iOS 15.0, *) {
            material.color = try! .init(tint: .white,
                                        texture: .init(.load(named: "img",
                                                             in: nil)))
        }

        let ballEntity = ModelEntity(mesh: ball, materials: [material])

        self.anchor.addChild(ballEntity)

        self.arView.scene.anchors.append(self.anchor)
    }
}

How to Add Material to ModelEntity programmatically in RealityKit?

Updated: June 14, 2022

RealityKit materials

There are 6 types of materials in RealityKit 2.0 and RealityFoundation at the moment:

  • SimpleMaterial
  • UnlitMaterial
  • OcclusionMaterial (read this post to find out how to setup SceneKit occlusion shader)
  • VideoMaterial (look at this post to find out how to setup it)
  • PhysicallyBasedMaterial
  • CustomMaterial
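Of the six, VideoMaterial is the only one initialized from an AVPlayer rather than from colors and textures. A minimal sketch, assuming a video file named "video.mp4" is bundled with the app (the resource name is hypothetical):

```swift
import RealityKit
import AVFoundation

// Build a VideoMaterial from an AVPlayer and apply it to a plane.
let url = Bundle.main.url(forResource: "video", withExtension: "mp4")!
let player = AVPlayer(url: url)
let videoMaterial = VideoMaterial(avPlayer: player)

let plane = ModelEntity(mesh: .generatePlane(width: 0.6, height: 0.4),
                        materials: [videoMaterial])

// Playback is controlled through the player, not the material.
player.play()
```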

To apply these materials use the following logic:

import Cocoa
import RealityKit

class ViewController: NSViewController {

    @IBOutlet var arView: ARView!

    override func awakeFromNib() {
        let box = try! Experience.loadBox()

        var simpleMat = SimpleMaterial()
        simpleMat.color = .init(tint: .blue, texture: nil)
        simpleMat.metallic = .init(floatLiteral: 0.7)
        simpleMat.roughness = .init(floatLiteral: 0.2)

        var pbr = PhysicallyBasedMaterial()
        pbr.baseColor = .init(tint: .green, texture: nil)

        let mesh: MeshResource = .generateBox(width: 0.5,
                                              height: 0.5,
                                              depth: 0.5,
                                              cornerRadius: 0.02,
                                              splitFaces: true)

        let boxComponent = ModelComponent(mesh: mesh,
                                          materials: [simpleMat, pbr])

        box.steelBox?.children[0].components.set(boxComponent)
        box.steelBox?.orientation = Transform(pitch: .pi/4,
                                              yaw: .pi/4,
                                              roll: 0).rotation
        arView.scene.anchors.append(box)
    }
}

Sample Image

Read this post to find out how to load a texture for RealityKit's shaders.


How to create RealityKit's shaders similar to SceneKit's shaders

We know that SceneKit has 5 different shading models, so we can use RealityKit's SimpleMaterial, PhysicallyBasedMaterial and UnlitMaterial to reproduce all five shaders we've been accustomed to.

Let's see how it looks:

SCNMaterial.LightingModel.blinn           – SimpleMaterial(color: .gray,
                                                           roughness: .float(0.5),
                                                           isMetallic: false)

SCNMaterial.LightingModel.lambert         – SimpleMaterial(color: .gray,
                                                           roughness: .float(1.0),
                                                           isMetallic: false)

SCNMaterial.LightingModel.phong           – SimpleMaterial(color: .gray,
                                                           roughness: .float(0.0),
                                                           isMetallic: false)

SCNMaterial.LightingModel.physicallyBased – PhysicallyBasedMaterial()

// all three shaders (`.constant`, `UnlitMaterial` and `VideoMaterial`)
// don't depend on lighting
SCNMaterial.LightingModel.constant        – UnlitMaterial(color: .gray)
                                          – VideoMaterial(avPlayer: avPlayer)
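For the `.physicallyBased` row, the RealityKit counterpart exposes the same metallic/roughness controls SceneKit users expect. A minimal sketch (the scalar values are arbitrary illustration values):

```swift
import RealityKit

// A PhysicallyBasedMaterial roughly matching a polished-metal
// SceneKit .physicallyBased setup.
var pbr = PhysicallyBasedMaterial()
pbr.baseColor = .init(tint: .gray, texture: nil)
pbr.metallic = .init(floatLiteral: 1.0)    // fully metallic
pbr.roughness = .init(floatLiteral: 0.25)  // fairly glossy

let sphere = ModelEntity(mesh: .generateSphere(radius: 0.1),
                         materials: [pbr])
```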

Adding CustomEntity to the Plane Anchor in RealityKit

To prevent your custom entity from being anchored at the world origin [0, 0, 0], don't conform it to the HasAnchoring protocol:

class Box: Entity, HasModel {
// content...
}

So, your AnchorEntity(plane: .horizontal) is now active.
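A fuller sketch of this pattern; the mesh size and color are arbitrary:

```swift
import RealityKit
import UIKit

// A custom entity that supplies its own model in init.
// Note: it conforms to HasModel only, NOT HasAnchoring.
class Box: Entity, HasModel {
    required init() {
        super.init()
        self.model = ModelComponent(mesh: .generateBox(size: 0.2),
                                    materials: [SimpleMaterial(color: .red,
                                                               isMetallic: false)])
    }
}

// Because Box has no anchoring of its own, parenting it to a plane
// anchor works as expected instead of pinning it at the world origin.
let anchor = AnchorEntity(plane: .horizontal)
anchor.addChild(Box())
```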

2D Drawing On a Virtual 3D Plane in RealityKit

I figured it out using the pinhole camera model.

if sender.state == .began || sender.state == .changed {
    let result = self.arView.hitTest(touchInView)
    guard let collision = result.first else { return }

    // Convert the point from world space to the plane's local space.
    let position = planeEntity.convert(position: collision.position,
                                       from: nil)

    // Get the focal lengths.
    let intrinsics = (self.arView.session.currentFrame?.camera.intrinsics)!
    let f_x = intrinsics.columns.0.x
    let f_y = intrinsics.columns.1.y

    // Coordinates of the principal point.
    let c_x = intrinsics.columns.2.x
    let c_y = intrinsics.columns.2.y

    // (x, y, z) of the 3D point on the plane.
    let x = position.x
    let y = position.y
    let z = collision.distance

    // Map to 2D (u, v) coordinates.
    let u = (x / z) * f_x + c_x
    let v = (y / z) * f_y + c_y

    // Shift the origin to the top-left corner for Core Graphics.
    let u_max = (planeEntity.model!.mesh.bounds.max.x / z) * f_x + c_x
    let v_max = (planeEntity.model!.mesh.bounds.max.y / z) * f_y + c_y
    let u_mapped = max(u, 0)          // Clamp to avoid negative values.
    let v_mapped = max(v_max - v, 0)

    // Add this point to the UIBezierPath.
    let mappedPoint = CGPoint(x: CGFloat(u_mapped),
                              y: CGFloat(v_mapped))

    // Use this as the size of the image renderer.
    let canvasSize = CGSize(width: CGFloat(u_max),
                            height: CGFloat(v_max))
}
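To turn the accumulated points into an image you can draw onto the plane, one option is UIGraphicsImageRenderer. A sketch, assuming `path` is the UIBezierPath built from the mapped points and `canvasSize` comes from the code above:

```swift
import UIKit

// Rasterize the drawing path into a UIImage sized to the canvas.
func renderDrawing(path: UIBezierPath, canvasSize: CGSize) -> UIImage {
    let renderer = UIGraphicsImageRenderer(size: canvasSize)
    return renderer.image { ctx in
        // White background so the strokes are visible on the plane.
        UIColor.white.setFill()
        ctx.fill(CGRect(origin: .zero, size: canvasSize))

        UIColor.black.setStroke()
        path.lineWidth = 4
        path.stroke()
    }
}
```

The resulting UIImage's CGImage can then be converted to a TextureResource and reapplied to the plane's material on each stroke update.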

