LiDAR and RealityKit – Capture a Real World Texture for a Scanned Model

Scene Reconstruction

Unfortunately, I am still unable to capture a model's texture in real time using the LiDAR scanning process. Neither at WWDC20 nor at WWDC22 did Apple announce a native API for that (so texture capturing is currently possible only with third-party APIs – don't ask me which ones :-) ).

However, there's good news – a new methodology has emerged at last. It allows developers to create textured models from a series of shots.

Photogrammetry

Object Capture API, announced at WWDC 2021, provides developers with the long-awaited photogrammetry tool. At the output we get a USDZ model with a UV-mapped hi-res texture. To implement the Object Capture API you need macOS 12 and Xcode 13.

Sample Image

To create a USDZ model from a series of shots, submit all taken images to RealityKit's PhotogrammetrySession.

Here's a code snippet that sheds some light on this process:

import RealityKit
import Combine

// Folder containing the captured images
let pathToImages = URL(fileURLWithPath: "/path/to/my/images/")

// Destination for the reconstructed model
let url = URL(fileURLWithPath: "model.usdz")

let request = PhotogrammetrySession.Request.modelFile(url: url,
                                                   detail: .medium)

var configuration = PhotogrammetrySession.Configuration()
configuration.sampleOverlap = .normal
configuration.sampleOrdering = .unordered
configuration.featureSensitivity = .normal
configuration.isObjectMaskingEnabled = false

// The initializer throws, so use try? inside the guard
guard let session = try? PhotogrammetrySession(input: pathToImages,
                                       configuration: configuration)
else { return }

var subscriptions = Set<AnyCancellable>()

session.output.receive(on: DispatchQueue.global())
    .sink(receiveCompletion: { completion in
        // handle errors here
    }, receiveValue: { output in
        // handle session outputs here
    })
    .store(in: &subscriptions)

// process(requests:) throws as well
try? session.process(requests: [request])

You can reconstruct USD and OBJ models with their corresponding UV-mapped textures.
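Inside the receiveValue closure you can switch over the session's outputs to track progress and grab the finished model's URL. A minimal sketch of that closure, assuming the same session and subscriptions from the snippet above:

```swift
session.output
    .receive(on: DispatchQueue.main)
    .sink(receiveCompletion: { completion in
        if case .failure(let error) = completion {
            print("Session failed: \(error)")
        }
    }, receiveValue: { output in
        switch output {
        case .requestProgress(_, let fraction):
            // fraction runs from 0.0 to 1.0
            print("Progress: \(Int(fraction * 100))%")
        case .requestComplete(_, let result):
            // the result carries the URL of the written model file
            if case .modelFile(let url) = result {
                print("USDZ saved to \(url)")
            }
        case .requestError(_, let error):
            print("Request failed: \(error)")
        case .processingComplete:
            print("All requests have been processed")
        default:
            break   // other intermediate outputs can be ignored here
        }
    })
    .store(in: &subscriptions)
```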

Using iPad LiDAR and importing mesh into RealityKit

Using the free 3D Scanner App you can definitely export a scanned 3D model with textures saved in a number of popular formats, including USDZ and OBJ, transfer it via AirDrop, and then open it on macOS. And, of course, that model can be loaded in RealityKit.
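Once the exported .usdz file is in your app bundle, loading it into a RealityKit scene takes only a few lines. A minimal sketch – the file name scannedModel.usdz and the arView instance are assumptions:

```swift
import RealityKit

// Load a scanned model from the app bundle (file name is hypothetical)
if let model = try? ModelEntity.loadModel(named: "scannedModel.usdz") {
    // Place the model one meter in front of the world origin
    let anchor = AnchorEntity(world: [0, 0, -1])
    anchor.addChild(model)
    arView.scene.anchors.append(anchor)   // arView is your RealityKit ARView
}
```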

Scanning Real-World Object and generating 3D Mesh from it

RealityKit 2.0 | Object Capture API

Object Capture API, announced at WWDC 2021, provides you with the long-awaited tools for photogrammetry. At the output we get a USDZ model with a hi-res texture.

Read about photogrammetry HERE.

ARKit | Mesh Reconstruction

Using an iOS device with LiDAR and ARKit 3.5/4.0/5.0, you can easily reconstruct a topological map of the surrounding environment. The Scene Reconstruction feature starts working immediately after launching the current ARSession.
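Turning on Scene Reconstruction is a matter of configuring the session. A minimal sketch, assuming a view controller with an ARView outlet:

```swift
import ARKit
import RealityKit
import UIKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!   // assumed to be wired up in the storyboard

    override func viewDidLoad() {
        super.viewDidLoad()

        let config = ARWorldTrackingConfiguration()

        // Mesh reconstruction is available only on LiDAR-equipped devices
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
            config.sceneReconstruction = .mesh
        }

        arView.session.run(config)

        // Visualize the reconstructed wireframe mesh
        arView.debugOptions.insert(.showSceneUnderstanding)
    }
}
```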

Apple's LiDAR works within a 5-meter range. The scanner helps improve the quality of the ZDepth channel, and powers such features as People/Real World Object Occlusion, Motion Tracking, Immediate Physics Contact Bodies and Raycasting.

Other awesome peculiarities of the LiDAR scanner are:

  • you can use your device in a poorly lit room
  • you can track pure white walls with no features at all
  • you can detect planes almost instantaneously

Keep in mind that the quality of an object scanned with LiDAR isn't as good as you might expect: small details are lost, because the resolution of Apple's LiDAR isn't high enough.
