How to Deallocate a RealityKit ARView()

Why doesn't RealityKit memory clear after deinit is called?

Tested in Xcode 14.0...

Alas, RealityKit 2.0 / 1.0 developers can't deallocate ARView from heap memory, because of a poor-quality implementation on Apple's part. Even if you declare the arView property as weak in SwiftUI, the instance is deallocated immediately, so you can't use it. In UIKit, on the other hand, declaring arView as weak has no effect.

As you may remember, ARC was first introduced in Objective-C and later implemented in Swift. It's quite possible that weak and unowned functionality isn't implemented for ARView for some reason, perhaps because RealityKit is tailored primarily for Swift, even though the @objc attribute exposes ARView to Objective-C. It looks like an ARView reference isn't tracked by ARC.

@available(macOS 10.15, iOS 13.0, *)
@objc open class ARView : ARViewBase { ... }
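
As a quick way to observe this yourself, here's a minimal UIKit sketch (the class name and log message are illustrative) that logs deinit so you can check whether the controller holding the ARView is released, even while memory instruments may still show RealityKit allocations alive:

import RealityKit
import UIKit

class ARViewController: UIViewController {

    var arView: ARView?

    override func viewDidLoad() {
        super.viewDidLoad()
        let arView = ARView(frame: view.bounds)
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)
        self.arView = arView
    }

    deinit {
        // If this prints after dismissal, the controller itself was deallocated,
        // but that does not guarantee ARView's internal resources were freed.
        print("ARViewController deinit")
    }
}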

If you need more info, please read this post.

Swift ARKit – How do you fully kill an ARSession?

Officially, right now, you can't.

However, there is a workaround: make the ARSCNView disposable.

On leaving AR, first pause the ARSession. Then deallocate the entire ARSCNView hierarchy and set the ARSCNView to nil for good measure. Rebuild the entire ARSCNView hierarchy and start a new session whenever you need to go back to AR (a re-entry sketch follows the snippet below).

var arView: ARSCNView?

func punchTheClown() {
    arView?.session.pause()
    arView?.removeFromSuperview()
    arView = nil
}

Other non-AR areas of your app would typically be in a separate view hierarchy at the same sibling level as your ARSCNView. I look forward to Apple providing an actual stopSession() function, so we can all stop having to punchTheClown in the meantime.
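
The rebuild half of the workaround isn't shown above; a minimal sketch might look like this (the class name and reenterAR() are illustrative, and ARWorldTrackingConfiguration is just one possible configuration):

import ARKit
import UIKit

class ARHostViewController: UIViewController {

    var arView: ARSCNView?

    // Counterpart to punchTheClown(): rebuild the view hierarchy
    // and start a fresh session when returning to AR.
    func reenterAR() {
        let newView = ARSCNView(frame: view.bounds)
        newView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(newView)

        newView.session.run(ARWorldTrackingConfiguration(),
                            options: [.resetTracking, .removeExistingAnchors])
        arView = newView
    }
}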

RealityKit – Cannot load ARView (found nil)

This test code does the trick. If you're building an AR project from scratch (by choosing the plain iOS App template rather than the ARKit template), make sure you've added Privacy - Camera Usage Description and Required device capabilities (Item 0 = ARKit) to the Info.plist file.

import RealityKit
import UIKit

class ViewController: UIViewController {

    var arView: ARView = {
        let arView = ARView(frame: .zero)
        let boxAnchor = try! Experience.loadBox()
        arView.scene.anchors.append(boxAnchor)
        return arView
    }()

    override func viewDidLoad() {
        super.viewDidLoad()
        self.view.backgroundColor = .red

        self.view.addSubview(self.arView)

        NSLayoutConstraint.activate([
            arView.topAnchor.constraint(equalTo: self.view.topAnchor),
            arView.leadingAnchor.constraint(equalTo: self.view.leadingAnchor),
            arView.bottomAnchor.constraint(equalTo: self.view.bottomAnchor),
            arView.trailingAnchor.constraint(equalTo: self.view.trailingAnchor)
        ])
        self.view.subviews.forEach {
            $0.translatesAutoresizingMaskIntoConstraints = false
        }
    }
}

Also, I've noticed that you've added sceneView twice.

How do I stop ARView session on present modally segue using storyboard?

You didn't turn off everything that needs to be turned off.

func leaveScene() {
    arView?.session.pause()
    arView?.session.delegate = nil
    arView?.scene.anchors.removeAll()
    arView?.removeFromSuperview()
    arView?.window?.resignKey()
    arView = nil
}

P.S. Note that even after this, arView will not be deallocated from memory.
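
Putting it together, a hedged sketch of a complete view controller for this scenario might look like the following (the class name, outlet, and "ShowMenu" segue identifier are placeholders):

import RealityKit
import UIKit

class ARViewController: UIViewController {

    var arView: ARView?

    // Tear down the AR content, then trigger the modal segue defined in the storyboard.
    @IBAction func presentMenu(_ sender: UIButton) {
        leaveScene()
        performSegue(withIdentifier: "ShowMenu", sender: nil)
    }

    func leaveScene() {
        arView?.session.pause()
        arView?.session.delegate = nil
        arView?.scene.anchors.removeAll()
        arView?.removeFromSuperview()
        arView = nil
    }
}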

How do I use a RealityKit ARView with ARImageTrackingConfiguration?

Here is an example of a RealityKit ARView using ARImageTrackingConfiguration and the ARSessionDelegate delegate methods. I didn't see a complete example of exactly this on Stack Overflow, so I thought I would ask and answer it myself.

import ARKit
import RealityKit
import UIKit

class ViewController: UIViewController, ARSessionDelegate {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // There must be a set of reference images in the project's assets
        guard let referenceImages = ARReferenceImage.referenceImages(
            inGroupNamed: "AR Resources", bundle: nil
        ) else {
            fatalError("Missing expected asset catalog resources.")
        }

        // Set the ARView's session delegate so we can define delegate methods in this controller
        arView.session.delegate = self

        // Forgo automatic configuration to do it manually instead
        arView.automaticallyConfigureSession = false

        // Show statistics if desired
        arView.debugOptions = [.showStatistics]

        // Disable any unneeded rendering options
        arView.renderOptions = [.disableCameraGrain, .disableHDR, .disableMotionBlur,
                                .disableDepthOfField, .disableFaceOcclusions, .disablePersonOcclusion,
                                .disableGroundingShadows, .disableAREnvironmentLighting]

        // Instantiate the configuration object
        let configuration = ARImageTrackingConfiguration()

        // Both trackingImages and maximumNumberOfTrackedImages are required
        // This example assumes there is only one reference image named "target"
        configuration.maximumNumberOfTrackedImages = 1
        configuration.trackingImages = referenceImages
        // Note that this config option differs from world tracking, where it is
        // configuration.detectionImages

        // Run the ARView's session with the defined configuration object
        arView.session.run(configuration)
    }

    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {

        // This example assumes only one reference image of interest
        // A for-in loop could work for more targets

        // Ensure the first anchor in the list of added anchors can be downcast to an ARImageAnchor
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        // If the added anchor is named "target", do something with it
        if let imageName = imageAnchor.name, imageName == "target" {

            // An example of something to do: attach a ball marker to the added reference image.
            // Create an AnchorEntity, create a virtual object, add the object to the AnchorEntity
            let refImageAnchor = AnchorEntity(anchor: imageAnchor)
            let refImageMarker = generateBallMarker(radius: 0.02, color: .systemPink)
            refImageMarker.position.y = 0.04
            refImageAnchor.addChild(refImageMarker)

            // Add the new AnchorEntity and its children to the ARView's scene's anchor collection
            arView.scene.addAnchor(refImageAnchor)
            // There is now RealityKit content anchored to the target reference image!
        }
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        // Assuming only one reference image. A for-in loop could work for more targets
        guard let imageAnchor = anchors[0] as? ARImageAnchor else { return }

        if let imageName = imageAnchor.name, imageName == "target" {
            // If anything needs to be done as the reference image anchor is updated frame-to-frame, do it here

            // E.g., to check if the reference image is still being tracked:
            // (https://developer.apple.com/documentation/arkit/artrackable/2928210-istracked)
            if imageAnchor.isTracked {
                print("\(imageName) is tracked and has a valid transform")
            } else {
                print("The anchor for \(imageName) is not guaranteed to match the movement of its corresponding real-world feature, even if it remains in the visible scene.")
            }
        }
    }

    // Convenience method to create colored spheres
    func generateBallMarker(radius: Float, color: UIColor) -> ModelEntity {
        let ball = ModelEntity(mesh: .generateSphere(radius: radius),
                               materials: [SimpleMaterial(color: color, isMetallic: false)])
        return ball
    }
}

How to use setWorldOrigin with ARView?

ARSession is an augmented reality object, so the session's properties and methods only work while a session is running. An ARSession is meaningless in .nonAR (VR) camera mode and in the Xcode Simulator. Use your code in AR mode only.
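
For the AR case, a minimal sketch of calling setWorldOrigin(relativeTransform:) while a world-tracking session is running might look like this (the 1 m shift is just an illustrative transform, and in practice you would usually wait until tracking is established, e.g. in a session delegate callback, rather than calling it straight from viewDidLoad):

import ARKit
import RealityKit
import simd
import UIKit

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        // setWorldOrigin(relativeTransform:) only has an effect while an AR session is running
        arView.session.run(ARWorldTrackingConfiguration())

        // Shift the world origin 1 m along -Z relative to its current position
        var translation = matrix_identity_float4x4
        translation.columns.3.z = -1.0
        arView.session.setWorldOrigin(relativeTransform: translation)
    }
}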


Nonetheless, a solution for a VR scenario might look like this:

import UIKit
import RealityKit

class Camera: Entity, HasPerspectiveCamera, HasAnchoring {
    required init() {
        super.init()

        self.camera = PerspectiveCameraComponent(near: 0.01, far: 200.00,
                                                 fieldOfViewInDegrees: 50.0)
        self.transform.translation.y = 0.5
        self.transform.translation.z = 2.0
    }
}

class ViewController: UIViewController {

    @IBOutlet var arView: ARView!

    override func viewDidLoad() {
        super.viewDidLoad()

        arView.environment.background = .color(.black)
        let camera = Camera()
        arView.scene.addAnchor(camera)

        let entity = ModelEntity(mesh: .generateBox(size: 0.1))
        let anchor = AnchorEntity(world: .zero)
        anchor.addChild(entity)
        arView.scene.addAnchor(anchor)
    }
}

