iOS - ARKit Node Disappears After 100 m

ARKit removes a node when the reference image disappears

You have to use ARWorldTrackingConfiguration instead of ARImageTrackingConfiguration. It's also a bad idea to use both configurations in one app, because each time you switch between them the tracking state is reset and tracking has to start from scratch.

Let's see what Apple's documentation says about ARImageTrackingConfiguration:

With ARImageTrackingConfiguration, ARKit establishes a 3D space not by tracking the motion of the device relative to the world, but solely by detecting and tracking the motion of known 2D images in view of the camera.

The basic difference between these two configurations is how ARAnchors behave:

  • ARImageTrackingConfiguration gives you ARImageAnchors only while a reference image is in the camera view. If you can't see a reference image, there's no ARImageAnchor and thus no 3D model (everything resets each time the image leaves the view and then reappears). You can simultaneously detect up to 100 images.

  • ARWorldTrackingConfiguration lets you track the surrounding environment in 6DoF and get ARImageAnchor, ARObjectAnchor, or AREnvironmentProbeAnchor objects. If you can't see a reference image, its ARImageAnchor stays in the scene, and when you see the image again the anchor is still there. So there's no reset.

Conclusion:

ARWorldTrackingConfiguration's computational cost is much higher. However, this configuration lets you perform not only image tracking but also hit-testing and ray-casting against detected planes, object detection, and restoration of world maps. A minimal setup is sketched below.
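Here's a minimal sketch of image detection with persistent anchors via ARWorldTrackingConfiguration. The asset catalog group name "AR Resources" is an assumption – substitute your own group's name:

import ARKit

let config = ARWorldTrackingConfiguration()

// "AR Resources" is a hypothetical asset catalog group name.
if let referenceImages = ARReferenceImage.referenceImages(inGroupNamed: "AR Resources",
                                                                  bundle: nil) {
    config.detectionImages = referenceImages
}

// Anchors created for detected images persist in world space,
// even while the images are out of the camera view.
sceneView.session.run(config)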

Adding a physicsBody makes the AR object disappear

Use the isAffectedByGravity instance property to stop gravity from pulling the object out of view:

var isAffectedByGravity: Bool { get set }

You need a false value (by default it's true):

let sphere = SCNSphere(radius: 1)
let node = SCNNode(geometry: sphere)

// A dynamic body falls under gravity by default, which is why
// the object drops out of view and seems to disappear.
let physicsBody = SCNPhysicsBody(type: .dynamic, shape: nil)
physicsBody.isAffectedByGravity = false

node.physicsBody = physicsBody
sceneView.scene.rootNode.addChildNode(node)
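If the object should never be moved by the simulation at all, a kinematic body type is an alternative worth considering – a sketch:

// Alternative: a kinematic body still takes part in collisions,
// but the simulation never moves it, so gravity doesn't apply.
node.physicsBody = SCNPhysicsBody(type: .kinematic, shape: nil)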

The projectPoint method: converting rawFeaturePoints to screen space

Theory

func projectPoint(_ point: simd_float3,
                  orientation: UIInterfaceOrientation,
                  viewportSize: CGSize) -> CGPoint

The Xcode quick help says that the instance method projectPoint(...) returns the projection of the specified point into a 2D pixel coordinate space whose origin is in the upper-left corner and whose size matches that of the viewportSize parameter.

The difference between screen size and viewport size is described here and here (I see you said you already know about that).
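Rather than hardcoding device dimensions, the viewport size can also be taken from the view itself – a sketch, assuming an ARSCNView outlet named sceneView:

// The view's bounds give the viewport size in points, so the
// projection adapts to any device.
let worldPoint = simd_float3(0, 0, -1)    // 1 m in front of the world origin

if let frame = sceneView.session.currentFrame {
    let screenPoint = frame.camera.projectPoint(worldPoint,
                                   orientation: .portrait,
                                  viewportSize: sceneView.bounds.size)
    print(screenPoint)    // origin in the upper-left corner
}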

Solution

The trick is that the 2D point is projected correctly only when the 3D point is inside the camera frustum's coverage area – outside of it (behind the camera, for example) the projected coordinates are meaningless. That's why the code below checks the frustum before projecting.

import ARKit

extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didUpdate frame: ARFrame) {

        // The same point where the sphere is placed (see below).
        let point = simd_float3(0.3, 0.5, -2.0)

        // Project only while the node is inside the camera's frustum;
        // otherwise the 2D result is meaningless.
        if self.sceneView.isNode(self.sphere,
                  insideFrustumOf: self.sceneView.pointOfView!) {

            let pp = frame.camera.projectPoint(point,
                                  orientation: .portrait,
                                 viewportSize: CGSize(width: 375, height: 812))

            self.label_A.text = String(format: "%.2f", pp.x / 375)
            self.label_B.text = String(format: "%.2f", pp.y / 812)
        }
    }
}

As you can see, outputting the values in normalized coordinates (0.00 ... 1.00) is very simple:

class ViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!
    @IBOutlet var label_A: UILabel!
    @IBOutlet var label_B: UILabel!

    let sphere = SCNNode(geometry: SCNSphere(radius: 0.1))

    override func viewDidLoad() {
        super.viewDidLoad()

        sceneView.session.delegate = self
        sceneView.scene = SCNScene()

        // Place a green sphere at the point we project each frame.
        sphere.geometry?.firstMaterial?.diffuse.contents = UIColor.green
        sphere.position = SCNVector3(0.3, 0.5, -2.0)
        sceneView.scene.rootNode.addChildNode(sphere)

        let config = ARWorldTrackingConfiguration()
        sceneView.session.run(config)
    }
}

I used iPhone X screen parameters here – the portrait viewport size is 375 x 812 points.
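The same call applies to the frame's actual raw feature points – a sketch of an alternative session(_:didUpdate:) implementation, assuming the setup above:

// A sketch: project every raw feature point of the current frame
// into screen space. rawFeaturePoints is nil when there are none.
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    guard let pointCloud = frame.rawFeaturePoints else { return }

    let screenPoints = pointCloud.points.map {
        frame.camera.projectPoint($0,
                     orientation: .portrait,
                    viewportSize: sceneView.bounds.size)
    }
    print(screenPoints.count)
}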


