Resetting ARKit coordinates
If you want to reset your ARSession, you have to pause it, remove all nodes, and rerun the session with options that reset tracking and remove existing anchors.
I made a reset button that does this whenever I want to reset:
@IBAction func reset(_ sender: Any) {
    sceneView.session.pause()
    sceneView.scene.rootNode.enumerateChildNodes { (node, stop) in
        node.removeFromParentNode()
    }
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
Or call it from your sessionWasInterrupted function!
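If you'd rather trigger the reset automatically, a minimal sketch might look like the following (assumptions of mine, not from the original answer: the view controller is named `ViewController`, it owns a `sceneView` and a stored `configuration`, and it is registered as the session delegate):

```swift
import ARKit

extension ViewController: ARSCNViewDelegate {

    // Called by ARKit when tracking is interrupted (e.g. the app goes to background).
    func sessionWasInterrupted(_ session: ARSession) {
        // Nothing to do yet; we clean up once the interruption ends.
    }

    // Called when the interruption is over: clear the old scene graph and restart tracking.
    func sessionInterruptionEnded(_ session: ARSession) {
        sceneView.scene.rootNode.enumerateChildNodes { (node, _) in
            node.removeFromParentNode()
        }
        session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
    }
}
```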
ARKit - Update only the world coordinates origin
According to this post (link), the documentation for rootNode says:
You should not modify the transform property of the root node.
I still tried assigning the camera's position to the rootNode, but it didn't change anything.
So it seems the only way is to create a new node at the current position and use it as a root node from there on.
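As a sketch of that idea (the helper name and structure are my own, not from the original post): grab the camera transform from the current frame, place a plain node there, and parent all future content under it so it acts as a local origin:

```swift
import ARKit

func makeLocalOrigin(in sceneView: ARSCNView) -> SCNNode? {
    // The current camera pose in world coordinates; nil if no frame is available yet.
    guard let cameraTransform = sceneView.session.currentFrame?.camera.transform else {
        return nil
    }
    // A plain node placed at the camera's current position and orientation.
    let originNode = SCNNode()
    originNode.simdTransform = cameraTransform
    sceneView.scene.rootNode.addChildNode(originNode)
    // From now on, add content as children of originNode instead of rootNode.
    return originNode
}
```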
Remove text node when resetting the session in ARKit 1.5
I believe the way you want to handle the removal of your SCNNodes is to call the following function in your reset function:
self.augmentedRealityView.scene.rootNode.enumerateChildNodes { (existingNode, _) in
    existingNode.removeFromParentNode()
}
Here augmentedRealityView refers to an ARSCNView.
You could also store specific nodes in an array of [SCNNode] and then loop through it to remove them, e.g.:
for addedNode in nodesAdded {
    addedNode.removeFromParentNode()
}
Hope it helps...
Why don't the anchors' positions change after ARKit reconciles the recorded world map with the current environment?
I found the cause. An initial session, whose initialWorldMap was nil and whose run options were empty, started after I had restarted the session for reconciling. It conflicted with the reconciling session and cancelled the reconciliation. Thus the positions of the loaded anchors did not change, because they were unknown to the ARSession.
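To avoid that conflict, the only session run after loading should be the one configured with the saved map. A hedged sketch (the function name and the `loadedMap` parameter are assumptions of mine):

```swift
import ARKit

func restartForRelocalization(sceneView: ARSCNView, loadedMap: ARWorldMap) {
    let configuration = ARWorldTrackingConfiguration()
    // Attach the saved map so ARKit relocalizes against it...
    configuration.initialWorldMap = loadedMap
    // ...and reset tracking so no state from a previous run interferes.
    sceneView.session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```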
The coordinate system of ARKit unstable
The ARKit/RealityKit world tracking system is based on a combination of five sensors:
- Rear RGB Camera
- LiDAR Scanner
- Gyroscope
- Accelerometer
- Magnetometer
The latter three are known as the Inertial Measurement Unit (IMU), which operates at 1000 fps. But what your RGB camera (running at 60 fps) and LiDAR scanner (also at 60 fps) see is very important too.
Hence, the stability of world tracking greatly depends on the camera image.
Here are some recommendations for high-quality tracking:
- Track only well-lit environment (if you don't have LiDAR)
- Track only static objects (not moving)
- Don't track poor-textured surfaces like white walls (if you don't have LiDAR)
- Don't track surfaces with repetitive texture pattern (like Polka Dots)
- Don't track mirrors, chrome and glass objects (reflective and refractive)
- Move your iPhone slowly when tracking
- Don't shake iPhone when tracking
- Track as much environment as possible
- Track high-contrast objects in environment (if you don't have LiDAR)
If you follow these recommendations, the coordinate system in ARKit will be stable.
And look at the picture in this SO post – it shows a good example of a trackable scene and a bad one.