How to Use an Environment Map in ARKit

ARKit – How to generate a worldMap for a big environment?

If you want to move effectively within a large real environment with AR objects around you, you should use the whole developer's arsenal for precise positioning:

- Core Location framework – provides services for determining a device's geographic location, altitude, orientation, or position relative to a nearby iBeacon.
- iBeacon framework with certified hardware – the location awareness that iBeacons provide is especially useful for indoor navigation.
- Vision and Core ML frameworks – designed to use a trained machine learning model to classify input data such as signs and images.
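As a starting point, here is a minimal sketch of the Core Location layer. The class name and print statements are illustrative placeholders, and you also need the location usage-description keys in your Info.plist:

import CoreLocation

// A minimal location/heading provider to combine with ARKit.
// Illustrative sketch: the class name and logging are placeholders.
final class LocationProvider: NSObject, CLLocationManagerDelegate {

    private let manager = CLLocationManager()

    override init() {
        super.init()
        manager.delegate = self
        manager.desiredAccuracy = kCLLocationAccuracyBest
        manager.requestWhenInUseAuthorization()
        manager.startUpdatingLocation()
        manager.startUpdatingHeading()
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateLocations locations: [CLLocation]) {
        guard let location = locations.last else { return }
        print("lat \(location.coordinate.latitude), lon \(location.coordinate.longitude), alt \(location.altitude)")
    }

    func locationManager(_ manager: CLLocationManager,
                         didUpdateHeading newHeading: CLHeading) {
        print("heading \(newHeading.trueHeading)°")
    }
}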

Before using the aforementioned frameworks, you should scan the whole environment with your iPhone multiple times, each pass adding new feature points to the existing array of features. The picture below shows what such a point cloud looks like:

(Image: ARKit feature-point cloud of a scanned environment)
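While scanning, you can monitor ARKit's mapping progress via ARFrame.worldMappingStatus to decide when the map is good enough to save or extend. A sketch of the ARSessionDelegate callback (ViewController is an assumed host class that owns a running world-tracking session):

import ARKit

// ARSessionDelegate callback reporting how thoroughly ARKit has
// mapped the surroundings so far.
extension ViewController: ARSessionDelegate {

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        switch frame.worldMappingStatus {
        case .notAvailable, .limited:
            print("Keep scanning – not enough feature points yet")
        case .extending:
            print("Mapped area is growing – keep moving the device")
        case .mapped:
            print("Environment sufficiently mapped")
        @unknown default:
            break
        }
    }
}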

P.S.

In addition to the above, ARKit 4.0 offers developers tools such as ARGeoTrackingConfiguration with ARGeoAnchors, and the Scene Reconstruction feature (which works when your device is equipped with a LiDAR scanner).
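A minimal sketch of how you might opt into these ARKit 4.0 features, falling back when the device doesn't support them (the function name is illustrative; ARGeoTrackingConfiguration also offers checkAvailability(completionHandler:) to verify map coverage at the user's actual location):

import ARKit

// Pick the best configuration the current device supports.
func makeConfiguration() -> ARConfiguration {
    // Geo tracking needs a recent device, GPS, and Apple Maps coverage.
    if ARGeoTrackingConfiguration.isSupported {
        return ARGeoTrackingConfiguration()
    }
    let configuration = ARWorldTrackingConfiguration()
    // Scene Reconstruction requires a LiDAR scanner.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    return configuration
}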

Using ARKit 2.0's AREnvironmentProbeAnchor.environmentTexture in Unity to generate reflection probes

As per @rickster's suggestion, I had a look at the ARKit 2.0 Unity plugin's implementation, and I managed to actually use the AREnvironmentProbeAnchor.environmentTexture inside my Unity scene.

MTLTexture has a property named textureType, an enum value that is .typeCubeArray for the texture returned by AREnvironmentProbeAnchor.environmentTexture. This is explained thoroughly on the MTLTexture documentation page.

An MTLTexture with a textureType of .typeCubeArray means that when you pass a pointer to this MTLTexture to the Unity side, you can use it to create a Cubemap, which you can then use as your reflection probe's environment texture. Here's roughly how things work on the Unity side:

using System;
using System.Runtime.InteropServices;
using UnityEngine;

public class EnvironmentProbeBridge : MonoBehaviour
{
    // The iOS side passes the pointer to your MTLTexture as an IntPtr.
    [DllImport("__Internal")]
    public static extern IntPtr GetEnvironmentTexture();

    void AddNewProbe()
    {
        var texturePtr = GetEnvironmentTexture();
        if (texturePtr == IntPtr.Zero)
        {
            return; // no environment texture available yet
        }

        // Wrap the native Metal texture in a Unity Cubemap without copying it.
        var cubemap = Cubemap.CreateExternalTexture(0, TextureFormat.R8, false, texturePtr);

        // A probe in Custom mode uses the texture we assign instead of baking one.
        var probeComponent = gameObject.AddComponent<ReflectionProbe>();
        probeComponent.mode = UnityEngine.Rendering.ReflectionProbeMode.Custom;
        probeComponent.customBakedTexture = cubemap;
    }
}

On the iOS side, you just need a method with the name you declared on the Unity side that returns a pointer to your MTLTexture. Its return type can be void* or id<MTLTexture>; both work fine. The method should be placed in your Unity bridge so it's visible to the Unity side (in the sketch below, SessionBridge is a placeholder for wherever you cache the latest probe anchor):

extern "C" void* GetEnvironmentTexture() {
AREnvironmentProbeAnchor* anchor = [self updatedEnvironmentProbeAnchor];
return (__bridge_retained void*) [anchor environmentTexture];
}
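Note that __bridge_retained hands you a +1 reference each call; if you fetch a fresh texture every frame, balance the old pointer (e.g. with CFRelease) or you will leak textures over time.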

You can (and frankly, should) modify and improve this to fit your needs.

How to make natural lighting in ARKit?

You can add lighting to an SCNMaterial by choosing one of the lightingModel values, e.g.:

(Image: the available SCNMaterial.LightingModel options, such as .constant, .lambert, .blinn, .phong, and .physicallyBased)

To add one of these to an SCNMaterial, all you need to do is the following:

material.lightingModel = .constant 
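Each model trades realism for cost: .constant ignores scene lights entirely, while .physicallyBased reacts to environment lighting and, combined with ARKit's environment texturing, usually looks the most natural.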

You can also make objects appear more realistic by making use of the following SCNView property:

var autoenablesDefaultLighting: Bool { get set }

autoenablesDefaultLighting is simply a Boolean value that determines whether SceneKit automatically adds lights to a scene or not.

By default this is set to false, meaning that:

the only light sources SceneKit uses for rendering a scene are those contained in the scene graph.

If, on the other hand, this is set to true:

SceneKit automatically adds and places an omnidirectional light source when rendering scenes that contain no lights or only contain ambient lights.

To apply this setting to an SCNView, all you need to do is the following:

augmentedRealityScene.autoenablesDefaultLighting = true
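For lighting that actually follows the real scene rather than a fixed default light, you can also let ARKit's light estimation drive SceneKit. A minimal sketch, assuming augmentedRealityScene is the ARSCNView used above:

import ARKit

// Let ARKit estimate real-world lighting and update SceneKit to match.
augmentedRealityScene.automaticallyUpdatesLighting = true

let configuration = ARWorldTrackingConfiguration()
configuration.isLightEstimationEnabled = true
// ARKit 2.0+: generate environment textures for realistic reflections.
configuration.environmentTexturing = .automatic
augmentedRealityScene.session.run(configuration)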

In addition to these suggestions, you can also create different types of lights to add to your scene, e.g.:

(Image: the available SCNLight types, such as .ambient, .directional, .omni, .probe, and .spot)

func createDirectionalLight() {
    let directionalNode = SCNNode()
    directionalNode.light = SCNLight()
    directionalNode.light?.type = .directional
    directionalNode.light?.intensity = 1000
    directionalNode.light?.color = UIColor.white
    // Shadow casting is a property of the light itself.
    directionalNode.light?.castsShadow = true
    directionalNode.position = SCNVector3Zero

    // The light only takes effect once the node is in the scene graph.
    augmentedRealityScene.scene.rootNode.addChildNode(directionalNode)
}

Hope this helps...

iOS ARKit saving map across multiple sessions

Nope – not in ARKit 1.x.

In ARKit 1.x there is no API for accessing the data ARKit uses internally for position/orientation tracking, nor for telling ARKit to save/restore such data itself.

Since ARKit 2.0, however, ARWorldMap covers exactly this use case: you can serialize the session's world map (including its anchors) with session.getCurrentWorldMap, persist it, and hand it to a later session via ARWorldTrackingConfiguration.initialWorldMap.
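A sketch of the ARWorldMap save/restore round trip (mapURL is an illustrative file location, and error handling is kept minimal):

import ARKit

// Save the current world map to disk (ARKit 2.0+).
func saveWorldMap(from session: ARSession, to mapURL: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let worldMap = worldMap else {
            print("Can't get world map: \(error?.localizedDescription ?? "unknown")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                        requiringSecureCoding: true)
            try data.write(to: mapURL, options: .atomic)
        } catch {
            print("Can't save world map: \(error)")
        }
    }
}

// Restore it in a later session; ARKit relocalizes against the saved map.
func loadWorldMap(from mapURL: URL, into session: ARSession) throws {
    let data = try Data(contentsOf: mapURL)
    guard let worldMap = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                                from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = worldMap
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}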


