Raw depth map SDK for iPhone X
You'll need at least the iOS 11.1 SDK in Xcode 9.1 (both in beta as of this writing). With that, builtInTrueDepthCamera
becomes one of the camera types you use to select a capture device:
let device = AVCaptureDevice.default(.builtInTrueDepthCamera, for: .video, position: .front)
Then you can go on to set up an AVCaptureSession
with the TrueDepth camera device, and can use that capture session to capture depth information much like you can with the back dual camera on iPhone 7 Plus and 8 Plus:
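As a minimal sketch of that setup (names like the queue label are arbitrary, and error handling is simplified), the session configuration might look like this:

```swift
import AVFoundation

// Minimal sketch: configure a capture session with the TrueDepth camera
// and a photo output capable of depth delivery.
let session = AVCaptureSession()
session.beginConfiguration()
session.sessionPreset = .photo

guard let device = AVCaptureDevice.default(.builtInTrueDepthCamera,
                                           for: .video,
                                           position: .front),
      let input = try? AVCaptureDeviceInput(device: device),
      session.canAddInput(input)
else { fatalError("TrueDepth camera unavailable") }
session.addInput(input)

let photoOutput = AVCapturePhotoOutput()
guard session.canAddOutput(photoOutput) else { fatalError("Can't add photo output") }
session.addOutput(photoOutput)
// Depth delivery must be enabled on the output (after it's added to the
// session) before it can be requested in per-capture photo settings.
photoOutput.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliverySupported

session.commitConfiguration()
session.startRunning()
```

Note the ordering: `isDepthDataDeliverySupported` only reports `true` once the output is attached to a session whose input supports depth.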
- Turn on depth capture for photos with AVCapturePhotoOutput.isDepthDataDeliveryEnabled, then snap a picture with AVCapturePhotoSettings.isDepthDataDeliveryEnabled. You can read the depthData from the AVCapturePhoto object you receive after the capture, or turn on embedsDepthDataInPhoto if you just want to fire and forget (and read the data from the captured image file later).
- Get a live feed of depth maps with AVCaptureDepthDataOutput. That one is like the video data output; instead of recording directly to a movie file, it gives your delegate a timed sequence of image (or in this case, depth) buffers. If you're also capturing video at the same time, AVCaptureDataOutputSynchronizer might be handy for making sure you get coordinated depth maps and color frames together.
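For the live-feed route, a rough sketch of wiring up AVCaptureDepthDataOutput could look like the following (the class name and queue label are made up for illustration; it assumes a session already configured with the TrueDepth camera input):

```swift
import AVFoundation

// Sketch: receive a live stream of depth maps from the TrueDepth camera.
// Assumes `session` already has a builtInTrueDepthCamera input attached.
class DepthReceiver: NSObject, AVCaptureDepthDataOutputDelegate {
    let depthOutput = AVCaptureDepthDataOutput()

    func attach(to session: AVCaptureSession) {
        guard session.canAddOutput(depthOutput) else { return }
        session.addOutput(depthOutput)
        depthOutput.isFilteringEnabled = true // interpolate holes in the map
        depthOutput.setDelegate(self, callbackQueue: DispatchQueue(label: "depth"))
    }

    func depthDataOutput(_ output: AVCaptureDepthDataOutput,
                         didOutput depthData: AVDepthData,
                         timestamp: CMTime,
                         connection: AVCaptureConnection) {
        // Each callback delivers one depth map as a CVPixelBuffer.
        let map = depthData.depthDataMap
        // ... process or copy `map` quickly, then return so the buffer
        // can be recycled by the capture system.
        _ = map
    }
}
```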
As Apple's Device Compatibility documentation notes, you need to select the builtInTrueDepthCamera
device to get any of these depth capture options. If you select the front-facing builtInWideAngleCamera
, it becomes like any other selfie camera, capturing only photo and video.
Just to emphasize: from an API point of view, capturing depth with the front-facing TrueDepth camera on iPhone X is a lot like capturing depth with the back-facing dual cameras on iPhone 7 Plus and 8 Plus. So if you want a deep dive on how all this depth capture business works in general, and what you can do with captured depth information, check out the WWDC17 Session 507: Capturing Depth in iPhone Photography talk.
Save depth images from TrueDepth camera
There's no standard video format for "raw" depth/disparity maps, which might have something to do with AVCapture not really offering a way to record it.
You have a couple of options worth investigating here:
- Convert depth maps to grayscale textures (which you can do using the code in the AVCamPhotoFilter sample code), then pass those textures to AVAssetWriter to produce a grayscale video. Depending on the video format and grayscale conversion method you choose, other software you write for reading the video might be able to recover depth/disparity info with sufficient precision for your purposes from the grayscale frames.
- Anytime you have a CVPixelBuffer, you can get at the data yourself and do whatever you want with it. Use CVPixelBufferLockBaseAddress (with the readOnly flag) to make sure the content won't change while you read it, then copy data from the pointer CVPixelBufferGetBaseAddress provides to wherever you want. (Use other pixel buffer functions to see how many bytes to copy, and unlock the buffer when you're done.)

Watch out, though: if you spend too much time copying from buffers, or otherwise retain them, they won't get deallocated as new buffers come in from the capture system, and your capture session will hang. (All told, it's unclear without testing whether a device has the memory & I/O bandwidth for much recording this way.)
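The lock-copy-unlock dance for the second option can be sketched like this (the function name is made up; it assumes a non-planar depth format such as disparity Float16, where a single base address covers the whole buffer):

```swift
import AVFoundation

// Sketch: copy the raw bytes out of a depth map's CVPixelBuffer so the
// capture system can recycle the buffer immediately afterward.
// Assumes a non-planar pixel format (e.g. kCVPixelFormatType_DisparityFloat16).
func copyDepthBytes(from pixelBuffer: CVPixelBuffer) -> Data? {
    // Lock read-only so the contents can't change while we read.
    CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let base = CVPixelBufferGetBaseAddress(pixelBuffer) else { return nil }
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    // Rows may be padded, so size the copy with bytesPerRow, not width.
    return Data(bytes: base, count: bytesPerRow * height)
}
```

Copying into a Data and returning quickly keeps the delegate callback short, which is exactly what avoids the buffer-starvation hang described above.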