Save depth images from TrueDepth camera

There's no standard video format for "raw" depth/disparity maps, which might have something to do with AVCapture not really offering a way to record them.

You have a couple of options worth investigating here:

  1. Convert depth maps to grayscale textures (which you can do using the code in the AVCamPhotoFilter sample code), then pass those textures to AVAssetWriter to produce a grayscale video. Depending on the video format and grayscale conversion method you choose, other software you write for reading the video might be able to recover depth/disparity info with sufficient precision for your purposes from the grayscale frames.

  2. Anytime you have a CVPixelBuffer, you can get at the data yourself and do whatever you want with it. Use CVPixelBufferLockBaseAddress (with the readOnly flag) to make sure the content won't change while you read it, then copy data from the pointer CVPixelBufferGetBaseAddress provides to wherever you want. (Use other pixel buffer functions to see how many bytes to copy, and unlock the buffer when you're done.)

    Watch out, though: if you spend too much time copying from buffers, or otherwise retain them, they won't get deallocated as new buffers come in from the capture system, and your capture session will hang. (All told, it's unclear without testing whether a device has the memory & I/O bandwidth for much recording this way.)
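A minimal sketch of option 2, assuming you receive depth frames as CVPixelBuffers (the helper name copyDepthData is hypothetical, not an API):

```swift
import CoreVideo
import Foundation

// Hypothetical helper: copy the raw bytes of a depth/disparity pixel buffer
// into a Data value you can write to a file or accumulate in memory.
func copyDepthData(from pixelBuffer: CVPixelBuffer) -> Data? {
    // Lock read-only so the capture system can't mutate the buffer while we read it.
    guard CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly) == kCVReturnSuccess else {
        return nil
    }
    defer { CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    guard let baseAddress = CVPixelBufferGetBaseAddress(pixelBuffer) else {
        return nil
    }
    // bytesPerRow * height covers the whole plane, including any row padding.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(pixelBuffer)
    let height = CVPixelBufferGetHeight(pixelBuffer)
    return Data(bytes: baseAddress, count: bytesPerRow * height)
}
```

Per the warning above, copy and return quickly so the capture system can recycle the buffer.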

Does AVDepthData capture depth data from the selfie camera on iPhone X?

Yes. The WWDC 2017 session "Capturing Depth in iPhone Photography" (https://developer.apple.com/videos/play/wwdc2017/507/) covers this for the dual-camera setup on the back, and it uses AVDepthData to return depth information. The TrueDepth camera uses the same protocol.

How to get camera calibration data from the TrueDepth camera on iOS?

What you need for Vision is a CVPixelBuffer (among other options), which you get from photo.depthData.depthDataMap.

guard let depthData = photo.depthData else { return } // depthData is optional on AVCapturePhoto
let depthBuffer = depthData.depthDataMap // CVPixelBuffer (orientation needs to be handled separately)

if depthData.depthDataQuality == .low {
    print("Low depth quality...")
}

if depthData.depthDataAccuracy == .relative {
    print("Depth data not accurate (relative)")
}

To get a UIImage from a CVPixelBuffer, see this answer.
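For reference, one common route (a sketch, not necessarily the linked answer's exact code) goes through Core Image:

```swift
import CoreImage
import UIKit

// Sketch: wrap the pixel buffer in a CIImage, render it to a CGImage,
// then wrap that in a UIImage. Orientation is not handled here.
func makeUIImage(from pixelBuffer: CVPixelBuffer) -> UIImage? {
    let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
    let context = CIContext()
    guard let cgImage = context.createCGImage(ciImage, from: ciImage.extent) else {
        return nil
    }
    return UIImage(cgImage: cgImage)
}
```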

How to capture depth data from camera in iOS 11 and Swift 4?

First, you need to use the dual camera, otherwise you won't get any depth data.

let device = AVCaptureDevice.default(.builtInDualCamera, for: .video, position: .back)

And keep a reference to your queue:

let dataOutputQueue = DispatchQueue(label: "data queue", qos: .userInitiated, attributes: [], autoreleaseFrequency: .workItem)

You'll also probably want to synchronize the video and depth data:

var outputSynchronizer: AVCaptureDataOutputSynchronizer?

Then you can synchronize the two outputs in your viewDidLoad() method like this:

if sessionOutput?.isDepthDataDeliverySupported == true {
    sessionOutput?.isDepthDataDeliveryEnabled = true
    depthDataOutput?.connection(with: .depthData)?.isEnabled = true
    depthDataOutput?.isFilteringEnabled = true
    outputSynchronizer = AVCaptureDataOutputSynchronizer(dataOutputs: [sessionOutput!, depthDataOutput!])
    outputSynchronizer!.setDelegate(self, queue: dataOutputQueue)
}
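Once the synchronizer has a delegate, frames arrive in the AVCaptureDataOutputSynchronizerDelegate callback. A sketch of pulling the depth data out of each synchronized collection (assuming depthDataOutput is your AVCaptureDepthDataOutput property from above):

```swift
func dataOutputSynchronizer(_ synchronizer: AVCaptureDataOutputSynchronizer,
                            didOutput synchronizedDataCollection: AVCaptureSynchronizedDataCollection) {
    // Look up the entry for the depth output; the frame may be missing or dropped.
    guard let syncedDepth = synchronizedDataCollection.synchronizedData(for: depthDataOutput!)
            as? AVCaptureSynchronizedDepthData,
          !syncedDepth.depthDataWasDropped else {
        return
    }
    let depthData = syncedDepth.depthData // AVDepthData for this frame
    // ... use depthData.depthDataMap here ...
}
```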

I would recommend watching WWDC session 507; Apple also provides a full sample app that does exactly what you want.

https://developer.apple.com/videos/play/wwdc2017/507/

Can I require a user to have a TrueDepth camera to download my app from the App Store?

No.

Apple's mechanism for segregating App Store listings by device capabilities does include a front-depth-camera key. However, that key is not enabled for use by third-party apps, and Apple doesn't include it in the list of device capabilities that third-party apps can use to limit App Store availability of an app. If you include that key in your app's Info.plist, it has no effect on the App Store — your app will still be offered to devices without a TrueDepth camera.

Unless/until that changes, you can't really make an app that absolutely requires the TrueDepth camera. App Store guidelines require that baseline app functionality is the same across supported devices.

Instead, treat features based on the depth camera as secondary or supplementary to your app's core feature set — for example, if you have an app that adds visual effects to selfie-cam imagery, offer ARKit-based effects on devices that support face tracking and simpler effects on devices that don't. (Check ARFaceTrackingConfiguration.isSupported to see if you're running on the right hardware.)
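A sketch of that capability check (the two effect functions are hypothetical placeholders for your own code):

```swift
import ARKit

// Branch on hardware support rather than gating the whole app.
if ARFaceTrackingConfiguration.isSupported {
    // TrueDepth hardware present: offer the richer, face-tracked effects.
    enableFaceTrackedEffects() // hypothetical
} else {
    // Fall back to simpler effects that work with any front camera.
    enableBasicEffects() // hypothetical
}
```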


