Expose LiDAR depth data in AVCapturePhoto

Originator: carlo.rapisarda
Number: rdar://FB7783324    Date Originated: 2020-06-25
Status: Open    Resolved: -
Product: AVFoundation    Product Version:
Classification: Suggestion    Reproducible:
 
Since iOS 11, developers have been able to access per-pixel depth data as part of an AVCapturePhoto instance; the measurements can be either relative (from dual-camera systems) or absolute (from the TrueDepth camera). Unfortunately, there is currently no way to obtain depth data from the LiDAR sensor on the iPad Pro when taking a capture with AVFoundation.
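For context, this is the existing depth-delivery path the report refers to; a minimal sketch (session setup, error handling, and delegate retention omitted) of enabling depth delivery on an AVCapturePhotoOutput and reading the AVDepthData back from the resulting photo:

```swift
import AVFoundation

// Delegate receiving the finished capture. Since iOS 11, photo.depthData
// carries per-pixel depth/disparity -- but only from dual-camera or
// TrueDepth devices, not from the LiDAR sensor.
final class DepthCaptureDelegate: NSObject, AVCapturePhotoCaptureDelegate {
    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        if let depthData = photo.depthData {
            // depthDataMap is a CVPixelBuffer of depth (or disparity) values.
            let map = depthData.depthDataMap
            print("Depth map: \(CVPixelBufferGetWidth(map))x\(CVPixelBufferGetHeight(map))")
        }
    }
}

// Depth delivery must be enabled on both the output and the per-capture settings.
func makeDepthCaptureSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    output.isDepthDataDeliveryEnabled = output.isDepthDataDeliverySupported
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
    return settings
}
```

On a LiDAR-equipped iPad Pro using the rear camera, `isDepthDataDeliverySupported` does not expose the LiDAR measurements through this path, which is the gap this report describes.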

ARKit 4 does allow developers to obtain depth data from the LiDAR sensor, but those measurements come paired with lower-quality RGB data and lack the metadata that an AVCapturePhoto provides.
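A sketch of that ARKit 4 route, to make the trade-off concrete: LiDAR depth arrives per ARFrame via sceneDepth, tied to the lower-resolution AR camera feed rather than to a full-quality photo capture (view setup and session lifecycle omitted):

```swift
import ARKit

// Run a world-tracking session with scene depth enabled (iOS 14+,
// LiDAR devices only).
func startSceneDepthSession(on session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
        configuration.frameSemantics.insert(.sceneDepth)
    }
    session.run(configuration)
}

// Read LiDAR depth from a frame, e.g. inside session(_:didUpdate:).
func readDepth(from frame: ARFrame) {
    // ARDepthData provides depthMap (Float32 meters per pixel) and a
    // confidenceMap; the only RGB available alongside it is the AR
    // feed's capturedImage, not a full-resolution AVCapturePhoto.
    if let sceneDepth = frame.sceneDepth {
        let depthMap = sceneDepth.depthMap
        print("LiDAR depth: \(CVPixelBufferGetWidth(depthMap))x\(CVPixelBufferGetHeight(depthMap))")
    }
}
```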

Exposing per-pixel depth data (with absolute accuracy) from the rear camera system in AVFoundation would enable numerous new applications – for instance, computer-vision solutions that rely on real-world measurements to resolve ambiguities while operating on full-resolution images.

Please consider adding support for LiDAR depth data in AVFoundation as part of the AVDepthData class.

--

Additional details containing confidential information were not included here.
