Expose ARKit sensor replay functionality in the public API for development purposes

Originator: argentumko
Number: rdar://44748748
Date Originated: September 25, 2018
Status: Duplicate of 44231485 (Closed)
Resolved:
Product: ARKit
Product Version:
Classification: Suggestion
Reproducible:
 
Since ARKit 1.0, the framework has included internal functionality to record and replay the data collected in an AR session: ARRecordingTechnique, ARReplaySensor, and other related SPI. While this functionality is currently most likely used for internal purposes such as QA testing, a tool like this would be very useful for third-party developers as well. It could radically simplify iterating on an AR user experience, and it would also enable automated testing of AR features: a session recorded indoors or outdoors could be replayed within a debug build of the app, simulating real usage, with expectations asserted about the app's behaviour under the recorded circumstances. Combined with the existing UI testing mechanisms in Xcode, even user interaction with the scene could be simulated and validated, without constantly relying on a human tester to correctly reproduce the intended use case. A rough sketch of such a test follows below.
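To make the request concrete, here is a hypothetical sketch of the kind of automated test this would enable. Only the XCTest calls are real; the "-ARReplayFixture" launch argument (which would tell the app to drive its session from a bundled recording instead of the live camera) and the accessibility identifiers are assumptions for illustration.

```swift
import XCTest

/// Hypothetical UI test: assumes the app can be pointed at a recorded
/// sensor capture via a made-up "-ARReplayFixture" launch argument, and
/// that it exposes accessibility identifiers for its AR UI.
final class ARReplayUITests: XCTestCase {
    func testAnchorCanBePlacedInReplayedSession() {
        let app = XCUIApplication()
        // Assumption: the app (or a future public ARKit API) selects a
        // bundled recording named "livingRoom" instead of the camera feed.
        app.launchArguments += ["-ARReplayFixture", "livingRoom"]
        app.launch()

        // Expectation about the app's behaviour under the recorded
        // circumstances: a plane is detected and "Place" becomes available.
        let placeButton = app.buttons["place-object"]
        XCTAssertTrue(placeButton.waitForExistence(timeout: 10))
        placeButton.tap()

        // The placed object should then be reported in the scene overlay.
        XCTAssertTrue(app.otherElements["placed-object-indicator"]
            .waitForExistence(timeout: 5))
    }
}
```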

Comments

Useful for production

For us, this would be a great feature for our customers in production. We build apps that let users try on glasses, and ideally a user would record a session after taking off their own glasses. We could then replay the AR session with different virtual glasses while the user puts their own glasses back on, so they can see the result properly.

ARKit 3 supports session replay

While ARKit 3 doesn't expose full sensor recording in the public API, it does make this kind of testing possible with official tools: https://medium.com/@ethansaadia/record-replay-sessions-in-arkit-3-d81589b254c1
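For what it's worth, the replay described there is selected in the Xcode scheme rather than in code, so ordinary session code should run unchanged against replayed data. A minimal sketch, assuming a plain world-tracking configuration with horizontal plane detection:

```swift
import ARKit

/// Minimal session setup; nothing here is replay-specific. When a recording
/// is selected in the Xcode scheme, ARKit should deliver the same delegate
/// callbacks from the recorded data instead of the live camera.
final class SessionController: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        session.delegate = self
        let configuration = ARWorldTrackingConfiguration()
        configuration.planeDetection = [.horizontal]
        session.run(configuration)
    }

    // Called for live and replayed sessions alike.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor is ARPlaneAnchor {
            print("Plane detected at \(anchor.transform)")
        }
    }
}
```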

