Does anyone know if there's a native library or API for aligning two point clouds in Swift/ARKit, such as ICP? I know it's possible to do this in Python with the Open3D library: http://www.open3d.org/docs/release/tutorial/pipelines/icp_registration.html
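To make the question concrete, here is roughly the step I would otherwise have to hand-roll: a toy point-to-point ICP iteration in plain Swift/simd. It uses brute-force correspondences and recovers the rotation with a polar-decomposition iteration instead of a proper SVD, so please treat it only as a sketch of the operation I'm asking about, not as working registration code.

```swift
import simd

// Toy point-to-point ICP step, purely illustrative: brute-force nearest
// neighbours, and the rotation comes from a Newton polar-decomposition
// iteration rather than an SVD, so it assumes a non-degenerate,
// reflection-free cross-covariance. Not meant for real point clouds.
func icpStep(source: [SIMD3<Double>], target: [SIMD3<Double>])
    -> (rotation: simd_double3x3, translation: SIMD3<Double>) {
    precondition(!source.isEmpty && !target.isEmpty)

    // 1. Brute-force nearest-neighbour correspondences (O(n*m), fine for a toy).
    let matched = source.map { p in
        target.min { simd_length_squared($0 - p) < simd_length_squared($1 - p) }!
    }

    // 2. Centroids of both correspondence sets.
    let n = Double(source.count)
    let sourceCentroid = source.reduce(SIMD3<Double>.zero, +) / n
    let targetCentroid = matched.reduce(SIMD3<Double>.zero, +) / n

    // 3. Cross-covariance H = sum over pairs of (p - cs)(q - ct)^T.
    var H = simd_double3x3()
    for (p, q) in zip(source, matched) {
        let a = p - sourceCentroid
        let b = q - targetCentroid
        // Outer product a * b^T, built column by column (column j is a * b_j).
        H = H + simd_double3x3(columns: (a * b.x, a * b.y, a * b.z))
    }

    // 4. The optimal rotation is the orthogonal polar factor of H^T;
    //    the Newton iteration X <- (X + X^-T) / 2 converges to it.
    var R = H.transpose
    for _ in 0..<30 {
        R = 0.5 * (R + R.inverse.transpose)
    }

    // 5. Translation that carries the source centroid onto the target centroid.
    let t = targetCentroid - R * sourceCentroid
    return (R, t)
}
```

In practice you'd want a spatial index for the correspondences, outlier rejection, and iteration until convergence, which is exactly why I'd rather use a native, optimized API if one exists.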
It seems like ARWorldMap would need some sort of similar capability in order to track and resume a session from the same state later, so I'm curious if anyone knows how it works under the hood.
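For reference, by "resume from the same state" I mean the documented save/relocalize round trip sketched below (the file handling and function names are just placeholders). Whatever alignment ARKit runs internally to line the restored map up with the live feature points is the part I'd like to understand.

```swift
import ARKit

// Sketch of the ARWorldMap round trip: capture the current map, persist it,
// then feed it back into a new session so ARKit relocalizes against it.
func saveWorldMap(from session: ARSession, to url: URL) {
    session.getCurrentWorldMap { worldMap, error in
        guard let map = worldMap else {
            print("Couldn't get world map: \(error?.localizedDescription ?? "unknown error")")
            return
        }
        do {
            let data = try NSKeyedArchiver.archivedData(withRootObject: map,
                                                        requiringSecureCoding: true)
            try data.write(to: url, options: .atomic)
        } catch {
            print("Failed to save world map: \(error)")
        }
    }
}

func restoreSession(_ session: ARSession, from url: URL) throws {
    let data = try Data(contentsOf: url)
    guard let map = try NSKeyedUnarchiver.unarchivedObject(ofClass: ARWorldMap.self,
                                                           from: data) else { return }
    let configuration = ARWorldTrackingConfiguration()
    configuration.initialWorldMap = map   // ARKit relocalizes against this map
    session.run(configuration, options: [.resetTracking, .removeExistingAnchors])
}
```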
Or if you have any suggestions, I'd appreciate some pointers. Thank you.
There's a similar post on Stack Overflow, and multiple people have reported this (you'll encounter it if you run the app for around 10 minutes). I'm hoping this can get Apple's attention somehow.
After downloading the sample project (https://developer.apple.com/documentation/avfoundation/additional_data_capture/capturing_depth_using_the_lidar_camera) and running the Swift sample code on an iPhone 14 Pro, the app crashes intermittently with this error:
Execution of the command buffer was aborted due to an error during execution. Caused GPU Timeout Error (00000002:kIOGPUCommandBufferCallbackErrorTimeout)
Sometimes it will crash within a few seconds, sometimes it can take around 10 minutes.
Has anyone here experienced this crash with the sample code, or while using the LiDAR camera?
I have spent a long time trying to solve this issue; I have searched the web high and low and submitted (I think) a report to Apple about it. I am unable to get Xcode to show me the line of code where the crash is happening.
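In case it helps anyone narrow this down: since Xcode won't point at the failing line, I've been experimenting with Metal's extended command-buffer error reporting (iOS 14 and later) to at least see which encoder is running when the timeout hits. This is only a sketch in a standalone helper, not the sample's own command-buffer setup:

```swift
import Metal

// Opt in to extended command-buffer error info so a GPU timeout at least
// reports which encoders were affected. Names here are illustrative; in the
// LiDAR sample you'd adapt wherever it creates its command buffers.
func makeDiagnosticCommandBuffer(queue: MTLCommandQueue) -> MTLCommandBuffer? {
    let descriptor = MTLCommandBufferDescriptor()
    descriptor.errorOptions = .encoderExecutionStatus   // iOS 14+ / macOS 11+

    guard let commandBuffer = queue.makeCommandBuffer(descriptor: descriptor) else {
        return nil
    }

    commandBuffer.addCompletedHandler { buffer in
        guard let error = buffer.error else { return }
        let nsError = error as NSError
        print("Command buffer failed: \(nsError.localizedDescription)")

        // Each encoder that ran in this buffer reports its own execution state.
        if let infos = nsError.userInfo[MTLCommandBufferEncoderInfoErrorKey]
            as? [MTLCommandBufferEncoderInfo] {
            for info in infos {
                print("encoder \(info.label): state \(info.errorState.rawValue), "
                      + "signposts \(info.debugSignposts)")
            }
        }
    }
    return commandBuffer
}
```

With .encoderExecutionStatus enabled, the completion handler at least tells you which encoders were pending or faulted when the GPU timed out, which is more than the crash log gives me.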
Any help would be greatly appreciated.