So I started tinkering with the example provided here: https://developer.apple.com/documentation/arkit/world_tracking/visualizing_and_interacting_with_a_reconstructed_scene
The document states, "The LiDAR Scanner quickly retrieves depth information from a wide area in front of the user". Can someone clarify what is meant by "wide" here?
I'm interested in exporting a detailed mesh representation of an Xbox One controller. The mesh that was generated for this controller is not detailed enough for my use case. How can I get a more detailed representation? Is the hardware capable of this? I've included some examples below, captured at varying distances.
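For context, here is roughly what I'm doing to capture the mesh, based on the sample project linked above. This is a minimal sketch (assumes a LiDAR-equipped device and an already-created `ARSession`); the delegate method name and session variable are just illustrative:

```swift
import ARKit

// Sketch: enable scene reconstruction, then read vertex counts from
// the ARMeshAnchors that ARKit produces.
func startReconstruction(session: ARSession) {
    let configuration = ARWorldTrackingConfiguration()
    // Scene reconstruction is only supported on LiDAR-equipped devices.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    session.run(configuration)
}

// Called from an ARSessionDelegate; each ARMeshAnchor carries a chunk
// of the reconstructed world mesh as an ARMeshGeometry.
func logMeshDetail(anchors: [ARAnchor]) {
    for case let meshAnchor as ARMeshAnchor in anchors {
        // geometry.vertices is an ARGeometrySource backed by an MTLBuffer;
        // the vertex count is a rough proxy for how detailed the mesh is.
        let vertexCount = meshAnchor.geometry.vertices.count
        let faceCount = meshAnchor.geometry.faces.count
        print("mesh anchor: \(vertexCount) vertices, \(faceCount) faces")
    }
}
```

The vertex/face counts I see for the controller region are quite low, which is why I'm asking whether the hardware can produce a denser mesh or whether this is the expected resolution.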
https://ibb.co/Yf9jYHN 1/2 ft away
https://ibb.co/YTsZyCF 3 ft away
https://ibb.co/zx6LY87 2 ft away