ARMeshGeometry detail

So I started tinkering with the example provided here: https://developer.apple.com/documentation/arkit/world_tracking/visualizing_and_interacting_with_a_reconstructed_scene


The documentation states, "The LiDAR Scanner quickly retrieves depth information from a wide area in front of the user." Can someone clarify what is meant by "wide"?


I'm interested in exporting a detailed mesh representation of an Xbox One controller. The mesh that was generated for this controller is not detailed enough for my use case. How can I get a more detailed representation? Is the hardware capable of this? I've included some examples below, captured from varying distances.


https://ibb.co/Yf9jYHN (1/2 ft away)

https://ibb.co/YTsZyCF (3 ft away)

https://ibb.co/zx6LY87 (2 ft away)
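For context, one minimal way to quantify how much detail ARKit is actually producing in captures like these is to log the vertex and face counts of each ARMeshAnchor as the session updates. This is only a sketch (the MeshDetailInspector class name is illustrative); it assumes an instance is assigned as the ARSession delegate, e.g. arView.session.delegate:

import ARKit

// Minimal sketch: logs how many vertices and triangles ARKit has
// reconstructed for each mesh anchor, as a rough measure of mesh detail.
final class MeshDetailInspector: NSObject, ARSessionDelegate {

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let meshAnchor as ARMeshAnchor in anchors {
            let geometry = meshAnchor.geometry
            // geometry.vertices.count = vertex count,
            // geometry.faces.count = triangle (primitive) count.
            print("\(meshAnchor.identifier): \(geometry.vertices.count) vertices, \(geometry.faces.count) faces")
        }
    }
}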

I'm having the same question. I'm developing an app that offers users 3D prints as replicas of simple objects; the app will let them combine and customize multiple objects to create a unique item to print.


For this I would need a detailed mesh of the object. Of course, I'm aware this only works with the new iPad Pro with the LiDAR scanner, but I'm wondering how I can get a more detailed mesh.


Am I missing some configuration that I need to set up in the app, or is this feature just not available in ARKit yet?

Just curious, do you have plane detection turned on? I don't have a device to test on, but this video seems to suggest the mesh has higher detail with plane detection turned off, at around 5:20: https://developer.apple.com/videos/play/tech-talks/609/
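For anyone trying this, a minimal configuration sketch looks like the following, assuming a standard ARWorldTrackingConfiguration with scene reconstruction enabled and plane detection simply left unset (it is off by default). The makeMeshConfiguration function name is just illustrative:

import ARKit

// Minimal sketch: enable LiDAR scene reconstruction without plane detection.
// Plane detection is off by default, so .planeDetection is never set here.
func makeMeshConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }
    return configuration
}

// Usage, e.g. in viewDidAppear:
// arView.session.run(makeMeshConfiguration())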

I've tried with plane detection turned off. Same results. I really hope someone from Apple details the LiDAR capabilities. As it stands, I would hold off on buying the 2020 iPad Pro if you want to scan small objects; the fine detail is missing in the mesh data.

Did you ever resolve the detail issue with the LiDAR scanner? I don't suppose you know how to texture the mesh, do you?
