@Francesco_Esimple from my experience, I was able to get the PhotogrammetrySession running on the server side for processing and reconstructing a 3D model from data captured on my iPhone. The captured data consisted of 1. HEIC images and 2. a "Snapshot" folder containing data from the ObjectCaptureSession, which it saved to its checkpointDirectory (check out the docs here).
I used Vapor as my server-side Swift framework, imported RealityKit's PhotogrammetrySession, and set its checkpointDirectory property to the directory that the iPhone's ObjectCaptureSession saved to its own checkpointDirectory and then uploaded to the server. By doing this, the server-side PhotogrammetrySession was able to use the point cloud and other captured data, together with the images, to reconstruct the 3D model.
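The server-side setup described above can be sketched roughly as follows; the directory paths here are hypothetical stand-ins for wherever your Vapor app stores the uploaded images and Snapshot folder:

```swift
import Foundation
import RealityKit // PhotogrammetrySession (macOS 12+ / iOS 17+)

// Hypothetical paths where the upload handler stored the iPhone's data.
let imagesDirectory = URL(fileURLWithPath: "/uploads/capture-123/Images")
let snapshotDirectory = URL(fileURLWithPath: "/uploads/capture-123/Snapshots")
let outputURL = URL(fileURLWithPath: "/uploads/capture-123/model.usdz")

var configuration = PhotogrammetrySession.Configuration()
// Point the server-side session at the checkpoint data saved by the
// on-device ObjectCaptureSession so its point cloud is reused.
configuration.checkpointDirectory = snapshotDirectory

let session = try PhotogrammetrySession(input: imagesDirectory,
                                        configuration: configuration)
try session.process(requests: [.modelFile(url: outputURL)])

// Consume the output stream (e.g. inside an async Vapor route handler).
for try await output in session.outputs {
    switch output {
    case .processingComplete:
        print("Reconstruction finished: \(outputURL.path)")
    case .requestError(_, let error):
        print("Request failed: \(error)")
    default:
        break
    }
}
```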
I wasn't able to access the inner bits of data saved to the checkpointDirectory, though. The folder itself was a random collection of folders and .bin files, and opening them in a hex editor I was only able to decipher bundle paths like com.oc.PointCloud. Hopefully more features and documentation will come out to help us figure these things out.
I found out that setting .checkpointDirectory on the ObjectCaptureSession to snapshot the current capture (including the point cloud), passing that snapshot to the server, and setting the server-side PhotogrammetrySession.Configuration.checkpointDirectory to the snapshot URL causes processing to use the point cloud data captured during the ObjectCaptureSession on iOS.
Importing the snapshot from checkpointDirectory significantly increases the quality of the result for the featureless object shown above (my AirPods Pro). I believe this is the answer to my question about utilizing the point cloud data.
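On the iOS side, the snapshotting step amounts to giving the ObjectCaptureSession a checkpoint directory before starting the capture. A minimal sketch, with hypothetical local directories (the Snapshots folder is what later gets uploaded to the server):

```swift
import Foundation
import RealityKit // ObjectCaptureSession (iOS 17+)

// Hypothetical local working directories for this capture.
let imagesDirectory = FileManager.default.temporaryDirectory
    .appendingPathComponent("Images", isDirectory: true)
let snapshotsDirectory = FileManager.default.temporaryDirectory
    .appendingPathComponent("Snapshots", isDirectory: true)

var configuration = ObjectCaptureSession.Configuration()
// Ask the session to checkpoint its state (including the point cloud) here.
configuration.checkpointDirectory = snapshotsDirectory

let session = ObjectCaptureSession()
session.start(imagesDirectory: imagesDirectory, configuration: configuration)
// ... run the capture UI (ObjectCaptureView(session: session)), then upload
// both imagesDirectory and snapshotsDirectory to the server.
```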
Sorry folks, I know this is a duplicate of another post I submitted. I thought the original post hadn't gone through, and there is no feature in the Apple Developer Forums to delete a post...
If you really want to try out the framework before they release the sample app, I have created an app with basic usage of the Object Capture framework on iOS.
You can try it out here, and hopefully file some issues to help improve it.
I believe the new ObjectCaptureSession and ObjectCaptureView are only available on iOS 17.
iOS 17 is not available on Xcode 14.2.
This means that you are required to download Xcode 15.0 (beta), link here, in order to try out the new ObjectCaptureSession and ObjectCaptureView APIs.
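Since the APIs are iOS 17-only, any app that also targets earlier versions needs to gate them behind availability checks. A minimal sketch (view names are my own; ObjectCaptureSession also exposes an isSupported check, since not every iOS 17 device supports Object Capture):

```swift
import SwiftUI
import RealityKit // ObjectCaptureSession, ObjectCaptureView (iOS 17+)

struct CaptureContainerView: View {
    var body: some View {
        if #available(iOS 17.0, *) {
            if ObjectCaptureSession.isSupported {
                CaptureView()
            } else {
                Text("Object Capture is not supported on this device.")
            }
        } else {
            Text("Object Capture requires iOS 17.")
        }
    }
}

@available(iOS 17.0, *)
struct CaptureView: View {
    @State private var session = ObjectCaptureSession()
    var body: some View {
        ObjectCaptureView(session: session)
    }
}
```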