I do not know how to run it.
How to run Object Capture App?
I'm just gonna take it all from the beginning:
- Check that your computer is capable of running the app. The first machine I tried (a 13" MacBook Pro from 2016) wasn't powerful enough. I'm not sure of the exact requirements, but I remember reading somewhere that it needs a GPU with at least 4 GB of VRAM.
- Update macOS to Monterey with your Apple Developer account from here: https://developer.apple.com/download/
- Download Xcode 13.0 beta from there as well: https://developer.apple.com/download/
- Download the photogrammetry example app from here and unzip it: https://developer.apple.com/documentation/realitykit/creating_a_photogrammetry_command-line_app
- Open the HelloPhotogrammetry.xcodeproj file in Xcode 13.0 beta (wait for it to finish indexing the project).
- In the menu bar, select Product > Scheme > Edit Scheme.
- Select Run in the left sidebar, and in the main area under "Arguments Passed On Launch" add three entries:
  - /Users/YOURUSER/Desktop/FOLDERWITHPHOTOS
  - /Users/YOURUSER/Desktop/OUTPUTNAME.usdz
  - -d reduced
- Close the scheme settings and press the play button at the top of Xcode.
You should see the process logged in the console. Notice that every now and then it outputs a line like this:
Progress(request = modelFile(url: file:///Users/YOURUSER/Desktop/OUTPUTNAME.usdz, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.reduced, geometry: nil) = 0.156666666865348816
That trailing decimal is the current processing progress. When it reaches 1.0, your model is complete and is saved automatically.
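If you only care about the progress number, you can filter it out of the console output. This is just a sketch that matches the log format shown above; the regular expression is mine, not something from the tool itself:

```shell
# Extract the trailing progress value from a HelloPhotogrammetry
# progress line (the sample line is copied from the console output above).
line='Progress(request = modelFile(url: file:///Users/YOURUSER/Desktop/OUTPUTNAME.usdz, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.reduced, geometry: nil) = 0.156666666865348816'

# The pattern keeps only the number after the final "= ".
echo "$line" | sed -E 's/.*= ([0-9.]+)$/\1/'
```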
For different qualities use different detail settings:
- -d preview
- -d reduced
- -d medium
- -d full
- -d raw
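If you'd rather run the tool from Terminal than through the scheme editor, the same three arguments go on the command line. The paths below are placeholders, and I'm assuming the built product keeps the example project's target name (Xcode puts it under DerivedData by default); the snippet just assembles and prints the invocation so you can check it before running it:

```shell
# Placeholder paths -- substitute your own user, photo folder, and output name.
INPUT="/Users/YOURUSER/Desktop/FOLDERWITHPHOTOS"
OUTPUT="/Users/YOURUSER/Desktop/OUTPUTNAME.usdz"
DETAIL="reduced"   # or: preview, medium, full, raw

# Assemble the command line; echo it for inspection before running it.
CMD="HelloPhotogrammetry \"$INPUT\" \"$OUTPUT\" -d $DETAIL"
echo "$CMD"
```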
In my case, even with the arguments set, I am getting this error:
`Using configuration: Configuration(isObjectMaskingEnabled: true, sampleOverlap: RealityFoundation.PhotogrammetrySession.Configuration.SampleOverlap.normal, sampleOrdering: RealityFoundation.PhotogrammetrySession.Configuration.SampleOrdering.unordered, featureSensitivity: RealityFoundation.PhotogrammetrySession.Configuration.FeatureSensitivity.normal)
2021-06-10 11:33:13.182288+1000 HelloPhotogrammetry[3003:285460] Metal API Validation Enabled
Error creating session: cantCreateSession("Native session create failed: CPGReturn(rawValue: -11)")
Program ended with exit code: 1`
cantCreateSession("Native session create failed: CPGReturn(rawValue: -11)")
This error seems to mean that your hardware isn't supported.
I was able to run at all detail levels except -d full. Any ideas on why that might be the case?
Looking through the output, it does seem like it reached 100% of the processing, but right at the very end it throws an error. Could this be a RAM restriction from only having 8 GB (M1 MacBook Air)?
Progress(request = modelFile(url: file:///Users/XXXXXXX/Desktop/output.usdz, detail: RealityFoundation.PhotogrammetrySession.Request.Detail.full, geometry: nil) = 1.0
-[MTLDebugBuffer didModifyRange:]:473: failed assertion `didModifyRange: only applies when resourceOptions(0x1) & MTLResourceStorageModeMask(0xf0) == MTLResourceStorageModeManaged(0x10). NOT MTLResourceStorageModeShared'
@domainxh I ran into the same problem as you. I have only tried -d full once, but it also failed.
I created a post for it here: https://developer.apple.com/forums/thread/682259
Thank you very much for breaking this down, although I had been operating the tool over in Terminal like a brainiac.
Anyway, I'm stumped: I am getting the following error no matter which method I choose to run it with:
2021-06-14 16:45:33.028717-0500 HelloPhotogrammetry[2759:55337] [Photogrammetry] No SfM map found in native output!
HelloPhotogrammetry:
RealityFoundation.PhotogrammetrySession.Request.Detail.reduced, geometry: nil) had an error: reconstructionFailed("Reconstruction failed!")
The folder that I am referencing contains the depth (.TIF), gravity (.TXT), and image (.HEIC) files taken with CaptureSample. Is the quality of the images not good enough?
Thank you for the support
Edit: I also noticed the following:
HelloPhotogrammetry[2745:51319] [espresso] ANE Batch: Async request 1 returned error: code=5 err=Error Domain=com.apple.appleneuralengine Code=5 "processRequest:model:qos:qIndex:modelStringID:options:error:: 0xd: Program Inference overflow" UserInfo={NSLocalizedDescription=processRequest:model:qos:qIndex:modelStringID:options:error:: 0xd: Program Inference overflow}
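For what it's worth, before suspecting image quality, one quick sanity check is to count how many HEIC frames the input folder actually contains, since reconstruction generally needs a good number of overlapping shots. The folder path here is a placeholder, not a path from this thread:

```shell
# Count the HEIC images in the capture folder, case-insensitively
# (CaptureSample writes .HEIC alongside the .TIF depth and .TXT gravity files).
# FOLDER is a placeholder -- point it at your own capture output.
FOLDER="${1:-/Users/YOURUSER/Desktop/FOLDERWITHPHOTOS}"
ls "$FOLDER" | grep -ci '\.heic$'
```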
I know this thread has been here for a while, but I got this error with the beta 2 version. Is there a way to get the first Xcode 13 beta?
This worked for me for outputting the USDZ file. But has anyone been able to get the capture sample app to work for taking pictures? All I have gotten is a "Hello World" screen with no function after that.