Meet Object Capture for iOS

Discuss the WWDC23 session "Meet Object Capture for iOS".

Posts under wwdc2023-10191 tag

27 Posts

Within Apple's PhotogrammetrySession, which variables affect real-world scale?
In ARKit, I captured several color CVPixelBuffers and depth CVPixelBuffers, then ran a PhotogrammetrySession with PhotogrammetrySamples. In my service, precise real-world scale is important, so I tried to figure out what determines whether the generated model comes out at real scale. I ran experiments holding the variables constant: the same number of images (10), the same object, the same shot angles, and the same distances to the object (30 cm, 50 cm, 100 cm). Even with those variables controlled, the session sometimes produces a real-scale model and sometimes does not. Since the photogrammetry source code and its inner workings aren't available, I wonder what I am missing and how I can get real scale every time, if that is possible.
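
For context, a minimal sketch of the sample-based workflow described above (macOS, where the sequence-based initializer is available; the function name and buffer arrays are hypothetical). The assumption, per the PhotogrammetrySample documentation, is that the depthDataMap attached to each sample is what gives the solver its metric-scale hint:

    import RealityKit
    import CoreVideo
    import Foundation

    // Sketch only: feed color + depth pairs captured in ARKit to the
    // photogrammetry solver as custom samples. The depthDataMap is what
    // lets reconstruction recover metric scale; without it, scale is
    // ambiguous. Buffer arrays and names here are hypothetical.
    func startReconstruction(colorBuffers: [CVPixelBuffer],
                             depthBuffers: [CVPixelBuffer],
                             outputURL: URL) throws -> PhotogrammetrySession {
        let samples = zip(colorBuffers, depthBuffers).enumerated().map { index, pair in
            var sample = PhotogrammetrySample(id: index, image: pair.0)
            sample.depthDataMap = pair.1   // kCVPixelFormatType_DepthFloat32
            return sample
        }
        let session = try PhotogrammetrySession(input: samples)
        try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])
        // Keep the returned session alive; progress and the finished model
        // arrive asynchronously on session.outputs.
        return session
    }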
1 reply · 0 boosts · 539 views · Sep ’23
Graphic Engineer
Hey there, I recently tried out the iOS 17 photogrammetry sample app. The results are very promising compared to the iOS 16 apps, and the real-world scale retention works amazingly well. However, my use case involves keeping the camera still and rotating the object instead, which was an option in iOS 16 but was unfortunately removed in iOS 17. I wonder if there's a way to do this in the iOS 17 app!
1 reply · 0 boosts · 497 views · Sep ’23
Object Capture API on Mac with LiDAR data from iOS to get the real-life size of objects
Hi, we are looking for a solution to create real-life-size 3D models using reflex cameras. We built a Mac app called Smart Capture that uses Object Capture to reconstruct 3D models from pictures, and we used it to digitize 5,000 archaeological findings at the Archaeological Park of Pompeii. We built a robust workflow around Orbitvu automated photography boxes, with 3 reflex cameras per box to speed up capture, which gets us a 3D model in under 10 minutes (2-3 minutes to capture and about 7-8 minutes to process on an M2 Max).

The problem is that the resulting object has no size information, so we have to measure the real object and resize the 3D model accordingly, introducing a manual step and a possible source of error into the workflow. I was wondering whether it's possible, using the iOS 17 Object Capture APIs, to get point cloud data that I could add to the reflex camera pictures and process the whole package on the Mac to recover the size of the real object. As far as I understand, the only way to get this working before iOS 17 was to use depth information (I tried the Sample Capture project), but we have to work with objects from small to huge (our range is roughly 1 to 25 inches). Do you have any clue on how to achieve this?
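
For reference, the documented iOS 17 route for carrying LiDAR-derived scale data to the Mac is the shared checkpoint directory, sketched below with hypothetical paths. Note that this applies to images captured by ObjectCaptureSession itself, so it may not transfer directly to reflex-camera pictures:

    import RealityKit
    import Foundation

    // macOS side, a sketch: reuse the checkpoint directory produced by an
    // iOS 17 ObjectCaptureSession capture. All paths are hypothetical.
    let captureRoot = URL(fileURLWithPath: "/Captures/finding-0001", isDirectory: true)
    var configuration = PhotogrammetrySession.Configuration()
    configuration.checkpointDirectory = captureRoot.appendingPathComponent("Checkpoints")
    let session = try PhotogrammetrySession(
        input: captureRoot.appendingPathComponent("Images"),
        configuration: configuration)
    try session.process(requests: [
        .modelFile(url: captureRoot.appendingPathComponent("model.usdz"), detail: .full)
    ])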
1 reply · 0 boosts · 556 views · Sep ’23
Paste permission localisation bug? Can we override this permission's localisation?
When my app is about to access the clipboard, the Apple paste permission prompt appears and asks for permission, but its localisation doesn't follow the phone language when I change it. Scenario: if my phone is set to English the first time, the paste prompt appears in English, which is correct. But if I then switch the phone language to Spanish, the prompt still appears in English; only after restarting the phone does it appear in Spanish. If I switch back to English, the prompt remains in Spanish until I restart the phone again. Is there any way to override this in the plist like the other privacy permissions, or is this a known bug? This is on iOS 16.6. I will attach the screenshot. Can anyone help with this? Thank you so much.
1 reply · 0 boosts · 643 views · Aug ’23
Unable to run the sample code
Hello, after installing Xcode 15 beta and the sample project provided for Object Capture at WWDC23, I am getting the error below:

    dyld[2006]: Symbol not found: _$s19_RealityKit_SwiftUI20ObjectCaptureSessionC7Combine010ObservableE0AAMc
    Referenced from: <35FD44C0-6001-325E-9F2A-016AF906B269> /private/var/containers/Bundle/Application/776635FF-FDD4-4DE1-B710-FC5F27D70D4F/GuidedCapture.app/GuidedCapture
    Expected in: <6A96F77C-1BEB-3925-B370-266184BF844F> /System/Library/Frameworks/_RealityKit_SwiftUI.framework/_RealityKit_SwiftUI

I am trying to run the sample project on an iPhone 12 Pro (iOS 17.0 (21A5291j)). Any help in solving this issue would be appreciated. Thank you.
4 replies · 0 boosts · 925 views · Aug ’23
Does anyone actually notice any improvements using the new ObjectCaptureSession with PhotogrammetrySession?
We have implemented all the recent additions Apple made on the iOS side for guided capture using LiDAR and image data via ObjectCaptureSession. After the capture finishes, we send our images to PhotogrammetrySession on macOS to reconstruct models at higher quality (Medium) than the Preview quality currently supported on iOS.

We have now done a few side-by-side captures comparing the new ObjectCaptureSession against traditional capture via the AVFoundation framework, but have not seen any of the improvements that were claimed during Apple's WWDC session. If anything, we feel the results are actually worse, because the images obtained through the new ObjectCaptureSession aren't as high quality as the images we get from AVFoundation.

Are we missing something here? Is PhotogrammetrySession on macOS not using the new additional LiDAR data, or have the improvements been overstated? From the documentation it is not at all clear how the new LiDAR data gets stored and how it transfers. We are using iOS 17 beta 4 and macOS Sonoma beta 4 in our testing; both codebases were compiled with Xcode 15 beta 5.
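
On the storage/transfer question, one detail worth stating explicitly (a sketch under the assumption that the capture app follows Apple's guided-capture sample): the LiDAR-derived data is persisted in the session's checkpoint directory, and it only benefits the macOS reconstruction if that directory is copied across and handed to PhotogrammetrySession.Configuration.checkpointDirectory there. Directory locations below are hypothetical:

    import RealityKit
    import Foundation

    // iOS side, a sketch: the checkpoint directory is where the session
    // persists its LiDAR-derived data. Paths here are hypothetical.
    let documents = FileManager.default.urls(for: .documentDirectory,
                                             in: .userDomainMask)[0]
    let imagesURL = documents.appendingPathComponent("Images", isDirectory: true)
    let checkpointsURL = documents.appendingPathComponent("Checkpoints", isDirectory: true)
    try FileManager.default.createDirectory(at: imagesURL, withIntermediateDirectories: true)
    try FileManager.default.createDirectory(at: checkpointsURL, withIntermediateDirectories: true)

    var config = ObjectCaptureSession.Configuration()
    config.checkpointDirectory = checkpointsURL
    let session = ObjectCaptureSession()
    session.start(imagesDirectory: imagesURL, configuration: config)
    // Copy Images/ and Checkpoints/ to the Mac, then point
    // PhotogrammetrySession.Configuration.checkpointDirectory at the same
    // checkpoint data before reconstruction.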
1 reply · 0 boosts · 611 views · Jul ’23
Custom UI for ObjectCaptureView
Is it possible to customize ObjectCaptureView? I'd like the turntable that indicates whether a photo was captured with a point cloud image to have a different foreground color. That is, I want the white part under the point cloud to be some other color that I specify. Would this be possible by extending ObjectCapturePointCloudView?
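
ObjectCaptureView's built-in styling doesn't appear to be publicly themable, but a possible workaround (a speculative, untested sketch) is to compose your own review screen around ObjectCapturePointCloudView and supply the base color yourself:

    import RealityKit
    import SwiftUI

    // Speculative sketch: replace the default white base under the point
    // cloud by drawing a custom background behind ObjectCapturePointCloudView.
    struct TintedPointCloudView: View {
        let session: ObjectCaptureSession

        var body: some View {
            ZStack {
                Color.teal.opacity(0.25)   // your custom "turntable" base color
                    .ignoresSafeArea()
                ObjectCapturePointCloudView(session: session)
            }
        }
    }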
0 replies · 0 boosts · 488 views · Jul ’23
AppDataModel is retained
Hi, in the "scanning objects using Object Capture" project, when the content view is dismissed, the AppDataModel is always retained and its deinit is never called.

    @StateObject var appModel: AppDataModel = AppDataModel.instance

I am presenting the ContentView using a UIHostingController:

    let hostingController = UIHostingController(rootView: ContentView())
    hostingController.modalPresentationStyle = .fullScreen
    present(hostingController, animated: true)

I have tried manually detaching the listeners and setting the objectCaptureSession to nil. In the debug memory graph there is a coaching overlay retaining the AppDataModel. I want to remove the appModel from memory when the ContentView is dismissed. Any suggestions?
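
A speculative workaround, since a singleton held by a @StateObject can never deinit by design: scope the model to the presentation instead of using AppDataModel.instance. The class below is a stand-in for the sample's model, assuming a non-singleton initializer can be added:

    import SwiftUI

    // Stand-in for the sample app's AppDataModel, with the singleton removed
    // so SwiftUI can release it when the hosting controller is dismissed.
    final class AppDataModel: ObservableObject {
        @Published var statusMessage = "ready"
        deinit { print("AppDataModel deinit") }   // now fires on dismissal
    }

    struct ContentView: View {
        @StateObject private var appModel = AppDataModel()   // per-presentation
        var body: some View {
            Text(appModel.statusMessage)
        }
    }

Anything else that still holds the model (such as the coaching overlay observer mentioned above) would also need to drop or weakly hold its reference for deinit to fire.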
2 replies · 0 boosts · 675 views · Jul ’23
WWDC23 Object Capture sample error
When running the code from the Object Capture session at WWDC23, I'm currently getting the error:

    dyld[607]: Symbol not found: _$s21DeveloperToolsSupport15PreviewRegistryPAAE7previewAA0D0VvgZ
    Referenced from: <411AA023-A110-33EA-B026-D0103BAE08B6> /private/var/containers/Bundle/Application/9E9526BF-C163-420D-B6E0-2DC9E02B3F7E/ObjectCapture.app/ObjectCapture
    Expected in: <0BD6AC59-17BF-3B07-8C7F-6D9D25E0F3AD> /System/Library/Frameworks/DeveloperToolsSupport.framework/DeveloperToolsSupport
1 reply · 0 boosts · 524 views · Jul ’23
Crash when using example from 'Meet Object Capture for iOS' WWDC2023
Hi. Each time I try to capture an object using the example from the session https://developer.apple.com/videos/play/wwdc2023/10191, I get a crash. iPhone 14 Pro Max, iOS 17 beta 3, Xcode 15.0 beta 3 (15A5195k). Log:

    ObjectCaptureSession.: mobileSfM pose for the new camera shot is not consistent.
    <<<< PlayerRemoteXPC >>>> fpr_deferPostNotificationToNotificationQueue signalled err=-12785 (kCMBaseObjectError_Invalidated) (item invalidated) at FigPlayer_RemoteXPC.m:829
    Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
    Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
    Compiler failed with XPC_ERROR_CONNECTION_INTERRUPTED
    MTLCompiler: Compilation failed with XPC_ERROR_CONNECTION_INTERRUPTED on 3 try
    /Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSCore/Utility/MPSLibrary.mm:485: failed assertion `MPSLibrary::MPSKey_Create internal error: Unable to get MPS kernel NDArrayMatrixMultiplyNNA14_EdgeCase. Error: Compiler encountered an internal error'
    /Library/Caches/com.apple.xbs/Sources/MetalPerformanceShaders/MPSCore/Utility/MPSLibrary.mm, line 485: error ''
3 replies · 0 boosts · 814 views · Jul ’23
Object Capture with only manual capturing
Is it possible to capture only manually (automatic capture off) with the Object Capture API? And can I proceed to the capturing stage right away? Only the Object Capture API captures objects at real scale; with AVFoundation or ARKit, I've tried capturing HEVC with LiDAR and creating PhotogrammetrySamples, but neither produces a real-scale object. I think that during Object Capture the API gathers point clouds and intrinsic parameters, which help the mesh come out at real scale. Does anyone know about 'Object Capture with only manual capturing' or 'capturing with AVFoundation for a real-scale mesh'? See the sketch below for the manual-shot API.
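
As far as the public iOS 17 API shows, ObjectCaptureSession offers manual shots on top of (not instead of) automatic capture; a brief sketch, assuming session is an ObjectCaptureSession already in the capturing state:

    // Manual shutter on an ObjectCaptureSession that is already capturing.
    // A switch to disable automatic capture entirely doesn't appear to be public.
    if session.canRequestImageCapture {
        session.requestImageCapture()
    }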
2 replies · 0 boosts · 1k views · Sep ’23
Sample App crashed on iPad Pro (11-inch) (2nd generation)
When I install and run the sample app Apple released just recently, everything works fine up until I try to start the capture. The bounding box sets up without a problem, but then, every time, this error occurs:

    *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] You are not authorized to use custom shutter sounds'
    *** First throw call stack: (0x19d6e8300 0x195cd4f30 0x1b9bfdcb4 0x1cc4fbf98 0x1cc432964 0x19d6e8920 0x19d70552c 0x1cc4328f8 0x1cc4a8fac 0x19d6e8920 0x19d70552c 0x1cc4a8e44 0x23634923c 0x23637abfc 0x2362d21a4 0x2362d139c 0x236339874 0x23636dc04 0x1a67f9b74 0x1a68023ac 0x1a67fa964 0x1a67faa78 0x1a67fa5d0 0x1039c6b34 0x1039d80b4 0x1a6800188 0x1a67f94bc 0x1a67f9fd0 0x1a6800098 0x1a67f9504 0x23633777c 0x23637201c 0x2354d081c 0x2354c8658 0x1039c6b34 0x1039c9c20 0x1039e1078 0x1039dfacc 0x1039d6ebc 0x1039d6ba0 0x19d774e94 0x19d758594 0x19d75cda0 0x1df4c0224 0x19fbcd154 0x19fbccdb8 0x1a142f1a8 0x1a139df2c 0x1a1387c1c 0x102a5d944 0x102a5d9f4 0x1c030e4f8)
    libc++abi: terminating due to uncaught exception of type NSException

I have no idea why this is happening, so any help would be appreciated. My iPad is running the latest iPadOS 17 beta, and the crash also occurs when it isn't connected to Xcode.
1 reply · 0 boosts · 851 views · Jul ’23
PhotogrammetrySession and PhotogrammetrySample
The WWDC 2021 session says: 'we also offer an interface for advanced workflows to provide a sequence of custom samples. A PhotogrammetrySample includes the image plus other optional data such as a depth map, gravity vector, or custom segmentation mask.' But in the code, PhotogrammetrySession is initialized with a directory of saved data. How can I pass PhotogrammetrySamples as input to a PhotogrammetrySession?
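
For reference, a minimal sketch of the sample-based initializer (macOS only): PhotogrammetrySession accepts any Sequence of PhotogrammetrySample in place of a folder URL. makeSample(id:) is a hypothetical helper that builds a sample from your own image, depth, and gravity data:

    import RealityKit
    import Foundation

    // Sketch: stream custom samples into the session lazily instead of
    // pointing it at a directory. makeSample(id:) is a hypothetical helper
    // returning a PhotogrammetrySample built from your own captured data.
    let samples = (0..<42).lazy.map { makeSample(id: $0) }
    let session = try PhotogrammetrySession(input: samples)
    try session.process(requests: [
        .modelFile(url: URL(fileURLWithPath: "/tmp/object.usdz"), detail: .medium)
    ])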
0 replies · 0 boosts · 515 views · Jul ’23
AVFoundation with lidar and this year's RealityKit Object Capture.
With AVFoundation's builtInLiDARDepthCamera, if I save photo.fileDataRepresentation() as HEIC, the file only has EXIF and TIFF metadata. But the HEIC images produced by RealityKit's Object Capture carry not only EXIF and TIFF but also HEIC metadata including camera calibration data. What should I do so that the image exported from AVFoundation has the same metadata?
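
I don't believe AVFoundation will write Object Capture's extra HEIC calibration block for you, but you can at least request depth with each shot, read the intrinsics, and persist them alongside the image yourself; a hedged sketch (the class name and setup are hypothetical, and it assumes depth delivery was enabled on the output):

    import AVFoundation

    // Sketch: request depth with each LiDAR photo and pull the calibration
    // (intrinsics etc.) off the depth data; serialize it next to the HEIC
    // yourself, since AVFoundation won't embed it the way Object Capture does.
    final class CalibrationCapture: NSObject, AVCapturePhotoCaptureDelegate {
        func takePhoto(with output: AVCapturePhotoOutput) {
            let settings = AVCapturePhotoSettings(
                format: [AVVideoCodecKey: AVVideoCodecType.hevc])
            // Assumes output.isDepthDataDeliveryEnabled was configured earlier.
            settings.isDepthDataDeliveryEnabled = output.isDepthDataDeliveryEnabled
            settings.embedsDepthDataInPhoto = true
            output.capturePhoto(with: settings, delegate: self)
        }

        func photoOutput(_ output: AVCapturePhotoOutput,
                         didFinishProcessingPhoto photo: AVCapturePhoto,
                         error: Error?) {
            guard let calibration = photo.depthData?.cameraCalibrationData else { return }
            // e.g. focal lengths from the 3x3 intrinsic matrix:
            let fx = calibration.intrinsicMatrix.columns.0.x
            let fy = calibration.intrinsicMatrix.columns.1.y
            print("intrinsics fx=\(fx) fy=\(fy)")
            // photo.fileDataRepresentation() itself still carries only EXIF/TIFF.
        }
    }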
2 replies · 0 boosts · 1k views · Oct ’23