Object Capture


Turn photos from your iPhone or iPad into high‑quality 3D models that are optimized for AR using the new Object Capture API on macOS Monterey.

Object Capture Documentation

Posts under Object Capture tag

66 Posts
Post not yet marked as solved
6 Replies
2.5k Views
I'm really excited about the Object Capture APIs coming to iOS, and the capture UI shown in the WWDC session. I have a few unanswered questions: Where is the sample code available? Are the new Object Capture APIs on iOS limited to certain devices? Can we capture images from the front-facing camera?
Post marked as solved
6 Replies
1.5k Views
I followed the instructions in this session and tried to write a demo with the Object Capture API. But an MTLDebugRenderCommandEncoder assertion fails every time I call session.startDetecting(), just after the bounding box shows up. The error is:

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5780: failed assertion `Draw Errors Validation Fragment Function(fsRealityPbr): the offset into the buffer clippingConstants that is bound at buffer index 8 must be a multiple of 256 but was set to 128.'

and my code is pretty simple:

var body: some View {
    ZStack {
        ObjectCaptureView(session: session)
        if case .initializing = session.state {
            Button {
                session.start(imagesDirectory: getDocumentsDir().appendingPathComponent("Images/"),
                              configuration: configuration) // 💥
            } label: {
                Text("Prepare")
            }
        } else if case .ready = session.state {
            Button {
                session.startDetecting()
            } label: {
                Text("Continue")
            }
        } else if case .detecting = session.state {
            Button {
                session.startCapturing()
            } label: {
                Text("Start Capture")
            }
        }
    }
}

I'm wondering if anyone else is facing the same problem, or if there's an error on my side?
Post not yet marked as solved
1 Reply
713 Views
Hello, I am really excited about the new Object Capture API for iOS. In the WWDC23 demo video, the user moves around the object. My question is: does this API support taking photos from a fixed position (like on a tripod) with a turntable rotating the object? Another question: if the user has already taken some photos with depth and gravity data, can Object Capture use those images to construct the 3D model instead of taking new ones? Thank you.
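For the second question, here is a minimal sketch, assuming macOS and a folder of previously captured images (both paths are hypothetical placeholders), of pointing PhotogrammetrySession at existing photos rather than capturing new ones:

import Foundation
import RealityKit

// Minimal sketch: reconstruct a model from an existing image folder on macOS.
// The input and output paths are placeholders.
func reconstructModel() async throws {
    let imagesFolder = URL(fileURLWithPath: "/path/to/Images")     // placeholder
    let outputModel = URL(fileURLWithPath: "/path/to/Model.usdz")  // placeholder

    var configuration = PhotogrammetrySession.Configuration()
    configuration.sampleOrdering = .unordered  // photos need not be in capture order

    let session = try PhotogrammetrySession(input: imagesFolder,
                                            configuration: configuration)
    try session.process(requests: [.modelFile(url: outputModel, detail: .reduced)])

    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Wrote \(outputModel.path)")
        }
    }
}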
Post marked as solved
1 Reply
802 Views
Hey! I'm trying to add this line to my project:

var session = ObjectCaptureSession()

But it immediately says "Cannot find 'ObjectCaptureSession' in scope". I cannot get this error to go away, so I haven't continued trying the snippets provided in the video. Is there anything else I need to install or configure before this works? I'm importing:

import RealityKit
import SwiftUI
import Firebase

In the video you mentioned that there is sample code, but I can only find the snippets rather than a project. It's fine if there is no sample project, but since it's mentioned so much in the video, it's confusing not to have the code.

Xcode 15 Beta, iOS 17 simulator installed, macOS Ventura 13.3.1 (a)
Post marked as solved
1 Reply
492 Views
After getting around some issues we finally got "something" to run, but we have two issues now. First, getDocumentsDir() is not found in scope. I guess this function was only shown in the video but never actually implemented; looking through some "eskimo" code I found a docDir implementation, which doesn't work in this case either. Because session.start() never runs, all I have is a black screen with an animation saying "Move iPad to start".

Questions:
A) How can I force this to work? I'm hoping to capture the images and process them later on a Mac, so I don't think I need the "checkpoint" part of the configuration.
B) Since getDocumentsDir() and docDir aren't working, is there any way to just pick whatever folder we can, maybe hardcoded?

Here is my code:

import Foundation
import RealityKit
import SwiftUI
// import Firebase

struct CaptureView: View {
    let docDir = try! FileManager.default.url(for: .documentDirectory,
                                              in: .userDomainMask,
                                              appropriateFor: nil,
                                              create: true)

    var body: some View {
        ZStack {
            // Make the entire background black.
            Color.black.edgesIgnoringSafeArea(.all)

            if #available(iOS 17.0, *) {
                var session = ObjectCaptureSession()
                var configuration = ObjectCaptureSession.Configuration()
                // configuration.checkpointDirectory = getDocumentsDir().appendingPathComponent("Snapshots/")
                // session.start(imagesDirectory: docDir.appendingPathComponent("Images/"), configuration: configuration)
                ObjectCaptureView(session: session)
            } else {
                Text("Unsupported iOS 17 View")
            }
        }
        .environment(\.colorScheme, .dark)
    }
}
Post not yet marked as solved
0 Replies
442 Views
Which CPU scheduling algorithm does macOS Monterey use? First Come First Served? Shortest Job First? Priority? Round robin? Multilevel queue? Multilevel feedback queue?
Post not yet marked as solved
0 Replies
629 Views
Hello, I'm a novice iOS developer using SwiftUI. I want to let users of my app upload 3D models to their account and dynamically load one into a scene when they select it from their library of saved 3D models. Is there a way to do this? I'm aware of the pass-through method, but that requires the models to be bundled into the app build before launch. Can someone help or point me in the right direction? Thank you!
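As a point of reference, here is a minimal sketch, assuming the model has already been downloaded to a local file URL (localURL is a placeholder for wherever the download lands). It loads a USDZ at runtime with Entity.load(contentsOf:), so the asset does not have to ship inside the app bundle, and anchors it in an ARView scene:

import Foundation
import RealityKit

// Minimal sketch: load a runtime-downloaded USDZ and place it in the scene.
func place(modelAt localURL: URL, in arView: ARView) throws {
    // Loads the model from disk rather than from the app bundle.
    let entity = try Entity.load(contentsOf: localURL)

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(entity)
    arView.scene.addAnchor(anchor)
}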
Post not yet marked as solved
1 Reply
833 Views
With the code below, I saved color and depth images from a RealityKit ARView and ran Photogrammetry on an iOS device. The mesh looks fine, but its scale is quite different from the real-world scale.

let color = arView.session.currentFrame!.capturedImage
let depth = arView.session.currentFrame!.sceneDepth!.depthMap

// 😀 Color
let colorCIImage = CIImage(cvPixelBuffer: color)
let colorUIImage = UIImage(ciImage: colorCIImage)
let depthCIImage = CIImage(cvPixelBuffer: depth)
let heicData = colorUIImage.heicData()!
let fileURL = imageDirectory!.appendingPathComponent("\(scanCount).heic")
do {
    try heicData.write(to: fileURL)
    print("Successfully wrote image to \(fileURL)")
} catch {
    print("Failed to write image to \(fileURL): \(error)")
}

// 😀 Depth
let context = CIContext()
let colorSpace = CGColorSpace(name: CGColorSpace.linearGray)!
let depthData = context.tiffRepresentation(of: depthCIImage,
                                           format: .Lf,
                                           colorSpace: colorSpace,
                                           options: [.disparityImage: depthCIImage])
let depth_dir = imageDirectory!.appendingPathComponent("IMG_\(scanCount)_depth.TIF")
try! depthData!.write(to: depth_dir, options: [.atomic])
print("depth saved")

And I also tried this:

let colorSpace = CGColorSpace(name: CGColorSpace.linearGray)
let depthCIImage = CIImage(cvImageBuffer: depth, options: [.auxiliaryDepth: true])
let context = CIContext()
let linearColorSpace = CGColorSpace(name: CGColorSpace.linearSRGB)
guard let heicData = context.heifRepresentation(of: colorCIImage,
                                                format: .RGBA16,
                                                colorSpace: linearColorSpace!,
                                                options: [.depthImage: depthCIImage]) else {
    print("Failed to convert combined image into HEIC format")
    return
}

Does anyone know why this happens and how to fix it?
Post not yet marked as solved
3 Replies
980 Views
Now that we have the Vision Pro, I really want to start using Apple's Object Capture API to turn real objects into 3D assets. I watched the latest Object Capture video from WWDC23 and noticed they were using a "sample app". Does Apple provide this sample app to visionOS developers, or do we have to build our own iOS app? Thanks and cheers!
Post not yet marked as solved
2 Replies
545 Views
Hi there, just wondering when the sample project will be available. I am having trouble getting anything good out of the snippets and want to see the workings of the full project. Where and when can we get it?
Post not yet marked as solved
2 Replies
974 Views
With AVFoundation's builtInLiDARDepthCamera, if I save photo.fileDataRepresentation() to HEIC, it only has EXIF and TIFF metadata. But the HEIC images produced by RealityKit's Object Capture have not only EXIF and TIFF but also HEIC metadata, including camera calibration data. What should I do so that AVFoundation's exported image has the same metadata?
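For context, here is a minimal sketch of the AVFoundation switches that exist for delivering depth (and its calibration data) with a photo. Whether this reproduces the exact HEIC metadata that Object Capture writes is an assumption; depth delivery must also be enabled on the AVCapturePhotoOutput before the session starts.

import AVFoundation

// Minimal sketch: request depth delivery, embed depth in the HEIC, and read
// the calibration data attached to the delivered depth.
final class DepthPhotoCapture: NSObject, AVCapturePhotoCaptureDelegate {
    func makeSettings(for output: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
        let settings = AVCapturePhotoSettings(format: [AVVideoCodecKey: AVVideoCodecType.hevc])
        if output.isDepthDataDeliverySupported {
            settings.isDepthDataDeliveryEnabled = true
            settings.embedsDepthDataInPhoto = true   // write depth into the HEIC container
        }
        return settings
    }

    func photoOutput(_ output: AVCapturePhotoOutput,
                     didFinishProcessingPhoto photo: AVCapturePhoto,
                     error: Error?) {
        // Intrinsics and lens distortion for the depth map, when delivered.
        let calibration = photo.depthData?.cameraCalibrationData
        print("Calibration data present:", calibration != nil)
    }
}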
Post not yet marked as solved
0 Replies
478 Views
The WWDC 2021 session says 'we also offer an interface for advanced workflows to provide a sequence of custom samples. A PhotogrammetrySample includes the image plus other optional data such as a depth map, gravity vector, or custom segmentation mask.' But in the code, PhotogrammetrySession is initialized with the directory the data was saved to. How can I pass PhotogrammetrySamples as input to a PhotogrammetrySession?
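A minimal sketch, assuming macOS: besides init(input: URL, ...), PhotogrammetrySession has an initializer that takes a Sequence of PhotogrammetrySample, which is where custom depth maps, gravity vectors, or segmentation masks are supplied. The samples array below stands in for however your own pipeline builds them.

import RealityKit

// Minimal sketch: feed PhotogrammetrySamples (instead of a directory URL)
// into PhotogrammetrySession and request a USDZ model.
func reconstruct(from samples: [PhotogrammetrySample], to outputURL: URL) async throws {
    // Building one sample (pixel buffers come from your own capture pipeline):
    // var sample = PhotogrammetrySample(id: 0, image: colorPixelBuffer)
    // sample.depthDataMap = depthPixelBuffer
    // sample.gravity = gravityVector

    let session = try PhotogrammetrySession(input: samples,
                                            configuration: PhotogrammetrySession.Configuration())
    try session.process(requests: [.modelFile(url: outputURL, detail: .medium)])

    for try await output in session.outputs {
        if case .processingComplete = output {
            print("Model written to \(outputURL.path)")
        }
    }
}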
Post not yet marked as solved
1 Reply
809 Views
When I install and run the sample app Apple released just recently, everything works fine up until I try to start the capture. The bounding box sets up without a problem, but then this error occurs every time:

*** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '*** -[AVCapturePhotoOutput capturePhotoWithSettings:delegate:] You are not authorized to use custom shutter sounds'
*** First throw call stack:
(0x19d6e8300 0x195cd4f30 0x1b9bfdcb4 0x1cc4fbf98 0x1cc432964 0x19d6e8920 0x19d70552c 0x1cc4328f8 0x1cc4a8fac 0x19d6e8920 0x19d70552c 0x1cc4a8e44 0x23634923c 0x23637abfc 0x2362d21a4 0x2362d139c 0x236339874 0x23636dc04 0x1a67f9b74 0x1a68023ac 0x1a67fa964 0x1a67faa78 0x1a67fa5d0 0x1039c6b34 0x1039d80b4 0x1a6800188 0x1a67f94bc 0x1a67f9fd0 0x1a6800098 0x1a67f9504 0x23633777c 0x23637201c 0x2354d081c 0x2354c8658 0x1039c6b34 0x1039c9c20 0x1039e1078 0x1039dfacc 0x1039d6ebc 0x1039d6ba0 0x19d774e94 0x19d758594 0x19d75cda0 0x1df4c0224 0x19fbcd154 0x19fbccdb8 0x1a142f1a8 0x1a139df2c 0x1a1387c1c 0x102a5d944 0x102a5d9f4 0x1c030e4f8)
libc++abi: terminating due to uncaught exception of type NSException

I have no idea why this is happening, so any help would be appreciated. My iPad is running the latest iPadOS 17 beta, and the crash also occurs when it isn't connected to Xcode...
Post not yet marked as solved
1 Reply
572 Views
@Apple Really excited by the updates on the Reality Composer and Object Capture side. Is it possible to publish the sample apps and project material referenced in the Reality Composer Pro sessions as well as the "Meet Object Capture for iOS" session? It looks like they haven't yet been posted to the Sample Code site. Thank you! https://developer.apple.com/videos/play/wwdc2023/10273/ https://developer.apple.com/videos/play/wwdc2023/10191/