RoomPlan


Create parametric 3D scans of rooms and room-defining objects.

Posts under RoomPlan tag

69 Posts

RoomCaptureSession persistence, ARSession pause broken?
Hi all,

Our app allows a user to scan a room, save that scan on a separate view, and then perform additional scans. We're looking into combining rooms via CapturedStructure, so we need rooms to be scanned in the same ARWorldMap without necessarily needing to re-localize in the same session. This should fit within the first scenario that Apple described.

The only approach I have found that meets our requirements is to save the RoomCaptureView and re-use that RoomCaptureView whenever we need to start a session again. This creates a number of other issues, and ideally we wouldn't need to keep a view alive in something like a singleton. We are using captureSession.stop(pauseARSession: false).

Additionally, if we re-use the same RoomCaptureView and an error occurs during the scanning process, we can't get the instructions overlay to appear again (specifically, the instructions in the middle of the view that say "Move device to start"). It's as if the instructions are completely removed and scanning is stuck in an error state once an error occurs. These instructions also seem to be separate from the instructions we can grab from the captureSession(_:didProvide:) delegate callback (RoomCaptureSession.Instruction), so we can't use that either. There are a couple of subviews that seem relevant here, RoomCaptureCoachingOverlayView and ARGlyphView, but neither is public, so we can't force them to appear. We've also attempted a number of other things to get these subviews to show, such as layoutIfNeeded().

Saving the ARSession and using it in let roomCaptureView = RoomCaptureView(frame: viewBounds, arSession: arSession), so that we create a new view with the same ARSession, seems much more appealing since it solves the issues above. But then we run into another problem: world tracking seems to be completely lost when a new RoomCaptureView (and thus a new RoomCaptureSession) is started, even with the same already-running ARSession, almost as if captureSession.stop(pauseARSession: false) doesn't work as described.

Is there any way around needing to use the same RoomCaptureView or RoomCaptureSession for subsequent scans in the same session, without needing to re-localize via ARWorldMap loading? Is there a way to force the guiding instructions to appear?
0 replies · 2 boosts · 50 views · 12h
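For reference, a minimal sketch of the session reuse described above: one ARSession kept alive in a controller object and handed to a fresh RoomCaptureView per scan, with stop(pauseARSession: false) so ARKit keeps running between scans. MultiRoomScanner and its method names are illustrative, not RoomPlan API, and whether world tracking actually survives the handoff to the new RoomCaptureSession is exactly the open question in the post.

import ARKit
import RoomPlan
import UIKit

final class MultiRoomScanner {
    // One ARSession shared across scans so rooms stay in one world space.
    let arSession = ARSession()
    private(set) var roomCaptureView: RoomCaptureView?
    private let configuration = RoomCaptureSession.Configuration()

    func beginScan(in container: UIView) {
        // Create a fresh view per scan, but hand it the shared ARSession.
        let view = RoomCaptureView(frame: container.bounds, arSession: arSession)
        container.addSubview(view)
        view.captureSession.run(configuration: configuration)
        roomCaptureView = view
    }

    func endScan() {
        // Keep the underlying ARSession (and its world map) running.
        roomCaptureView?.captureSession.stop(pauseARSession: false)
        roomCaptureView?.removeFromSuperview()
        roomCaptureView = nil
    }
}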
Error: captureSession(_:didEndWith:error:) nearly matches defaulted requirement in RoomCaptureSessionDelegate
Hello,

I'm working with the RoomPlan API to capture and export 3D room models in an iOS app. My goal is to implement the RoomCaptureSessionDelegate protocol to handle the end of a room capture session. However, I'm encountering a warning in Xcode:

Warning: "captureSession(_:didEndWith:error:) nearly matches defaulted requirement captureSession(_:didEndWith:error:) of protocol RoomCaptureSessionDelegate."

I've verified that my delegate method signature matches the protocol, but the warning persists. I suspect this might be due to minor discrepancies in the parameter types or naming conventions required by the protocol. Below is my full RoomScanner.swift file:

import RoomPlan
import SwiftUI
import UIKit

func exportModelToFiles() {
    let exportURL = FileManager.default.temporaryDirectory.appendingPathComponent("RoomModel.usdz")
    let documentPicker = UIDocumentPickerViewController(forExporting: [exportURL])
    documentPicker.modalPresentationStyle = .formSheet
    if let windowScene = UIApplication.shared.connectedScenes.first as? UIWindowScene,
       let rootViewController = windowScene.windows.first?.rootViewController {
        rootViewController.present(documentPicker, animated: true, completion: nil)
    }
}

class RoomScanner: ObservableObject {
    var roomCaptureSession: RoomCaptureSession?
    private let sessionConfig = RoomCaptureSession.Configuration()
    private var capturedRoom: CapturedRoom?

    func startSession() {
        roomCaptureSession = RoomCaptureSession()
        roomCaptureSession?.delegate = self
        roomCaptureSession?.run(configuration: sessionConfig)
    }

    func stopSession() {
        roomCaptureSession?.stop()
        guard let room = capturedRoom else {
            print("No room data available for export.")
            return
        }
        let exportURL = FileManager.default.temporaryDirectory.appendingPathComponent("RoomModel.usdz")
        do {
            try room.export(to: exportURL)
            print("3D floor plan exported to \(exportURL)")
        } catch {
            print("Error exporting room model: \(error)")
        }
    }
}

// MARK: - RoomCaptureSessionDelegate
extension RoomScanner: RoomCaptureSessionDelegate {
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        // Handle real-time updates if necessary
    }

    func captureSession(_ session: RoomCaptureSession, didEndWith capturedRoom: CapturedRoom, error: Error?) {
        if let error = error {
            print("Capture session ended with error: \(error.localizedDescription)")
        } else {
            self.capturedRoom = capturedRoom
        }
    }
}
1 reply · 0 boosts · 187 views · 1w
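The "nearly matches defaulted requirement" warning usually points to a parameter type mismatch: the delegate's captureSession(_:didEndWith:error:) hands back CapturedRoomData, not CapturedRoom. A hedged sketch of the corrected extension, reusing the RoomScanner class and capturedRoom property from the post (same file assumed) and building the final CapturedRoom with RoomBuilder:

import RoomPlan

extension RoomScanner: RoomCaptureSessionDelegate {
    // The protocol's second parameter is CapturedRoomData, not CapturedRoom.
    func captureSession(_ session: RoomCaptureSession,
                        didEndWith data: CapturedRoomData,
                        error: Error?) {
        if let error = error {
            print("Capture session ended with error: \(error.localizedDescription)")
            return
        }
        // Turn the raw capture data into a CapturedRoom for export.
        Task {
            do {
                let builder = RoomBuilder(options: [.beautifyObjects])
                self.capturedRoom = try await builder.capturedRoom(from: data)
            } catch {
                print("Room building failed: \(error)")
            }
        }
    }
}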
Map captured rooms from before merge, to new merged captured rooms
Hello everyone,

I'm developing an app using RoomPlan where users can capture several rooms individually and assign custom names to each room. When I merge these captured rooms into a single CapturedStructure using the StructureBuilder, I want the merged structure to retain the original sections with their custom labels. However, I'm encountering an issue.

Problem: After merging, the CapturedStructure's sections have different identifiers (UUIDs) compared to the original sections in the individual CapturedRooms. This means there's no straightforward way to map the custom labels from the original rooms to the sections in the merged structure.

What I've tried:
- Mapping by identifiers: attempted to map custom labels using the sections' identifiers, but the identifiers change during the merge, making this ineffective.
- Mapping by positions: tried matching sections based on their center positions (simd_float3), but these change as well.
- Modifying section labels: considered changing the label property of the sections before merging, but the label is read-only and cannot be modified directly, and the coreModel is used for merging rather than the data that can be modified via JSON.

Question: Is there a recommended way to preserve or map custom section labels when merging multiple CapturedRooms into a single CapturedStructure using RoomPlan? How can I ensure that the custom labels assigned to rooms before the merge are correctly associated with the corresponding sections after the merge?

Any guidance or suggestions would be greatly appreciated! Thank you!
0 replies · 0 boosts · 289 views · Oct ’24
What does setWorldOrigin() do?
I stumbled across the function setWorldOrigin(relativeTransform:) on ARSession, which is documented here: https://developer.apple.com/documentation/arkit/arsession/2942278-setworldorigin

I made a custom ARSession where I override this function and print and modify the relativeTransform parameter. The print shows that this function is called with an updated relativeTransform value, but it seems to have no impact, e.g. on the world origin when starting or continuing a scan, on the tiny puppet house in RoomPlan, or on any tracking position that I get from ARKit.

Does anybody have experience with this method, or know which parts are influenced by setWorldOrigin()?
0 replies · 0 boosts · 225 views · Sep ’24
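For anyone experimenting along the same lines, a minimal sketch of how setWorldOrigin(relativeTransform:) is normally driven from the outside: shift the origin by a known translation and compare the camera transform before and after. How RoomPlan's internally managed session reacts to this is exactly the open question; shiftWorldOrigin is an illustrative helper, not API.

import ARKit
import Foundation
import simd

func shiftWorldOrigin(of session: ARSession) {
    let before = session.currentFrame?.camera.transform

    var translation = matrix_identity_float4x4
    translation.columns.3.x = 1.0   // move the world origin +1 m along X
    session.setWorldOrigin(relativeTransform: translation)

    // Camera transforms on subsequent frames are expressed relative to the new origin.
    DispatchQueue.main.asyncAfter(deadline: .now() + 0.5) {
        let after = session.currentFrame?.camera.transform
        print("before: \(String(describing: before))")
        print("after:  \(String(describing: after))")
    }
}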
converting .usdz to .obj results in flattened model
I'm using the Apple RoomPlan SDK to generate a .usdz file, which works fine and gives me a 3D scan of my room. But when I try to use Model I/O's MDLAsset to convert that output into an .obj file, it comes out as a completely flat model shaped like a rectangle.

Here is my Swift code:

let destinationURL = destinationFolderURL.appending(path: "Room.usdz")
do {
    try FileManager.default.createDirectory(at: destinationFolderURL, withIntermediateDirectories: true)
    try finalResults?.export(to: destinationURL, exportOptions: .model)
    let newUsdz = destinationURL
    let asset = MDLAsset(url: newUsdz)
    let obj = destinationFolderURL.appending(path: "Room.obj")
    try asset.export(to: obj)
}

Not sure what's wrong here. According to the MDLAsset documentation, .obj is a supported format, and exporting from .usdz to other formats like .stl and .ply works fine and retains the original 3D shape.

Some things I've tried:
- changing exportOptions to parametric, mesh, or model
- simply changing the file extension of destinationURL (throws an error)
0 replies · 0 boosts · 415 views · Sep ’24
Recording video and using RoomPlan at the same time
Hi,

In the 2023 WWDC video on RoomPlan, they mention that it should be possible to integrate photo/video with RoomPlan: https://developer.apple.com/videos/play/wwdc2023/10192/ (at ~2:30). However, when I attempt to use AVFoundation and AVCaptureSession alongside RoomPlan, I get the simple error "Cannot Record". So I'm not sure whether there is something wrong with my setup/code, or whether these two libraries are actually incompatible.

Are there any guides for doing things like this? Am I going in the right direction, or should I try a different approach? Happy to share code if necessary. Thanks
2 replies · 0 boosts · 385 views · Sep ’24
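A sketch of the direction the WWDC23 session seems to point at: rather than running a separate AVCaptureSession (which competes with ARKit for the camera and typically fails with "Cannot Record"), pull stills from the same ARSession that RoomPlan is driving via the iOS 17 custom-ARSession initializer. ScanPhotoGrabber and snapPhoto() are illustrative names, not RoomPlan API.

import ARKit
import RoomPlan
import CoreImage
import UIKit

final class ScanPhotoGrabber {
    let arSession = ARSession()
    lazy var captureSession = RoomCaptureSession(arSession: arSession)

    func start() {
        captureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    // Grab the most recent camera frame as a UIImage.
    func snapPhoto() -> UIImage? {
        guard let pixelBuffer = arSession.currentFrame?.capturedImage else { return nil }
        let ciImage = CIImage(cvPixelBuffer: pixelBuffer)
        guard let cgImage = CIContext().createCGImage(ciImage, from: ciImage.extent) else { return nil }
        return UIImage(cgImage: cgImage)
    }
}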
Multiple floor levels in one story
In lots of houses there are different levels that are still on the same floor. What I mean is that there are things like a few steps at the entrance that would basically still count as the same story. RoomPlan already does a nice job of recognizing them during scanning, but after the StructureBuilder or the optimization step the result is not really satisfying. Has anyone managed to handle those cases? Or do you have to scan in a specific way to capture such small height differences within a level?
0 replies · 0 boosts · 248 views · Sep ’24
RoomCaptureSession custom ARSession missing SceneDepth
Hello,

We are exploring the iOS 17 RoomPlan updates that allow a custom ARSession to be passed into the RoomCaptureSession via the new initializer:

let roomCaptureSession = RoomCaptureSession(arSession: myARSession)

Currently we use our ARSession to extract sceneDepth from the ARFrames via the delegate callback. This works prior to activation of the RoomCaptureSession via session.run(configuration:). However, once we do call run on the RoomCaptureSession, sceneDepth is no longer present on the incoming ARFrames.

Are these mutually exclusive? Should we expect ARFrame depth data to be present when a RoomCaptureSession is running with the shared ARSession?
1 reply · 0 boosts · 360 views · Sep ’24
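A small probe for the behavior described above, under the assumption that the ARSession delegate callbacks keep firing after RoomPlan takes over the shared session (which may itself not hold, since RoomCaptureSession could replace the delegate): log whether sceneDepth is still present once the RoomCaptureSession runs. DepthProbe is an illustrative name.

import ARKit
import RoomPlan

final class DepthProbe: NSObject, ARSessionDelegate {
    let arSession = ARSession()
    lazy var roomCaptureSession = RoomCaptureSession(arSession: arSession)

    func startProbing() {
        arSession.delegate = self

        // Run ARKit with explicit depth semantics first...
        let config = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsFrameSemantics(.sceneDepth) {
            config.frameSemantics.insert(.sceneDepth)
        }
        arSession.run(config)

        // ...then hand the session to RoomPlan. RoomCaptureSession reconfigures the
        // session with its own configuration, which is presumably where depth is dropped.
        roomCaptureSession.run(configuration: RoomCaptureSession.Configuration())
    }

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        print("sceneDepth present: \(frame.sceneDepth != nil)")
    }
}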
Modify CapturedRoom Objects
The RoomPlan API makes it possible to serialize and de-serialize CapturedRoom objects. This opens up the possibility of modifying a CapturedRoom (e.g. deleting surfaces/objects) in its de-serialized state and serializing it as a new CapturedRoom. All modified attributes are loaded accordingly, so far so good.

My problem starts with the StructureBuilder and its merge function capturedStructure(from:). This function ignores any modifications to the attributes of a CapturedRoom. The only data that is considered is encoded in the coreModel attribute (which is not mentioned in the official documentation).

If someone has more information or a working solution for modifying CapturedRooms, please let me know. Additionally, if there is any documentation on the coreModel attribute, please post a link here.
0 replies · 0 boosts · 354 views · Sep ’24
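For context, a minimal sketch of the serialize/edit/de-serialize round trip the post relies on: CapturedRoom conforms to Codable, so it can be written out as JSON, edited as data, and decoded back. As the post notes, StructureBuilder appears to consume only the undocumented coreModel payload, so edits made this way may not survive capturedStructure(from:). roundTrip is an illustrative helper.

import Foundation
import RoomPlan

func roundTrip(_ room: CapturedRoom, via url: URL) throws -> CapturedRoom {
    // Serialize the room so it can be edited as plain JSON data.
    let encoded = try JSONEncoder().encode(room)
    try encoded.write(to: url)

    // ...edit the JSON on disk here (e.g. drop unwanted objects or surfaces)...

    // De-serialize the edited data back into a CapturedRoom.
    let edited = try Data(contentsOf: url)
    return try JSONDecoder().decode(CapturedRoom.self, from: edited)
}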
Issue with losing alignment on multi-room session
Hello! We're having an issue in our app that implements multi-room scanning via RoomPlan: the ARSession world origin is shifted to wherever the RoomCaptureSession is run again (e.g. in the next room).

To clarify a few points:
- We are using RoomCaptureView, starting a new room with roomCaptureView.captureSession.run(configuration: captureSessionConfig) and stopping the room scan with roomCaptureView.captureSession.stop(pauseARSession: false).
- We are re-using the same ARSession, which is passed into the RoomCaptureView like so:

arSession = ARSession()
roomCaptureView = RoomCaptureView(frame: .zero, arSession: arSession)

Any clue why the AR world origin is reset? I need it to be consistent for storing frame camera positions.

Thanks!
0 replies · 1 boost · 225 views · Sep ’24
RoomPlan: textures for walls and floors.
After scanning the room, I use the .export method passing a ModelProvider. Then I import the USDZ into an SCNView and continue processing the scene. I would like to apply a texture to the walls and floor, but I can't, because those meshes don't contain texture coordinates, and creating them from the USDZ file is not easy. I tried to build the walls and floor myself from the CapturedRoom data, adding the texture coordinates. I'm managing, but I'm struggling a lot.

Isn't there an easier way to do this? Is a ModelProvider for surfaces planned for the future? If so, where can I access the RoomPlan beta documentation?
1 reply · 1 boost · 378 views · Sep ’24
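One workaround sketch (an assumption, not a documented path): instead of trying to texture the exported USDZ meshes, rebuild each wall as a SceneKit primitive sized and placed from the public CapturedRoom surface data. SCNBox comes with texture coordinates, so a diffuse image maps without any UV work. The wall thickness and tiling scale below are arbitrary choices.

import SceneKit
import RoomPlan
import UIKit

func addTexturedWalls(from room: CapturedRoom, to scene: SCNScene, texture: UIImage) {
    for wall in room.walls {
        let dims = wall.dimensions
        let box = SCNBox(width: CGFloat(dims.x),
                         height: CGFloat(dims.y),
                         length: 0.1,              // assumed wall thickness
                         chamferRadius: 0)

        let material = SCNMaterial()
        material.diffuse.contents = texture
        material.diffuse.wrapS = .repeat
        material.diffuse.wrapT = .repeat
        // Tile the texture roughly once per metre (illustrative choice).
        material.diffuse.contentsTransform = SCNMatrix4MakeScale(dims.x, dims.y, 1)
        box.materials = [material]

        let node = SCNNode(geometry: box)
        node.simdTransform = wall.transform       // place it where RoomPlan found the wall
        scene.rootNode.addChildNode(node)
    }
}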
Add basic story ceilings to RoomPlan model
The structure builder provides walls and floors for each captured story, but no ceiling. For my use case it is necessary that the scanned geometry is closed, to make it possible to place objects on the ceiling, for example, so it is important that there is an estimated ceiling for the different rooms within a story.

Is there any info on whether Apple has something like this on the roadmap? I think it could open up opportunities, especially for industrial applications of the API. If somebody has more insight on this topic, please share :)
1 reply · 0 boosts · 462 views · Aug ’24
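Until something like this ships, one rough starting point (purely an approximation built on public CapturedRoom data, not a RoomPlan feature) is to estimate a per-room ceiling elevation from the floor's world height plus the tallest wall, and place your own ceiling geometry at that height in whatever renderer the model ends up in. estimatedCeilingY is an illustrative helper.

import RoomPlan
import simd

func estimatedCeilingY(for room: CapturedRoom) -> Float? {
    // World-space height of the (lowest) floor, taken from the translation column.
    guard let floorY = room.floors.map({ $0.transform.columns.3.y }).min(),
          let tallestWall = room.walls.map({ $0.dimensions.y }).max() else {
        return nil
    }
    return floorY + tallestWall
}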
RoomPlan AR Frame Position wrong after combining Rooms
Hello everyone,

I am struggling to find a solution for the following problem, and I would be glad and thankful if anyone can help me.

My use case: I am using RoomPlan to scan a room. While scanning, there is a function to take pictures. The positions from which the pictures are taken are saved (in my app they are called "points of interest" = POIs). This works fine for a single room, but when adding a new room and combining the two of them using:

structureBuilder.capturedStructure(from: capturedRooms)

the first room is transformed and thus moved around to fit into the world space. The POIs are not transformed with the rest of the room, since they are not part of the room's structure, which is fine, but how can I transform the POIs too, so that they end up in the correct positions where they were taken?

I used:

func captureSession(_ session: RoomCaptureSession, didEndWith data: CapturedRoomData, error: (Error)?)

to get the transform matrix from "arFrameReferenceOriginTransform" and apply it to the POIs, but it still seems that this is not enough. I would be happy for any tips and help! Thanks in advance!

My update function:

func updatePOIPositions(with originTransform: simd_float4x4) {
    for i in 0..<(poisOldRooms.count) {
        var poi = poisOldRooms[i]
        let originalPosition = SIMD4<Float>(
            poi.data.cameraOriginX,
            poi.data.cameraOriginY,
            poi.data.cameraOriginZ,
            1.0
        )
        let updatedTransform = originTransform * originalPosition
        poisOldRooms[i].data.cameraX = updatedTransform.x
        poisOldRooms[i].data.cameraY = updatedTransform.y
        poisOldRooms[i].data.cameraZ = updatedTransform.z
    }
}
3 replies · 2 boosts · 515 views · Sep ’24
Roomplan + Object Capture
We have an issue with Apple RoomPlan: on a regular basis, the captured objects are not positioned correctly in the model, which happens in about 50% of our cases. That makes the feature almost useless. Is there any idea how to solve that problem?
0 replies · 0 boosts · 366 views · Jul ’24
Floor stability with physics simulations
In RealityKit on visionOS, I scan the room and use the resulting mesh to create occlusion and physical boundaries. That works well, and I can place cubes (with physics on) onto it too. However, I also want to update the mesh with versions from new scans, and that makes all my cubes jump.

Is there a way to prevent this? I get that the inaccuracies will produce a slightly different mesh, and I don't want to anchor the objects, so my guess is that I need to somehow determine a fixed floor height and alter the scanned meshes so they adhere to that fixed height.

Any thoughts or ideas appreciated
/Andreas
1 reply · 0 boosts · 529 views · Jul ’24
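One way this is sometimes handled (a sketch under the assumption that a fixed floor height is acceptable): let the regenerated scan mesh handle occlusion only, and give the physics cubes a separate, invisible static floor entity at a fixed height, so re-scans cannot move what they rest on. makeStaticFloor and floorY are illustrative names, and the floor height would come from your own estimate.

import RealityKit

func makeStaticFloor(at floorY: Float, size: Float = 10) -> ModelEntity {
    let floor = ModelEntity(
        mesh: .generateBox(width: size, height: 0.01, depth: size),
        materials: [OcclusionMaterial()]            // invisible, but still a physics body
    )
    floor.position = [0, floorY, 0]
    floor.generateCollisionShapes(recursive: false)
    floor.physicsBody = PhysicsBodyComponent(massProperties: .default,
                                             material: nil,
                                             mode: .static)   // never moves when the mesh updates
    return floor
}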