RoomPlan


Create parametric 3D scans of rooms and room-defining objects.

RoomPlan Documentation

Posts under RoomPlan tag

91 Posts
Post not yet marked as solved
0 Replies
34 Views
Hello, I am working on an AR application to visualize a life-size room. I am using Unity 2023.3, the Apple ARKit XR Plugin 6.0.0-pre.8, and a 2021 5th-generation iPad. First I scan a room with RoomPlan to get a USDZ file. I open it in Blender to make sure I have the right data (I do) and export it to FBX for use in Unity. Then I import the FBX into Unity and use it as a prefab, instantiated when I tap on a detected floor. I build my application in Unity, then in Xcode, to run it on my iPad. But when the room is displayed, it is far too small. I tried adding a slider to scale up the room's GameObject, and I added a plugin to visualize my Unity scene inside the built application. The room scales up in the Unity scene but not in the application. Has anyone had this issue, and if so, how did you fix it? Best regards, Angel Garcia
Posted Last updated
.
0 Replies
54 Views
In larger scenes, I need to record motion trajectories. RoomCaptureSession always starts from (0,0,0), so I use the last tracked point as an offset to connect multiple trajectory segments, much like StructureBuilder merging models. But when StructureBuilder merges, it eliminates some of the models, which makes my saved trajectory points lose accuracy, and I cannot tell how much of the scene was removed between them. Is there any way you can help me?
0 Replies
62 Views
I am getting the following exception when encoding a scan:

invalidValue(-nan, Swift.EncodingError.Context(codingPath: [CapturedVolumeCodingKeys(stringValue: "rooms", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0), CapturedVolumeCodingKeys(stringValue: "openings", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0), CodingKeys(stringValue: "dimensions", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "Unable to encode Float.nan directly in JSON.", underlyingError: nil))

Why does this exception occur during encoding? All of the scan data is a CapturedRoom and has not been modified.
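One way around the Float.nan encoding failure described above is to scrub non-finite values before encoding, or to tell JSONEncoder how to represent non-conforming floats. A minimal Foundation-only sketch; the Opening struct here is a stand-in for illustration, not the RoomPlan type:

```swift
import Foundation

// Stand-in for a captured surface; not the RoomPlan type.
struct Opening: Codable {
    var dimensions: [Float]
}

let opening = Opening(dimensions: [1.5, Float.nan, 0.1])

// Option 1: replace non-finite values with a sentinel before encoding.
let cleaned = Opening(dimensions: opening.dimensions.map { $0.isFinite ? $0 : 0 })
let cleanedJSON = try! JSONEncoder().encode(cleaned)

// Option 2: let JSONEncoder emit strings for NaN and infinity.
let encoder = JSONEncoder()
encoder.nonConformingFloatEncodingStrategy =
    .convertToString(positiveInfinity: "inf", negativeInfinity: "-inf", nan: "nan")
let tolerantJSON = try! encoder.encode(opening)

print(String(data: cleanedJSON, encoding: .utf8)!)  // NaN replaced by 0
print(String(data: tolerantJSON, encoding: .utf8)!) // NaN encoded as the string "nan"
```

Scrubbing is safer when downstream consumers expect numbers; a NaN dimension on an opening may also indicate a scan artifact worth filtering out entirely.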
0 Replies
70 Views
Hi team, is there a way to extract a colorized scan using the RoomPlan SDK? If yes, can you point me to the right reference link? Also, does the RoomPlan SDK provide the dimensions of the room?
Posted by domono.
1 Reply
101 Views
If my app utilizes the RoomPlan API to create a parametric representation of the room, would it open on iPhones that don't have LiDAR? I'm aware that the iPhone models equipped with LiDAR are the iPhone 12 Pro & Pro Max, iPhone 13 Pro & Pro Max, iPhone 14 Pro & Pro Max, and iPhone 15 Pro & Pro Max.
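On the device-support question above: scanning with RoomPlan requires a LiDAR-equipped device, but the framework exposes a runtime check so the rest of an app can still run elsewhere. A sketch of gating the feature (iOS-only platform code, so shown without a runnable test):

```swift
import RoomPlan

// Gate the scanning feature at runtime instead of requiring LiDAR app-wide.
// Only RoomCaptureSession needs LiDAR; viewing an already-captured
// parametric model (e.g. a USDZ produced on another device) does not.
func canOfferScanning() -> Bool {
    RoomCaptureSession.isSupported
}
```

So the app itself can open on non-LiDAR iPhones; it just has to hide or disable the capture flow there.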
0 Replies
164 Views
I am trying to determine the corners of a RoomPlan-detected wall using the information available in the ARView session's frame, but can't quite figure out what I'm doing wrong. The corners appear to be correct relative to each other, but the wall appears too large when I render it. (I'm also not sure I'm handling the image rotation correctly, which may be compounding my problem.) Here is the code I currently have, along with a sample image, and the resulting image when I pass it through the perspective filter. It is close but isn't cropping the walls and floors correctly.

func captureSession(_ session: RoomCaptureSession, didChange room: CapturedRoom) {
    for surface in room.walls {
        if let frame = self.arView.session.currentFrame {
            var image: CGImage? = nil
            VTCreateCGImageFromCVPixelBuffer(frame.capturedImage, options: nil, imageOut: &image)

            let wallTransform = surface.transform
            let cameraTransform = frame.camera.transform
            let intrinsics = frame.camera.intrinsics
            let projectionMatrix = frame.camera.projectionMatrix
            let width = surface.dimensions.y
            let height = surface.dimensions.x
            let inverseCameraTransform = simd_inverse(cameraTransform)

            let wallTopRight = simd_float4(width/2, height/2, 0, 1)
            let wallTopLeft = simd_float4(-width/2, height/2, 0, 1)
            let wallBottomRight = simd_float4(width/2, -height/2, 0, 1)
            let wallBottomLeft = simd_float4(-width/2, -height/2, 0, 1)

            let worldTopRight = wallTransform * wallTopRight
            let worldTopLeft = wallTransform * wallTopLeft
            let worldBottomRight = wallTransform * wallBottomRight
            let worldBottomLeft = wallTransform * wallBottomLeft

            let cameraTopRight = projectionMatrix * inverseCameraTransform * worldTopRight
            let cameraTopLeft = projectionMatrix * inverseCameraTransform * worldTopLeft
            let cameraBottomRight = projectionMatrix * inverseCameraTransform * worldBottomRight
            let cameraBottomLeft = projectionMatrix * inverseCameraTransform * worldBottomLeft

            let imageTopRight = intrinsics * simd_float3(cameraTopRight.x / cameraTopRight.w,
                                                         cameraTopRight.y / cameraTopRight.w,
                                                         cameraTopRight.z / cameraTopRight.w)
            let imageTopLeft = intrinsics * simd_float3(cameraTopLeft.x / cameraTopLeft.w,
                                                        cameraTopLeft.y / cameraTopLeft.w,
                                                        cameraTopLeft.z / cameraTopLeft.w)
            let imageBottomRight = intrinsics * simd_float3(cameraBottomRight.x / cameraBottomRight.w,
                                                            cameraBottomRight.y / cameraBottomRight.w,
                                                            cameraBottomRight.z / cameraBottomRight.w)
            let imageBottomLeft = intrinsics * simd_float3(cameraBottomLeft.x / cameraBottomLeft.w,
                                                           cameraBottomLeft.y / cameraBottomLeft.w,
                                                           cameraBottomLeft.z / cameraBottomLeft.w)

            let topRight = CGPoint(x: CGFloat(imageTopRight.x), y: CGFloat(imageTopRight.y))
            let topLeft = CGPoint(x: CGFloat(imageTopLeft.x), y: CGFloat(imageTopLeft.y))
            let bottomRight = CGPoint(x: CGFloat(imageBottomRight.x), y: CGFloat(imageBottomRight.y))
            let bottomLeft = CGPoint(x: CGFloat(imageBottomLeft.x), y: CGFloat(imageBottomLeft.y))

            if let image {
                let filter = CIFilter.perspectiveCorrection()
                filter.inputImage = CIImage(image: UIImage(cgImage: image))
                filter.topRight = topRight
                filter.topLeft = topLeft
                filter.bottomRight = bottomRight
                filter.bottomLeft = bottomLeft
                let transformedImage = filter.outputImage
                if let transformedImage {
                    let context = CIContext()
                    if let outputImage = context.createCGImage(transformedImage, from: transformedImage.extent) {
                        let wall = Wall(id: surface.identifier, image: outputImage, surface: surface)
                        self.walls.append(wall)
                    }
                }
            }
        }
    }
}
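One thing that stands out in the post above: the points are run through both projectionMatrix and the intrinsics, which projects twice. A point should be projected either with the intrinsics from camera space directly, or with the projection matrix followed by a viewport transform, not both, which would explain the oversized result. A plain-Swift pinhole sketch of the intrinsics-only route (no ARKit types; the Intrinsics field names are illustrative):

```swift
import Foundation

// Minimal pinhole projection of a camera-space point to pixel coordinates.
// ARKit's camera space looks down -Z, so depth is -p.z; image Y grows downward.
struct Intrinsics { var fx, fy, cx, cy: Double }

func project(_ p: (x: Double, y: Double, z: Double),
             _ k: Intrinsics) -> (u: Double, v: Double)? {
    let depth = -p.z                    // distance in front of the camera
    guard depth > 0 else { return nil } // point is behind the camera
    return (u: k.fx * p.x / depth + k.cx,
            v: k.fy * -p.y / depth + k.cy)
}
```

So in the snippet above, the fix to try would be applying inverseCameraTransform only (no projectionMatrix) and then dividing by the camera-space depth before multiplying by the intrinsics.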
1 Reply
233 Views
I am using LiDAR to measure the distance between a target point and the iPhone Pro. I get the correct distance only when I am more than 70 cm away from the target point. I need that value to be accurate for distances below 70 cm as well. Is this a coding-level issue, or is it a limitation of LiDAR?
Posted by Ramneet.
2 Replies
409 Views
I have the following issue with running two AR services. I am trying to develop an app for my master's thesis. Case 1: I first scan the room using the RoomPlan API. Then I stop the RoomPlan session and start the RealityKit session. When the RealityKit session starts, the camera shows nothing but a black screen. Case 2: After hitting the issue in case 1, I tried a separate test app with two separate screens, one for the RoomPlan API and one for RealityKit, with no relation between them. But as soon as I introduced the RoomPlan API, RealityKit stopped working, with the same black screen as above. There might be some state changed by the RoomPlan API that prevents RealityKit from accessing the camera. Let me know if you have any idea about it or any sample. I am using the following stack: Xcode (latest), SwiftUI, and the latest OS on my Mac mini and iPhone.
Posted by shohandot.
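A hedged sketch for the black-screen handoff described above, not a confirmed fix: stop RoomPlan's capture session explicitly before RealityKit needs the camera, then re-run the ARView's session with fresh options rather than reusing whatever state RoomPlan left behind. The names roomCaptureView and arView follow Apple's sample apps:

```swift
import ARKit
import RealityKit
import RoomPlan

// Release the camera from RoomPlan, then restart world tracking cleanly.
func handOffToRealityKit(from roomCaptureView: RoomCaptureView, to arView: ARView) {
    roomCaptureView.captureSession.stop()
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}
```

If the black screen persists even in a fresh view hierarchy, it is worth confirming in the console that no camera-access or session-interruption errors are being reported.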
0 Replies
166 Views
Following along with the video here: https://developer.apple.com/videos/play/wwdc2022/10127/?time=410 At the 6:50 mark we set up previewVisualizer, but we're not actually shown the implementation of this type. I think it would be helpful, as I am having a hard time showing the white visualization lines that appear when scanning.
Posted by D816Man.
1 Reply
215 Views
Hello Community, I'm encountering an issue with the latest iOS 17 update, specifically related to RoomPlan version-2. In iOS 16, when using RoomPlan version-1, we were able to display stairs in our app. However, after upgrading to iOS 17 and implementing RoomPlan version-2, the stairs are no longer visible. Despite thorough investigation, I couldn't find any option within the code to show or hide stairs, or any other objects for that matter. It seems like a specific issue with the update rather than a coding error on our part. Has anyone else encountered a similar problem? If so, I would greatly appreciate any insights or solutions you might have. It's crucial for our app functionality to have stairs displayed accurately, and we're currently at a loss on how to address this issue. Thank you in advance for any assistance you can provide. Best regards
Posted by Ramneet.
2 Replies
373 Views
RoomPlan: getting an error for ceilingHeight and floorHeight from CapturedRoom. I am using the RoomPlan version-2 framework to scan houses, and I need ceilingHeight and floorHeight. I am successfully parsing the CapturedRoom data into JSON and getting all keys and their values except for "coreModel" (of type RSFloorPlan). Does Apple support coreModel: RSFloorPlan yet? The "coreModel" key holds the ceilingHeight and floorHeight values, but all I get is an encoded string as its value:

coreModel: qkgdxIDMaicOkpwW1I/h/V4C.........................7nCom3Lpu0CM=

I will really appreciate your help.
Posted by Ramneet.
0 Replies
334 Views
I am making an app with RoomPlan using the official sample code. A model of the room is generated as it is scanned, and when the scan is complete, chairs and other objects are aligned parallel to the desk. I wanted to stop this behavior, so I decided to use the beautifyObjects option.

import UIKit
import RoomPlan

class RoomCaptureViewController: UIViewController, RoomCaptureViewDelegate, RoomCaptureSessionDelegate {
    @IBOutlet var exportButton: UIButton?
    @IBOutlet var doneButton: UIBarButtonItem?
    @IBOutlet var cancelButton: UIBarButtonItem?
    @IBOutlet var activityIndicator: UIActivityIndicatorView?

    private var isScanning: Bool = false
    private var roomCaptureView: RoomCaptureView!
    private var roomCaptureSessionConfig: RoomCaptureSession.Configuration = RoomCaptureSession.Configuration()
    private var roomBuilder: RoomBuilder!
    private var processedResult: CapturedRoom?
    private var finalResults: CapturedRoom?

    override func viewDidLoad() {
        super.viewDidLoad()
        // Set up after loading the view.
        setupRoomBuilder()
        setupRoomCaptureView()
        activityIndicator?.stopAnimating()
    }

    private func setupRoomBuilder() {
        let beautifyObjectsEnabled = UserDefaults.standard.bool(forKey: "beautifyObjectsEnabled")
        if beautifyObjectsEnabled {
            roomBuilder = RoomBuilder(options: [.beautifyObjects])
        } else {
            roomBuilder = RoomBuilder(options: [])
        }
    }

    private func setupRoomCaptureView() {
        roomCaptureView = RoomCaptureView(frame: view.bounds)
        roomCaptureView.captureSession.delegate = self
        roomCaptureView.delegate = self
        view.insertSubview(roomCaptureView, at: 0)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startSession()
    }

    override func viewWillDisappear(_ flag: Bool) {
        super.viewWillDisappear(flag)
        stopSession()
    }

    private func startSession() {
        isScanning = true
        roomCaptureView?.captureSession.run(configuration: roomCaptureSessionConfig)
        setActiveNavBar()
    }

    private func stopSession() {
        isScanning = false
        roomCaptureView?.captureSession.stop()
        setCompleteNavBar()
    }

    // Decide to post-process and show the final results.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        Task {
            do {
                let capturedRoom = try await roomBuilder.capturedRoom(from: roomDataForProcessing)
                DispatchQueue.main.async {
                    self.finalResults = capturedRoom
                    self.exportButton?.isEnabled = true
                    self.activityIndicator?.stopAnimating()
                }
            } catch {
                print("Error processing room data: \(error.localizedDescription)")
            }
        }
        return true
    }

    // Access the final post-processed results.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        finalResults = processedResult
        self.exportButton?.isEnabled = true
        self.activityIndicator?.stopAnimating()
    }

    @IBAction func doneScanning(_ sender: UIBarButtonItem) {
        if isScanning {
            stopSession()
        } else {
            cancelScanning(sender)
        }
        self.exportButton?.isEnabled = false
        self.activityIndicator?.startAnimating()
    }

    @IBAction func cancelScanning(_ sender: UIBarButtonItem) {
        navigationController?.dismiss(animated: true)
    }

    // Export the USDZ output by specifying the `.parametric` export option.
    // Alternatively, `.mesh` exports a nonparametric file and `.all`
    // exports both in a single USDZ.
    @IBAction func exportResults(_ sender: UIButton) {
        let destinationFolderURL = FileManager.default.temporaryDirectory.appending(path: "Export")
        let destinationURL = destinationFolderURL.appending(path: "Room.usdz")
        let capturedRoomURL = destinationFolderURL.appending(path: "Room.json")
        do {
            try FileManager.default.createDirectory(at: destinationFolderURL, withIntermediateDirectories: true)
            let jsonEncoder = JSONEncoder()
            let jsonData = try jsonEncoder.encode(finalResults)
            try jsonData.write(to: capturedRoomURL)
            try finalResults?.export(to: destinationURL, exportOptions: .parametric)
            let activityVC = UIActivityViewController(activityItems: [destinationFolderURL], applicationActivities: nil)
            activityVC.modalPresentationStyle = .popover
            present(activityVC, animated: true, completion: nil)
            if let popOver = activityVC.popoverPresentationController {
                popOver.sourceView = self.exportButton
            }
        } catch {
            print("Error = \(error)")
        }
    }

    private func setActiveNavBar() {
        UIView.animate(withDuration: 1.0, animations: {
            self.cancelButton?.tintColor = .white
            self.doneButton?.tintColor = .white
            self.exportButton?.alpha = 0.0
        }, completion: { complete in
            self.exportButton?.isHidden = true
        })
    }

    private func setCompleteNavBar() {
        self.exportButton?.isHidden = false
        UIView.animate(withDuration: 1.0) {
            self.cancelButton?.tintColor = .systemBlue
            self.doneButton?.tintColor = .systemBlue
            self.exportButton?.alpha = 1.0
        }
    }
}

The captureView(shouldPresent:error:) method contains the main changes. I have confirmed in the debugger that the roomBuilder options change according to the buttons in the UI. If anyone knows more about the behavior of this option, please give me advice.
Posted by mqcmd196.
0 Replies
218 Views
Hello, I used to be able to encode CapturedRoomData using JSONEncoder with the code below:

// Encode CapturedRoomData
func encodeRoomData(_ roomData: CapturedRoomData) -> Data? {
    print("#ECRD1 - Data: \(roomData)")
    do {
        let encodedRoom = try JSONEncoder().encode(roomData)
        print("#ECRD2 - Encoded: \(encodedRoom.description)")
        return encodedRoom
    } catch {
        print("#ECRD3 - Failed with error: \(error)")
    }
    return nil
}

A few weeks ago I noticed this approach no longer works. The encoding fails and I get the following error printed:

#ECRD3 - Failed with error: invalidValue(RoomPlan.CapturedRoomData, Swift.EncodingError.Context(codingPath: [], debugDescription: "Invalid data", underlyingError: nil))

Can anyone help me find the root of this problem? For reference, here's what the printed CapturedRoomData looks like (with the keyframes omitted):

#ECRD1 - Data: CapturedRoomData(keyframes: [...], coreAsset: <RSAsset: 0x283988bd0>, arFrameReferenceOriginTransform: simd_float4x4([[0.9995456, 0.0, 0.030147359, 0.0], [0.0, 1.0, 0.0, 0.0], [-0.030147359, 0.0, 0.9995456, 0.0], [0.38664898, 0.93699455, 0.38685757, 1.0]]))
Posted by Mortekay.
1 Reply
232 Views
I start the session and take a scan. Everything looks perfect in 3D while scanning, but after completing the scan, the 3D model presented on screen gets morphed after a gap of about 4 seconds. First it shows the correct model; after 4 to 5 seconds the USDZ morphs so that straight walls turn into curves, and where I have one wall I get 2 to 3 overlapping walls. I can't work out why it morphs like that. All I do is present the 3D model on screen from the shouldPresent delegate method:

func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
    print("shouldPresentdelegate")
    if let error = error {
        print("shouldPresent error::", error)
    }
    return true
}

First it shows the correct model, then within 4 to 5 seconds it reloads to another 3D form that is not correct at all. Is there any reason behind this? Can anyone please help me with this? Your help is appreciated. Thank you.
1 Reply
229 Views
Hello everyone, I want to know whether RoomPlan gives us complete information about furniture sizing and alignment, i.e., how each object is positioned in the room (whether it faces forward or backward, and so on).
1 Reply
254 Views
I'm trying to decipher this RoomPlan API data output. Dimensions: the SIMD3 is width and height, and I think thickness, in meters, but the spatial information is not clear to me and I can't find any info on it. Any insights on this?

*** CAPTURED ROOM OBJECTS ***

Walls:

Identifier: 9F4BEEBB-990C-42F3-A0BC-912E1F770877
Dimensions: SIMD3(5.800479, 2.4299998, 0.0)
Category: wall
Transform:
SIMD4(0.25821668, 0.0, 0.966087, 0.0)
SIMD4(0.0, 1.0, 0.0, 0.0)
SIMD4(-0.966087, 0.0, 0.2582167, 0.0)
SIMD4(2.463065, -0.27346277, -0.5366996, 1.0)

Identifier: A72544F5-068D-4F19-8FA4-60C1331003E3
Dimensions: SIMD3(2.28993, 2.4299998, 0.0)
Category: wall
Transform:
SIMD4(0.966087, 0.0, -0.2582166, 0.0)
SIMD4(0.0, 1.0, 0.0, 0.0)
SIMD4(0.25821656, 0.0, 0.966087, 0.0)
SIMD4(0.608039, -0.27346277, -3.0429342, 1.0)
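On the transform question above: the four SIMD4 values printed are the columns of a column-major 4x4 matrix. Columns 0 to 2 are the wall's local X, Y, and Z axes expressed in session space, and column 3 is the wall-center position in meters. A plain-Swift sketch decoding the first wall's pose (no simd dependency; the layout matches the printout above):

```swift
import Foundation

// The first wall's transform, column-major, as printed above.
let columns: [[Double]] = [
    [0.25821668, 0.0, 0.966087, 0.0],        // local X axis
    [0.0, 1.0, 0.0, 0.0],                    // local Y axis (up)
    [-0.966087, 0.0, 0.2582167, 0.0],        // local Z axis (wall normal)
    [2.463065, -0.27346277, -0.5366996, 1.0] // translation: wall center (m)
]

// Wall-center position in session coordinates, in meters.
let position = (x: columns[3][0], y: columns[3][1], z: columns[3][2])

// Heading about the vertical axis, from where the local X axis points.
let yawDegrees = atan2(columns[0][2], columns[0][0]) * 180 / .pi

print(position, yawDegrees)
```

For this wall the heading comes out near 75 degrees. The Dimensions SIMD3 appears to be (width, height, thickness) in meters, with walls reported as zero-thickness planes.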
0 Replies
222 Views
It has been a while since I looked at RoomPlan. I noticed when I was poking at the CapturedRoom that there is binary data called coreModel... Does anyone know what this is? Is it related to ModelProvider? Thanks in advance.
Posted by bambamyi.
1 Reply
345 Views
I'm using RoomPlan to create a 2-D layout of a room. Because it's just a single room, I want to present it squared up; see my picture. First off, is there any option in the ARView used for the scan to zero out the orientation so that it creates a room layout that is not at an angle? If not, is there a well-known mathematical formula to snap this to the closest 90 degrees? I tried changing the rotation of each wall to snap to the nearest 90 degrees, but it wasn't quite good enough, as the walls no longer joined together cleanly. Perhaps there's an algorithm for doing this while preserving the relative locations of everything?
Posted by kditrag.
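On the snap-to-90-degrees question above: snapping each wall independently breaks the joins because each wall rotates about its own center. Rotating the whole plan by a single grid angle preserves every relative position. A minimal sketch, assuming walls reduced to 2-D centers and headings in radians (the Wall2D type is illustrative, not RoomPlan API):

```swift
import Foundation

// Square up a 2-D floor plan by finding the dominant wall heading and
// rotating the entire plan by its negative, preserving relative layout.
struct Wall2D {
    var center: (x: Double, y: Double)
    var angle: Double // heading in radians
}

func squaredUp(_ walls: [Wall2D]) -> [Wall2D] {
    // Average headings via the 4x-angle trick, so walls that differ by
    // 90 degrees vote for the same grid orientation.
    var sx = 0.0, sy = 0.0
    for w in walls {
        sx += cos(4 * w.angle)
        sy += sin(4 * w.angle)
    }
    let gridAngle = atan2(sy, sx) / 4
    let c = cos(-gridAngle), s = sin(-gridAngle)
    return walls.map { w in
        Wall2D(center: (w.center.x * c - w.center.y * s,
                        w.center.x * s + w.center.y * c),
               angle: w.angle - gridAngle)
    }
}
```

After this global rotation, a mostly-rectangular room lands axis-aligned in one step; any small residual per-wall deviations can then be rounded without the walls drifting apart.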