In the WWDC23 video on the RoomPlan enhancements, it says that it is now possible to set a custom ARSession for the RoomCaptureSession. But how do you actually set the configuration for the custom ARSession?
init() {
    let arConfig = ARWorldTrackingConfiguration()
    arConfig.worldAlignment = .gravityAndHeading

    arSession = ARSession()
    roomCaptureView = RoomCaptureView(frame: CGRect(x: 0, y: 0, width: 42, height: 42), arSession: arSession)

    sessionConfig = RoomCaptureSession.Configuration()

    roomCaptureView.captureSession.delegate = self
    roomCaptureView.delegate = self
}
However, I keep getting an error that self is used in the property access before all stored properties are initialised.
What can I do to fix it?
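One way to avoid the error is to finish initializing every stored property (and call super.init if you are subclassing) before touching self, and then apply the configuration by running it on the custom ARSession yourself. A minimal sketch, assuming a UIViewController subclass and the iOS 17 RoomCaptureView(frame:arSession:) initializer; whether running the ARWorldTrackingConfiguration directly on the custom session is the intended pattern is an assumption here:

import ARKit
import RoomPlan
import UIKit

class CaptureViewController: UIViewController, RoomCaptureViewDelegate, RoomCaptureSessionDelegate {
    private let arSession: ARSession
    private let roomCaptureView: RoomCaptureView
    private let sessionConfig: RoomCaptureSession.Configuration

    init() {
        // Initialize all stored properties first -- no use of `self` yet.
        arSession = ARSession()
        roomCaptureView = RoomCaptureView(frame: .zero, arSession: arSession)
        sessionConfig = RoomCaptureSession.Configuration()
        super.init(nibName: nil, bundle: nil)

        // `self` is fully initialized here, so delegate assignment is allowed.
        roomCaptureView.captureSession.delegate = self
        roomCaptureView.delegate = self
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        // Assumption: apply the world-tracking configuration by running it on the
        // custom session, then start the room-capture session on top of it.
        let arConfig = ARWorldTrackingConfiguration()
        arConfig.worldAlignment = .gravityAndHeading
        arSession.run(arConfig)
        roomCaptureView.captureSession.run(configuration: sessionConfig)
    }

    // RoomCaptureViewDelegate stubs, provided for completeness.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool { true }
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) { }
}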
RoomPlan
Create parametric 3D scans of rooms and room-defining objects.
Posts under the RoomPlan tag:
Is there a way to access the coordinates of where the camera is while scanning the room with RoomPlan?
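A minimal sketch of one way to read the camera pose during a scan, assuming the capture session's underlying ARSession is accessible and this hypothetical object is registered as the RoomCaptureSession delegate:

import ARKit
import RoomPlan

final class CameraPoseLogger: NSObject, RoomCaptureSessionDelegate {
    // Called repeatedly while scanning; read the camera pose from the current frame.
    func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
        guard let frame = session.arSession.currentFrame else { return }
        // The translation column of the camera transform is the device position
        // in the session's world coordinate space (meters).
        let position = simd_make_float3(frame.camera.transform.columns.3)
        print("Camera position while scanning: \(position)")
    }
}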
Hello, I am trying to make an app that involves room scanning and then placing of imaginary objects in the room. I had two questions about the specifics behind this.
Is it possible for RoomPlan to include the ceiling when scanning the room?
Is it possible to place objects in AR while RoomPlan is running, or is it necessary to wait until after the scan is done?
Is it possible to access the RoomPlan API from Objective-C? I cannot figure out how to include the RoomPlan framework into some legacy Objective-C code I have. I can include the RoomPlan.h header but it still does not recognize any of the API classes. I also could not figure out if there was a way to use RoomPlan-Swift.h to expose the API to the Objective-C code.
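RoomPlan's types are Swift-only, so they cannot be used directly from Objective-C, and the generated Swift header will not expose them. One common workaround is a small @objc Swift wrapper that keeps all RoomPlan types on the Swift side and exposes only Objective-C-representable values through the app's "<ModuleName>-Swift.h" header. A minimal sketch (the bridge class and its methods are hypothetical):

import Foundation
import RoomPlan

// Hypothetical bridge class: RoomPlan types stay on the Swift side; only
// Objective-C-representable values cross the boundary.
@objc public final class RoomPlanBridge: NSObject {
    private var latestRoom: CapturedRoom?

    @objc public static var isSupported: Bool { RoomCaptureSession.isSupported }

    // Store the most recent scan result from your Swift-side capture code.
    func update(with room: CapturedRoom) {
        latestRoom = room
    }

    // Callable from Objective-C via the generated Swift header.
    @objc public func exportLatestRoom(to url: URL) -> Bool {
        guard let room = latestRoom else { return false }
        do {
            try room.export(to: url, exportOptions: .parametric)
            return true
        } catch {
            return false
        }
    }
}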
Since moving up to iOS 18 last week I am getting an indication that there was a significant drop in IMU data being sent. Using the search capability, I can find very little information in the Developer Documentation that will tell me what the cause is and how to remedy it. Is there some documentation repository, like Tech Notes, that will tell me what I need to know to get going again? What additional sources of documentation are available for developers? The search engine used for the Developer Documentation just does not cut it, because it delivers a lot of useless entries that have no obvious relevance to my search terms.
"2024-06-20 19:27:00.669334-0500 RoomPlanExampleApp[902:299709] [Technique] ARWorldTrackingTechnique <0x104bf6d80>: SLAM error callback: Error Domain=Slam Error Code=7 "Non fatal error occurred due to significant drop in a IMU data" UserInfo={NSDescription=Non fatal error occurred due to significant drop in a IMU data, NSLocalizedFailureReason=SlamEngineNodeGroup Failure: IMU issue: gyro data stream verification failed [Significant data drop]. Failed on timestamp: 53902.785827, Last known timestamp: 53901.416828, Delta: 1.368999, System timestamp: 53902.786251, Delta between system and frame: 0.000423. }"
This is one of the files being looked for during initialization of the RoomPlan WWDC demo package, but it cannot be found since moving to iOS 18.0; it is not anywhere to be found since the upgrade.
Reference is 2024-06-18 16:03:36.871062-0500 RoomPlanExampleApp[860:159744] [loading] Unable to create bundle at URL (file:///System/Library/CoreServices/SystemVersion.bundle): does not exist or not a directory (0)
My goal is to modify CapturedRooms and load them back into the StructureBuilder to generate a new CapturedStructure.
Since CapturedRooms cannot be modified directly, I stored them as JSON, modified the parameters (e.g. switching object categories) and decoded them back into a CapturedRoom object. So far so good, the object is loaded correctly. But when I put them into capturedStructure(), all the original parts of the CapturedRoom are used.
As some of you may have already noticed, there is an undocumented CoreModel stored in CapturedRooms when you export them in JSON format. It seems that the StructureBuilder only uses this CoreModel to compose the output.
So here my question to the forum:
Does anybody know a way to edit a CapturedRoom so the StructureBuilder respects those changes and composes a new structure including those changes?
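For context, a minimal sketch of the round trip described above, assuming CapturedRoom's Codable conformance and the StructureBuilder API; the open question is why the undocumented CoreModel payload, rather than the edited top-level fields, appears to drive the composed output:

import Foundation
import RoomPlan

// Decode an edited CapturedRoom JSON and feed it back into StructureBuilder.
func rebuildStructure(from editedJSON: Data) async throws -> CapturedStructure {
    let room = try JSONDecoder().decode(CapturedRoom.self, from: editedJSON)
    let builder = StructureBuilder(options: [])
    return try await builder.capturedStructure(from: [room])
}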
Hello! I want to create an indoor mapping application in Swift using the LiDAR scanner. I searched among frameworks and found that ARKit, RealityKit and RoomPlan would be useful. What is the proper way to create a 2D indoor mapping app, and what is the proper way to create a 3D indoor mapping app? Are there any modifications I have to make to my code in order to have both?
Hi there,
I would like the user to be able to tap on a wall that has been highlighted as scanned (the white outline) and see basic information about the wall (in a pop-up modal view) without being taken out of the scan session.
As a first step, though, I'd simply like to be able to tap on the scanned wall while still in the session and see the data about that CapturedRoom.Surface in the NSLog.
I'm storing the CapturedRoom on each update of the session using the RoomCaptureSessionDelegate, and I have added a UITapGestureRecognizer to the room capture view.
However, I've tried a number of ways (hit testing, raycasting) and I'm unable to target the wall behind the user's tap gesture.
If anyone can give any advice, even if it's just the principle of how to achieve this, I'd appreciate it.
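As a starting point, a minimal sketch of one way to resolve a tap to a wall without ray/plane math, assuming the latest CapturedRoom is kept from the session delegate and the underlying ARSession is reachable: project each wall's center into screen space and pick the one closest to the tap. A more precise version would intersect the tap ray with each wall's plane derived from surface.transform and surface.dimensions.

import ARKit
import RoomPlan
import UIKit

// Project each wall's center into screen space and return the wall whose
// projection lies closest to the tap point. Walls behind the camera are not
// filtered out here; a production version should check that as well.
func wall(at tapPoint: CGPoint,
          in room: CapturedRoom,
          session: ARSession,
          viewportSize: CGSize) -> CapturedRoom.Surface? {
    guard let camera = session.currentFrame?.camera else { return nil }

    var best: (surface: CapturedRoom.Surface, distance: CGFloat)?
    for candidate in room.walls {
        // The wall center in world space is the translation column of its transform.
        let center = simd_make_float3(candidate.transform.columns.3)
        let projected = camera.projectPoint(center,
                                            orientation: .portrait,
                                            viewportSize: viewportSize)
        let distance = hypot(projected.x - tapPoint.x, projected.y - tapPoint.y)
        if best == nil || distance < best!.distance {
            best = (candidate, distance)
        }
    }
    return best?.surface
}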
Hello,
I am working on an AR application to visualize a life-size room. I am working with Unity 2023.3, Apple ARKIT XR Plugin 6.0.0-pre.8 and a 2021 5th gen iPad.
First I scan a room with RoomPlan to get a USDZ file. I open it in Blender to make sure I have the right data (I do) and export it to FBX to use in Unity.
Then I import the fbx to Unity and I use it as a prefab to instantiate it when I click on a detected floor.
I build my application in Unity, then in Xcode to use it on my iPad. But when the room is displayed, it is way too small.
I tried adding a slider to scale up the room's GameObject, and I added a plugin to visualize my Unity scene in the built application. The room scales up in the Unity scene but not in the application.
Has anyone ever had this issue, and if so, how did you fix it?
Best regards,
Angel Garcia
In larger scenes, I need to record motion trajectories. RoomCaptureSession always starts from (0, 0, 0), so I use the last tracked point as an offset to connect multiple trajectory segments, much like StructureBuilder merging models.
But when StructureBuilder merges the rooms, it eliminates some of the models, which makes the trajectory points I saved lose accuracy, and I cannot tell how much of the scene was eliminated between them.
Is there any way you can help me?
invalidValue(-nan, Swift.EncodingError.Context(codingPath: [CapturedVolumeCodingKeys(stringValue: "rooms", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0), CapturedVolumeCodingKeys(stringValue: "openings", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0), CodingKeys(stringValue: "dimensions", intValue: nil), _JSONKey(stringValue: "Index 0", intValue: 0)], debugDescription: "Unable to encode Float.nan directly in JSON.", underlyingError: nil))
Why does this exception occur during encoding? All of the scan data comes straight from the CapturedRoom and has not been modified.
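One workaround, assuming the NaN really does come from an opening's dimensions rather than from any modification: tell JSONEncoder how to represent non-conforming floats so it doesn't throw. A minimal sketch; the decoder needs a matching strategy when reading the file back.

import Foundation
import RoomPlan

// Encode a CapturedRoom while tolerating NaN/infinite floats instead of
// throwing EncodingError.invalidValue.
func encodeRoomAllowingNaN(_ room: CapturedRoom) throws -> Data {
    let encoder = JSONEncoder()
    encoder.nonConformingFloatEncodingStrategy = .convertToString(
        positiveInfinity: "inf",
        negativeInfinity: "-inf",
        nan: "nan"
    )
    return try encoder.encode(room)
}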
Hi Team
Is there a way to extract a colorized scan as well using the RoomPlan SDK? If yes, can you point me to the right reference link?
Does the RoomPlan SDK provide the dimensions of the room?
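On the dimensions question, a minimal sketch: every surface and object in a CapturedRoom carries a dimensions vector in meters, so per-wall sizes can be read directly. (As far as I know, RoomPlan itself does not produce a colorized or textured scan; that would require capturing ARKit imagery separately.)

import RoomPlan

func printWallDimensions(of room: CapturedRoom) {
    for wall in room.walls {
        // simd_float3 extent of the wall (assumed here: x = length, y = height).
        let size = wall.dimensions
        print("Wall \(wall.identifier): \(size.x) m long, \(size.y) m high")
    }
}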
If my app utilized the RoomPlan API to create a parametric representation of the room, would it open on iPhones that don't have LiDAR? I'm aware the iPhone models equipped with LiDAR are the iPhone 12 Pro & Pro Max, iPhone 13 Pro & Pro Max, iPhone 14 Pro & Pro Max, and iPhone 15 Pro & Pro Max.
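RoomPlan requires a LiDAR-equipped device, so the scanning flow won't run on the other iPhones, but the app itself can still open there if the feature is gated at runtime. A minimal sketch:

import RoomPlan

if RoomCaptureSession.isSupported {
    // Present the RoomCaptureView-based scanning flow.
} else {
    // Fall back to a non-LiDAR experience (e.g. manual room entry).
}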
I am trying to determine the corners of a RoomPlan-detected wall using the information available in the ARView session's frame, but can't quite figure out what I'm doing wrong. The corners appear to be correct relative to each other, but the wall appears too large when I render it. (I'm also not sure I'm handling the image rotation correctly, which may be compounding my problem.) Here is the code I currently have, along with a sample image and the resulting image when I pass it through the perspective filter. It is close but isn't cropping the walls and floors correctly.
func captureSession(_ session: RoomCaptureSession, didChange room: CapturedRoom) {
    for surface in room.walls {
        if let frame = self.arView.session.currentFrame {
            var image: CGImage? = nil
            VTCreateCGImageFromCVPixelBuffer(frame.capturedImage, options: nil, imageOut: &image)

            let wallTransform = surface.transform
            let cameraTransform = frame.camera.transform
            let intrinsics = frame.camera.intrinsics
            let projectionMatrix = frame.camera.projectionMatrix
            let width = surface.dimensions.y
            let height = surface.dimensions.x
            let inverseCameraTransform = simd_inverse(cameraTransform)

            let wallTopRight = simd_float4(width / 2, height / 2, 0, 1)
            let wallTopLeft = simd_float4(-width / 2, height / 2, 0, 1)
            let wallBottomRight = simd_float4(width / 2, -height / 2, 0, 1)
            let wallBottomLeft = simd_float4(-width / 2, -height / 2, 0, 1)

            let worldTopRight = wallTransform * wallTopRight
            let worldTopLeft = wallTransform * wallTopLeft
            let worldBottomRight = wallTransform * wallBottomRight
            let worldBottomLeft = wallTransform * wallBottomLeft

            let cameraTopRight = projectionMatrix * inverseCameraTransform * worldTopRight
            let cameraTopLeft = projectionMatrix * inverseCameraTransform * worldTopLeft
            let cameraBottomRight = projectionMatrix * inverseCameraTransform * worldBottomRight
            let cameraBottomLeft = projectionMatrix * inverseCameraTransform * worldBottomLeft

            let imageTopRight = intrinsics * simd_float3(cameraTopRight.x / cameraTopRight.w, cameraTopRight.y / cameraTopRight.w, cameraTopRight.z / cameraTopRight.w)
            let imageTopLeft = intrinsics * simd_float3(cameraTopLeft.x / cameraTopLeft.w, cameraTopLeft.y / cameraTopLeft.w, cameraTopLeft.z / cameraTopLeft.w)
            let imageBottomRight = intrinsics * simd_float3(cameraBottomRight.x / cameraBottomRight.w, cameraBottomRight.y / cameraBottomRight.w, cameraBottomRight.z / cameraBottomRight.w)
            let imageBottomLeft = intrinsics * simd_float3(cameraBottomLeft.x / cameraBottomLeft.w, cameraBottomLeft.y / cameraBottomLeft.w, cameraBottomLeft.z / cameraBottomLeft.w)

            let topRight = CGPoint(x: CGFloat(imageTopRight.x), y: CGFloat(imageTopRight.y))
            let topLeft = CGPoint(x: CGFloat(imageTopLeft.x), y: CGFloat(imageTopLeft.y))
            let bottomRight = CGPoint(x: CGFloat(imageBottomRight.x), y: CGFloat(imageBottomRight.y))
            let bottomLeft = CGPoint(x: CGFloat(imageBottomLeft.x), y: CGFloat(imageBottomLeft.y))

            if let image {
                let filter = CIFilter.perspectiveCorrection()
                filter.inputImage = CIImage(image: UIImage(cgImage: image))
                filter.topRight = topRight
                filter.topLeft = topLeft
                filter.bottomRight = bottomRight
                filter.bottomLeft = bottomLeft
                let transformedImage = filter.outputImage
                if let transformedImage {
                    let context = CIContext()
                    if let outputImage = context.createCGImage(transformedImage, from: transformedImage.extent) {
                        let wall = Wall(id: surface.identifier, image: outputImage, surface: surface)
                        self.walls.append(wall)
                    }
                }
            }
        }
    }
}
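One thing that stands out (a guess, not a confirmed fix): the code applies both projectionMatrix and intrinsics, which projects the points twice. A minimal sketch of projecting with the intrinsics alone, assuming the goal is pixel coordinates in frame.capturedImage (which is in sensor/landscape orientation, so interface-orientation handling still has to happen afterwards):

import simd
import CoreGraphics

// Transform a world-space corner into camera space, then apply the pinhole
// intrinsics once. ARKit's camera looks down -Z with +Y up, while image
// coordinates have +y down, hence the sign flips before dividing by depth.
func imagePoint(for worldPoint: simd_float4,
                cameraTransform: simd_float4x4,
                intrinsics: simd_float3x3) -> CGPoint {
    let cameraPoint = simd_inverse(cameraTransform) * worldPoint
    let projected = intrinsics * simd_float3(cameraPoint.x, -cameraPoint.y, -cameraPoint.z)
    return CGPoint(x: CGFloat(projected.x / projected.z),
                   y: CGFloat(projected.y / projected.z))
}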
I am using LiDAR to measure the distance between the target point and the iPhone Pro. I am getting the correct distance only if I am more than 70 cm away from the target point. I need that value to be accurate for distances below 70 cm as well.
Is this a coding-level issue, or is it a limitation of the LiDAR sensor?
I have the following issue regarding running two AR services. I am trying to develop an app for my master's thesis.
Case 1: I first scan the room using the RoomPlan API. Then I stop the RoomPlan session and start the RealityKit session. When the RealityKit session starts, the camera shows nothing but a black screen.
Case 2: After hitting the issue in case 1, I tried a separate test app with two separate screens, one for the RoomPlan API and one for RealityKit, with no relation between them. But as soon as I introduced the RoomPlan API, RealityKit stopped working, showing the same black screen as above.
There might be some state changed by the RoomPlan API that prevents RealityKit from accessing the camera. Let me know if you have any idea about it, or any sample.
I am using the following stack:
Xcode (latest); SwiftUI; latest OS on the Mac mini and iPhone
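One thing worth trying, as a sketch rather than a confirmed fix: stop the RoomCaptureSession before the RealityKit screen appears, then explicitly run a fresh configuration on the ARView's own session, since a black screen suggests that session isn't running after RoomPlan releases the camera.

import ARKit
import RealityKit
import RoomPlan

func switchToRealityKit(from captureView: RoomCaptureView, arView: ARView) {
    // Stop RoomPlan's capture session first so it releases the camera.
    captureView.captureSession.stop()

    // Explicitly (re)start the ARView's session with a fresh configuration.
    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal]
    arView.session.run(config, options: [.resetTracking, .removeExistingAnchors])
}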
Following along with the video here: https://developer.apple.com/videos/play/wwdc2022/10127/?time=410
At the 6:50 mark we set up the previewVisualizer, but we're not actually shown the implementation of this type. I think it would be helpful, as I am having a hard time showing the white visualization lines that appear when scanning.
Hello Community,
I'm encountering an issue with the latest iOS 17 update, specifically related to RoomPlan version-2. In iOS 16, when using RoomPlan version-1, we were able to display stairs in our app. However, after upgrading to iOS 17 and implementing RoomPlan version-2, the stairs are no longer visible.
Despite thorough investigation, I couldn't find any option within the code to show or hide stairs, or any other objects for that matter. It seems like a specific issue with the update rather than a coding error on our part.
Has anyone else encountered a similar problem? If so, I would greatly appreciate any insights or solutions you might have. It's crucial for our app functionality to have stairs displayed accurately, and we're currently at a loss on how to address this issue.
Thank you in advance for any assistance you can provide.
Best regards
I am making an app with RoomPlan using the official sample code.
A model of the room is generated as it is scanned, and when the scan is complete, chairs and other objects are aligned parallel to the desk. I wanted to stop this behavior, so I decided to use the option beautifyObjects.
import UIKit
import RoomPlan

class RoomCaptureViewController: UIViewController, RoomCaptureViewDelegate, RoomCaptureSessionDelegate {
    @IBOutlet var exportButton: UIButton?
    @IBOutlet var doneButton: UIBarButtonItem?
    @IBOutlet var cancelButton: UIBarButtonItem?
    @IBOutlet var activityIndicator: UIActivityIndicatorView?

    private var isScanning: Bool = false
    private var roomCaptureView: RoomCaptureView!
    private var roomCaptureSessionConfig: RoomCaptureSession.Configuration = RoomCaptureSession.Configuration()
    private var roomBuilder: RoomBuilder!
    private var processedResult: CapturedRoom?
    private var finalResults: CapturedRoom?

    override func viewDidLoad() {
        super.viewDidLoad()

        // Set up after loading the view.
        setupRoomBuilder()
        setupRoomCaptureView()
        activityIndicator?.stopAnimating()
    }

    private func setupRoomBuilder() {
        let beautifyObjectsEnabled = UserDefaults.standard.bool(forKey: "beautifyObjectsEnabled")
        if beautifyObjectsEnabled {
            roomBuilder = RoomBuilder(options: [.beautifyObjects])
        } else {
            roomBuilder = RoomBuilder(options: [])
        }
    }

    private func setupRoomCaptureView() {
        roomCaptureView = RoomCaptureView(frame: view.bounds)
        roomCaptureView.captureSession.delegate = self
        roomCaptureView.delegate = self
        view.insertSubview(roomCaptureView, at: 0)
    }

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        startSession()
    }

    override func viewWillDisappear(_ flag: Bool) {
        super.viewWillDisappear(flag)
        stopSession()
    }

    private func startSession() {
        isScanning = true
        roomCaptureView?.captureSession.run(configuration: roomCaptureSessionConfig)
        setActiveNavBar()
    }

    private func stopSession() {
        isScanning = false
        roomCaptureView?.captureSession.stop()
        setCompleteNavBar()
    }

    // Decide to post-process and show the final results.
    func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?) -> Bool {
        Task {
            do {
                let capturedRoom = try await roomBuilder.capturedRoom(from: roomDataForProcessing)
                DispatchQueue.main.async {
                    self.finalResults = capturedRoom
                    self.exportButton?.isEnabled = true
                    self.activityIndicator?.stopAnimating()
                }
            } catch {
                print("Error processing room data: \(error.localizedDescription)")
            }
        }
        return true
    }

    // Access the final post-processed results.
    func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
        finalResults = processedResult
        self.exportButton?.isEnabled = true
        self.activityIndicator?.stopAnimating()
    }

    @IBAction func doneScanning(_ sender: UIBarButtonItem) {
        if isScanning { stopSession() } else { cancelScanning(sender) }
        self.exportButton?.isEnabled = false
        self.activityIndicator?.startAnimating()
    }

    @IBAction func cancelScanning(_ sender: UIBarButtonItem) {
        navigationController?.dismiss(animated: true)
    }

    // Export the USDZ output by specifying the `.parametric` export option.
    // Alternatively, `.mesh` exports a nonparametric file and `.all`
    // exports both in a single USDZ.
    @IBAction func exportResults(_ sender: UIButton) {
        let destinationFolderURL = FileManager.default.temporaryDirectory.appending(path: "Export")
        let destinationURL = destinationFolderURL.appending(path: "Room.usdz")
        let capturedRoomURL = destinationFolderURL.appending(path: "Room.json")
        do {
            try FileManager.default.createDirectory(at: destinationFolderURL, withIntermediateDirectories: true)
            let jsonEncoder = JSONEncoder()
            let jsonData = try jsonEncoder.encode(finalResults)
            try jsonData.write(to: capturedRoomURL)
            try finalResults?.export(to: destinationURL, exportOptions: .parametric)

            let activityVC = UIActivityViewController(activityItems: [destinationFolderURL], applicationActivities: nil)
            activityVC.modalPresentationStyle = .popover
            present(activityVC, animated: true, completion: nil)
            if let popOver = activityVC.popoverPresentationController {
                popOver.sourceView = self.exportButton
            }
        } catch {
            print("Error = \(error)")
        }
    }

    private func setActiveNavBar() {
        UIView.animate(withDuration: 1.0, animations: {
            self.cancelButton?.tintColor = .white
            self.doneButton?.tintColor = .white
            self.exportButton?.alpha = 0.0
        }, completion: { complete in
            self.exportButton?.isHidden = true
        })
    }

    private func setCompleteNavBar() {
        self.exportButton?.isHidden = false
        UIView.animate(withDuration: 1.0) {
            self.cancelButton?.tintColor = .systemBlue
            self.doneButton?.tintColor = .systemBlue
            self.exportButton?.alpha = 1.0
        }
    }
}
The main change is in func captureView(shouldPresent roomDataForProcessing: CapturedRoomData, error: Error?). I have confirmed in the debugger that the RoomBuilder options change according to the buttons in the UI.
If anyone knows more about the behavior of this option, please give me some advice.
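One thing to check (a guess, not a confirmed cause): because captureView(shouldPresent:error:) returns true, RoomCaptureView also runs its own default post-processing and then calls captureView(didPresent:error:), whose assignment overwrites finalResults with that default result, discarding the room built with the chosen RoomBuilder options. A minimal sketch of leaving the custom-built room in place:

// Keep the RoomBuilder result; don't replace it with the view's default output.
func captureView(didPresent processedResult: CapturedRoom, error: Error?) {
    // finalResults was already set from roomBuilder.capturedRoom(from:) above.
    self.exportButton?.isEnabled = true
    self.activityIndicator?.stopAnimating()
}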