Hello!
You can access the camera from the ARSession of the RoomCaptureSession.
func captureSession(_ session: RoomCaptureSession, didUpdate room: CapturedRoom) {
    guard let currentFrame = session.arSession.currentFrame?.camera else { return }
    // use camera here
}
This example uses captureSession(_:didUpdate:), but you can access the camera from any RoomCaptureSessionDelegate method, or more generally from any RoomCaptureSession object.
I made a typo here; it should have been camera, not currentFrame.
I am also looking into the same use case at the moment.
I tried to map the images to the UUID of the detected structure (wall, door, opening, object, etc.), but when merging into a multi-room structure the room builder creates new UUIDs for the common walls and openings, and I haven't found a way to map the previous UUIDs to the new ones.
I am now trying the approach of snapping an image and mapping it to the camera position, so that I can look the images up by position in the final result; a rough sketch of that idea is below.
Have you found any solution to the multi-room structure issue yet?
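Not a full answer, but here is a minimal sketch of that "snap an image and map it to the camera position" idea, assuming you feed it frames from a RoomCaptureSessionDelegate callback (e.g. session.arSession.currentFrame); PoseTaggedImage, ScanImageRecorder, and the radius lookup are hypothetical names, not RoomPlan API:

import ARKit
import CoreImage
import RoomPlan

// Hypothetical helper: stores a JPEG snapshot keyed by the camera pose it was taken from.
struct PoseTaggedImage {
    let transform: simd_float4x4   // camera transform in world space at capture time
    let imageData: Data            // JPEG-encoded camera image
}

final class ScanImageRecorder {
    private(set) var images: [PoseTaggedImage] = []
    private let context = CIContext()

    // Call this from a RoomCaptureSessionDelegate callback with the session's current frame.
    func record(from frame: ARFrame) {
        let ciImage = CIImage(cvPixelBuffer: frame.capturedImage)
        guard let jpeg = context.jpegRepresentation(of: ciImage,
                                                    colorSpace: CGColorSpaceCreateDeviceRGB()) else { return }
        images.append(PoseTaggedImage(transform: frame.camera.transform, imageData: jpeg))
    }

    // After the final structure is built, look up images captured near a surface's position.
    func images(near position: SIMD3<Float>, within radius: Float) -> [PoseTaggedImage] {
        images.filter {
            let p = $0.transform.columns.3
            return simd_distance(SIMD3(p.x, p.y, p.z), position) <= radius
        }
    }
}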
If you use the arSession, you can get the point cloud as raw feature points from each AR frame and then store them mapped to the scanned room; a sketch of accumulating them follows the documentation excerpt below.
Have you found any alternative solution to this issue since you posted the question?
func session(_ session: ARSession, didUpdate frame: ARFrame) {
    // rawFeaturePoints is optional; it may be nil early in the session.
    if let pointCloud = frame.rawFeaturePoints {
        print(pointCloud.points)
    }
}
rawFeaturePoints
The current intermediate results of the scene analysis ARKit uses to perform world tracking.
var rawFeaturePoints: ARPointCloud? { get }
These points represent notable features detected in the camera image. Their positions in 3D world coordinate space are extrapolated as part of the image analysis that ARKit performs in order to accurately track the device's position, orientation, and movement. Taken together, these points loosely correlate to the contours of real-world objects in view of the camera.
ARKit does not guarantee that the number and arrangement of raw feature points will remain stable between software releases, or even between subsequent frames in the same session. Regardless, the point cloud can sometimes prove useful when debugging your app's placement of virtual objects into the real-world scene.
If you display AR content with SceneKit using the ARSCNView class, you can display this point cloud with the showFeaturePoints debug option.
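To make that concrete, here is a rough sketch of accumulating raw feature points over a scan; ScanPointStore is a made-up name, and whether you hook it up as the ARSession delegate or poll session.arSession.currentFrame from a RoomCaptureSessionDelegate callback depends on your setup:

import ARKit

// Hypothetical accumulator: collects raw feature points (world coordinates) frame by frame.
final class ScanPointStore: NSObject, ARSessionDelegate {
    private(set) var worldPoints: [simd_float3] = []

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        guard let pointCloud = frame.rawFeaturePoints else { return }
        // ARPointCloud.points is already in world space; append for later mapping to the room.
        worldPoints.append(contentsOf: pointCloud.points)
    }
}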
Has anyone found a solution to this use case?
You need to either stop the room capture while keeping the ARSession alive:
func stopSession(pauseARSession: Bool) {
    uiViewController?.roomCaptureSession.stop(pauseARSession: pauseARSession)
}
and then start the room capture again;
or use relocalization (explained in the WWDC session linked below):
https://developer.apple.com/videos/play/wwdc2023/10192/?time=450
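As a minimal sketch of that multi-room flow, assuming iOS 17's stop(pauseARSession:) and RoomCaptureSession(arSession:); MultiRoomScanner and its method names are my own, not RoomPlan API:

import ARKit
import RoomPlan

final class MultiRoomScanner {
    private let configuration = RoomCaptureSession.Configuration()
    private var captureSession = RoomCaptureSession()

    // Finish the current room but keep world tracking (and its map) alive for relocalization.
    func finishCurrentRoom() {
        captureSession.stop(pauseARSession: false)
    }

    // Start scanning the next room on the same, still-running ARSession.
    func startNextRoom() {
        captureSession = RoomCaptureSession(arSession: captureSession.arSession)
        captureSession.run(configuration: configuration)
    }
}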
Hello, has anyone found a solution or any hint for the stairs issue?
If so, can you please write it here?