I created an app where the user:

- first scans a room using `RoomCaptureView` (RoomPlan)
- then taps on physical elements (objects, walls, ...) in an `ARView` to record some 3D positions
I can handle taps in an `ARView` using a `UITapGestureRecognizer` and the `ARView` method `raycast(from:allowing:alignment:)`. This works fine, so I thought I could do the same with the `ARView` used by `RoomCaptureView`, so that the user can scan a room and record 3D positions at the same time. Sadly, this approach does not work: there, the raycast never returns a result.
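
For context, here is a minimal sketch of the standalone setup that works for me; the view controller and the `handleTap` / `recordedPositions` names are just illustrative:

```swift
import UIKit
import ARKit
import RealityKit

final class TapRecordingViewController: UIViewController {
    // A plain ARView that I own directly; raycasting works here as expected.
    private let arView = ARView(frame: .zero)
    // World-space positions of the elements the user tapped.
    private var recordedPositions: [SIMD3<Float>] = []

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        arView.autoresizingMask = [.flexibleWidth, .flexibleHeight]
        view.addSubview(arView)

        // Run a standard world-tracking session with plane detection.
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical]
        arView.session.run(config)

        let tap = UITapGestureRecognizer(target: self,
                                         action: #selector(handleTap(_:)))
        arView.addGestureRecognizer(tap)
    }

    @objc private func handleTap(_ gesture: UITapGestureRecognizer) {
        let point = gesture.location(in: arView)
        // Raycast from the tapped screen point against estimated planes.
        if let result = arView.raycast(from: point,
                                       allowing: .estimatedPlane,
                                       alignment: .any).first {
            // The translation column of the world transform is the 3D hit position.
            let position = SIMD3<Float>(result.worldTransform.columns.3.x,
                                        result.worldTransform.columns.3.y,
                                        result.worldTransform.columns.3.z)
            recordedPositions.append(position)
        }
    }
}
```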
What I actually need is a way to map a tap on the screen to a real-world position during a `RoomCaptureSession`.
Does anyone know how to do this?