Include RealityKit in iOS 12 project for iOS 13 users
In my app, we want to add an optional RealityKit experience for customers who have upgraded their OS, without forcing everyone else to upgrade. I was going through my code adding @available attributes to all of the classes that access iOS 13-specific features, but I discovered in the process that many of the errors were in the generated RealityKit class files, which I cannot modify. Is there any way to produce a build that includes RealityKit when the deployment target is below iOS 13?
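For reference, here is the gating pattern I'm applying to my own classes; it's a minimal sketch with placeholder names, and it doesn't help with the generated files. My understanding is that the framework also has to be marked Optional under Link Binary With Libraries so the binary can still launch on iOS 12, but I'd like confirmation.

```swift
import UIKit
import RealityKit

// Availability-gated so an iOS 12 device never touches RealityKit symbols.
@available(iOS 13.0, *)
final class RealityExperienceViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)
    }
}

// Call site: the runtime check keeps iOS 12 users on the existing path.
func presentARExperience(from presenter: UIViewController) {
    if #available(iOS 13.0, *) {
        presenter.present(RealityExperienceViewController(), animated: true)
    } else {
        // Fall back to the non-AR experience on iOS 12.
    }
}
```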
1 reply · 0 boosts · 881 views · Jul ’20
Does estimatedDepthData take advantage of LiDAR?
I am using personSegmentationWithDepth in my app, and the actual occlusion in my RealityKit implementation works very well. But when I retrieve values from estimatedDepthData for a particular point, it struggles beyond a depth of about 1.5 to 1.75 meters: past that distance it returns a depth of 0 meters most of the time, although when it does return a value, the value appears to be accurate. I am working with a LiDAR-equipped iPad on a tripod running iOS 14 beta 2. Could the stillness of the tripod explain the limited data I'm getting back? And since estimatedDepthData predates iOS 13.5, I'm also wondering whether it takes advantage of the newer hardware at all.
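For context, this is how I'm sampling the buffer. It's a minimal sketch, assuming estimatedDepthData is a one-component Float32 buffer holding depth in meters, and that `point` is already in the buffer's pixel coordinate space:

```swift
import ARKit

func estimatedDepth(at point: CGPoint, in frame: ARFrame) -> Float? {
    guard let buffer = frame.estimatedDepthData else { return nil }

    CVPixelBufferLockBaseAddress(buffer, .readOnly)
    defer { CVPixelBufferUnlockBaseAddress(buffer, .readOnly) }

    let x = Int(point.x), y = Int(point.y)
    guard x >= 0, x < CVPixelBufferGetWidth(buffer),
          y >= 0, y < CVPixelBufferGetHeight(buffer),
          let base = CVPixelBufferGetBaseAddress(buffer) else { return nil }

    // One Float32 per pixel; step down `y` rows, then index the column.
    let bytesPerRow = CVPixelBufferGetBytesPerRow(buffer)
    let row = base.advanced(by: y * bytesPerRow)
        .assumingMemoryBound(to: Float32.self)
    let depth = row[x]

    // ARKit reports 0 where it has no estimate, so treat that as "no data".
    return depth > 0 ? depth : nil
}
```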
3 replies · 0 boosts · 1.3k views · Jul ’20
Matching Virtual Object Depth with ARFrame Estimated Depth Data
I am trying to do a hit test of sorts between a person in my ARFrame and a RealityKit Entity. So far I have been able to take my entity's position and project it to a CGPoint, which I can match against the ARFrame's segmentationBuffer to determine whether a person intersects that entity. Now I want to find out whether the person is at the same depth as the entity. How do I relate the entity's SIMD3<Float> position, which I believe is in meters, to the estimatedDepthData values?
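Here is my current thinking as a minimal sketch; it assumes the depth buffer stores Z-depth along the camera's viewing axis rather than radial distance, which is part of what I'm unsure about:

```swift
import ARKit
import RealityKit

func cameraDepth(of entity: Entity, in frame: ARFrame) -> Float {
    // Entity position in world space, in meters.
    let p = entity.position(relativeTo: nil)
    // Transform world -> camera space; the camera looks down -Z.
    let cameraFromWorld = simd_inverse(frame.camera.transform)
    let pCamera = cameraFromWorld * SIMD4<Float>(p.x, p.y, p.z, 1)
    return -pCamera.z
}
```

The idea would be to compare this value against the estimatedDepthData sample at the entity's projected CGPoint. (If the buffer actually stores radial distance, I'd compare against simd_length of the camera-space position instead.)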
9 replies · 0 boosts · 2.5k views · Jun ’20
Best way to include dynamic elements in RealityKit
I am trying to work around RealityKit's limited support for code-generated elements, and I'm looking for recommendations for creating dynamic elements that can change in response to other code. Here are some ideas I am considering (see the sketch below):

- Create pre-built models with baked animations, then skip to particular frames in code depending on the data.
- Coordinate with a UIView or SKView subview, projecting points from the 3D space to 2D to align the elements.

Are there other approaches to consider, especially ones that would let me include 3D elements?
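For the second idea, this is roughly the alignment step I have in mind. It's a minimal sketch; `arView`, `label`, and `entity` are assumed to be set up elsewhere, and I'd drive this from a SceneEvents.Update subscription so it runs every frame:

```swift
import RealityKit
import UIKit

func alignOverlay(label: UILabel, to entity: Entity, in arView: ARView) {
    let worldPosition = entity.position(relativeTo: nil)
    // project(_:) returns nil when the point isn't visible on screen.
    if let screenPoint = arView.project(worldPosition) {
        label.center = screenPoint
        label.isHidden = false
    } else {
        label.isHidden = true
    }
}
```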
4 replies · 0 boosts · 1.5k views · Jun ’20
Switching configuration for occlusion and motion capture
In the absence of the ability to use ARKit motion capture and people occlusion with depth simultaneously, I am brainstorming ways to still make use of virtual objects in my app in more limited ways. One idea I am considering: run the occlusion configuration until I detect that the person is in a particular position, then switch to the motion-capture configuration. Is that switch going to cause problems such as losing the world anchors, or other disruptions for the user? How seamless will the mode switch appear?
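For concreteness, this is the switch I'm imagining, as a minimal sketch. My understanding is that passing an empty options set (rather than .resetTracking or .removeExistingAnchors) asks ARKit to keep existing anchors across the switch, but that persistence is exactly the part I'm unsure of:

```swift
import ARKit

func switchToMotionCapture(on session: ARSession) {
    guard ARBodyTrackingConfiguration.isSupported else { return }
    let config = ARBodyTrackingConfiguration()
    // Empty options: don't reset tracking or remove existing anchors.
    session.run(config, options: [])
}
```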
1 reply · 0 boosts · 460 views · Jun ’20
Crash with Debugger when using LiDAR
My app frequently crashes when I run an AR session on my LiDAR iPad Pro. The error is: [ADMillimeterRadiusPairsLensDistortionModel applyDistortionModelToPixels:inPixels:intrinsicsMatrix:pixelSize:distort:outPixels:]. The crash only happens while debugging, so I suspect it is a resource issue. To try to fix it, I sometimes restart Xcode, sometimes force quit the app, and sometimes unplug and replug the device. Nothing I've found works consistently. Is there anything I can do to avoid it?
3 replies · 0 boosts · 893 views · Jun ’20