
Anchoring Objects to Surfaces in the Shared Space
Hi there! I'm working on an app designed to be a 3D “widget” that sits on someone's desk or shelf and stays there even when the user isn't actively interacting with it, as if it were a real object on the shelf. Is there any way to place objects on surfaces for apps running in the Shared Space? From what I understand, the position of a volume is determined by the user and cannot be changed programmatically. I understand that Full Space apps have access to anchors and plane-detection data, but I don't want people to have to close everything else to use my app.

A few questions:

- In the simulator, when I drag the volume around to try to place it on a surface, geometry inside a RealityView can clip through “real” objects. Is this the expected behavior on a real device too?
- If so, could using ARKit in a Full Space to position the volume, then switching back to the Shared Space, be an option?
- If the app is closed and reopened, will the volume maintain its position relative to the user's real-world environment?

Thanks!
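For context, a minimal sketch of the Full Space approach mentioned above: running an ARKitSession with a PlaneDetectionProvider while an immersive space is open, so the app can read real surface poses. The SurfaceFinder type and the logging are illustrative assumptions, not part of any shipping app; plane detection is unavailable in the simulator, which is why the isSupported check matters.

```swift
import ARKit

/// Sketch: detect horizontal planes while a Full Space is open,
/// so the app can read the poses of real surfaces (data that is
/// not available to apps running in the Shared Space).
final class SurfaceFinder {
    private let session = ARKitSession()
    private let planes = PlaneDetectionProvider(alignments: [.horizontal])

    func start() async {
        // Returns false in the simulator; plane detection needs a real device.
        guard PlaneDetectionProvider.isSupported else { return }
        do {
            // Prompts for world-sensing authorization on first run.
            try await session.run([planes])
            for await update in planes.anchorUpdates {
                guard update.event == .added || update.event == .updated else { continue }
                let anchor = update.anchor
                // World-space pose of the detected surface; tables and shelves
                // surface through anchor.classification.
                let transform = anchor.originFromAnchorTransform
                print("Plane \(anchor.id): \(anchor.classification) at \(transform.columns.3)")
            }
        } catch {
            print("Plane detection failed: \(error)")
        }
    }
}
```

Note that this only yields surface poses while the Full Space is open; whether a volume's position can be handed back to the Shared Space afterward is exactly the open question above.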
Replies: 1 · Boosts: 1 · Views: 796 · Sep ’23
How does Encounter Dinosaurs blend between portal lighting and real-world lighting?
When the dinosaur protrudes from the portal in the Encounter Dinosaurs app, it appears to be lit by the real room lighting, just as any other RealityKit content is by default. When the dinosaur is inside the portal, it appears to be lit by the virtual environment, and the two light sources seem to be blended smoothly at the plane of the portal. How is this done? ImageBasedLightReceiverComponent allows the IBL to be changed on a per-entity basis, but the actual lighting calculation shader code seems to be a black box, and I have not seen a way to specify which IBL texture is used on a per-fragment basis.
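For reference, the per-entity IBL assignment that is public API looks like this; the "PortalLighting" resource name is a hypothetical EnvironmentResource in the app bundle, and nothing here claims to be how Encounter Dinosaurs actually does the blend.

```swift
import RealityKit

/// Sketch: give one entity its own image-based light so it is lit by a
/// virtual environment instead of the real room. "PortalLighting" is a
/// hypothetical EnvironmentResource bundled with the app.
func applyPortalLighting(to entity: Entity) async {
    guard let environment = try? await EnvironmentResource(named: "PortalLighting") else { return }

    // The entity that carries the image-based light source...
    entity.components.set(
        ImageBasedLightComponent(source: .single(environment), intensityExponent: 1.0)
    )
    // ...and the entity that should receive it, pointing back at the light carrier.
    entity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: entity))
}
```

ImageBasedLightComponent.Source also has a blend case that cross-fades between two environment resources, but that blend still applies to the whole entity, not per fragment, so it doesn't by itself explain the smooth transition at the portal plane.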
Replies: 0 · Boosts: 0 · Views: 690 · May ’24