Reply to VisionOS IBL (ImageBasedLighting) BW only or Coloured? File formats? Shadows?
To answer my own post with my current understanding of things:

1. Does IBL need to be B/W, or will colour work? It can be in colour. As an example, you can use Reality Converter to test multiple different IBLs, some b/w and some coloured.

2. What is the best file format for IBL? Any pros/cons, or should we just test each format and check visually? From my tests, PNG, OpenEXR (.exr), and Radiance HDR (.hdr) all work. Which format is recommended? It "depends": just test and see until you have the ambience you are looking for.

3. Will IBL on visionOS create shadows for us? In Blender, an HDRI gives shadows. No. There is currently no clear information on the best way to approach adding shadows to a scene in RealityKit.

4. Looking at a scene in Blender that uses an HDRI as global lighting, how can we best "prep" the IBL image to get lighting closest to Blender's Cycles rendering engine? Again, just test and see until you have the ambience you are looking for.
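To illustrate how an IBL image in any of the formats above gets wired up, here is a minimal sketch of the RealityKit setup on visionOS. The resource name "Lighting" and the test sphere are placeholders of mine, not anything official:

```swift
import SwiftUI
import RealityKit

// Minimal sketch: "Lighting" is an assumed name for a .png/.exr/.hdr
// environment image added to the app bundle.
struct IBLSketchView: View {
    var body: some View {
        RealityView { content in
            let sphere = ModelEntity(
                mesh: .generateSphere(radius: 0.1),
                materials: [SimpleMaterial(color: .white, isMetallic: true)]
            )
            if let environment = try? await EnvironmentResource(named: "Lighting") {
                // The entity that provides the image-based light…
                sphere.components.set(
                    ImageBasedLightComponent(source: .single(environment))
                )
                // …and the entities that should receive it.
                sphere.components.set(
                    ImageBasedLightReceiverComponent(imageBasedLight: sphere)
                )
            }
            content.add(sphere)
        }
    }
}
```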
Feb ’24
Reply to VisionOS IBL (ImageBasedLighting) BW only or Coloured? File formats? Shadows?
@Wittmason with "shadow baking" I meant "light baking". What I really mean is that in the Blender render (the screenshots at the bottom) you can see beautiful shadows, but they are not generated when running on the Vision Pro. An IBL will "tint" our assets' colours, but it will not add shadows. I think the only solution to have shadows is to replace the textures with textures that have those shadows baked in. I believe this is called "light baking" in Unity/Unreal, but Blender doesn't have a good workflow for it.

My current understanding of asset textures:

1. Sometimes a texture is a smaller JPG used as a repeated pattern on an asset.
2. Other times a texture is just colour/material data, not using any JPG.

In consideration of the above, what I understand the limitations of light baking to be:

1. When baking light onto a JPG texture (so a shadow becomes part of that texture), you basically need one big texture JPG per asset and cannot use any "repeated pattern" any more.
2. When baking light onto a texture that doesn't use a JPG but colour/material data instead, a new JPG will need to be generated for it.

Points 1 and 2 make the usdz file size grow tremendously because of the new, large texture JPGs. I believe Unity/Unreal therefore don't "bake light onto a texture" but instead create a "lightmap": another layer that can be added on top of an asset, with data on which parts need to be lightened or darkened. I am guessing this is a more economical method, but I have no idea whether or how these lightmaps are supported by RealityKit / Vision Pro.
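For the "replace the textures with baked ones" route, here is a hedged sketch of applying a baked-lighting texture in RealityKit. "BakedTable" is an assumed texture name exported from Blender's Bake panel; using an unlit material is my own choice, so that RealityKit's lighting doesn't fight the shadows already baked into the image:

```swift
import RealityKit

// Sketch: apply a texture with baked-in shadows as an unlit material.
// "BakedTable" is a hypothetical texture in the app bundle.
func applyBakedTexture(to model: ModelEntity) throws {
    let texture = try TextureResource.load(named: "BakedTable")
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    model.model?.materials = [material]
}
```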
Feb ’24
Reply to Add light to a RealityView on visionOS
@gchiste Do you have other examples of IBL that I can reference to try and understand how to better add lighting to my scenes for the full immersion mode? I followed the video and checked the example code. I also found this example code on Stack Overflow: https://stackoverflow.com/questions/76755793/spotlightcomponent-is-unavailable-in-visionos/76761509#76761509 I have a scene with a bunch of lights in Blender, but none of them auto translate to IBL and I'm not sure how/where to start learning about how to convert my Blender lights to IBL for visionOS. I can't find any other examples to better understand how to create an IBL for a lantern here or the sun there as I can see it in my Blender scene. Any official direction from Apple would be greatly appreciated!
Nov ’23
Reply to How to impose an Impulse
@kevyk yes, but the issue was that applyLinearImpulse doesn't exist on the parent entity. I was able to get around it by casting:

if let parentEntityWithPhysics = parentEntity as? HasPhysicsBody {
    parentEntityWithPhysics.applyLinearImpulse(...) // now we can use applyLinearImpulse without getting a compile-time error
}
Sep ’23
Reply to How to impose an Impulse
@gchiste What if we have a transform with 3 ModelEntities inside it in Reality Composer Pro, something like this: and since the three model entities represent a single object, can I execute applyImpulse on the "transform" layer named Body, or do I have to do it on each individual model entity (the three nested ones)? Because I believe my transform layer called "Body" is technically an Entity, not a ModelEntity, right?
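To make the parent group respond to a single impulse, my understanding is that the parent needs its own physics and collision components. A sketch, assuming a scene where "Body" is a plain Entity grouping the three ModelEntity children (the cast follows the workaround from my earlier reply and may depend on how the scene was authored):

```swift
import RealityKit

// Sketch: give the parent "Body" entity physics covering its
// children's combined bounds, then apply one impulse to the group.
func launch(body: Entity) {
    let bounds = body.visualBounds(relativeTo: nil)
    body.components.set(
        CollisionComponent(shapes: [.generateBox(size: bounds.extents)])
    )
    body.components.set(
        PhysicsBodyComponent(massProperties: .default, material: nil, mode: .dynamic)
    )
    if let physicsBody = body as? HasPhysicsBody {
        physicsBody.applyLinearImpulse([0, 2, 0], relativeTo: nil)
    }
}
```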
Sep ’23
Reply to Anchoring Objects to Surfaces in the Shared Space
I stumbled upon the same limitations. I decided I'll just make two versions of my app and leave the choice up to the user:

1. Have them choose a volume: they'll need to put it somewhere themselves, but they can multi-task.
2. Have them choose a full space: my app will auto-anchor to their table, but they cannot multi-task anymore.

I don't think there's much more we can do with these current limitations. Maybe next year things will be improved and limitations might get lifted.

"In the simulator, it seems that when I drag the volume around to try to place it on a surface, geometry inside of a RealityView can clip through 'real' objects. Is this the expected behavior on a real device too?" As far as my understanding goes, yes, this is the expected behaviour.

"If so, could using ARKit in a Full Space to position the volume, then switching back to a Shared Space, be an option?" As far as my understanding goes, no, because I don't think we can anchor volumetric windows; that's the main limitation. You'd have to work with plain entities that you anchor, but then switching back to a volume will make those entities disappear.

"Also, if the app is closed and reopened, will the volume maintain its position relative to the user's real-world environment?" I think the volume will stay where the user put it, yes, but this is at Apple's discretion in how they choose to implement the behaviour of volumes in the shared space. It might change at any time, and developers won't have a say in it. Think of it this way: we don't control where on a MacBook screen a window is located, or what size it is. Same for a volumetric window.
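For the full-space route, here is a sketch of anchoring entities (not the volume itself) to a table. The box model is a placeholder of mine; the point is that only entities can be anchored this way, not volumetric windows:

```swift
import SwiftUI
import RealityKit

// Sketch: inside an ImmersiveSpace, an AnchorEntity can attach
// content to a horizontal table surface automatically.
struct TableAnchoredView: View {
    var body: some View {
        RealityView { content in
            let tableAnchor = AnchorEntity(
                .plane(.horizontal,
                       classification: .table,
                       minimumBounds: [0.3, 0.3])
            )
            let model = ModelEntity(
                mesh: .generateBox(size: 0.1),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            tableAnchor.addChild(model)
            content.add(tableAnchor)
        }
    }
}
```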
Sep ’23