VirtualEnvironmentProbeComponent implementation

I am trying to create reflections and lighting in a fully virtual scene, and from the way the documentation is written, VirtualEnvironmentProbeComponent appears to be the key piece. I cannot seem to get it to work, however. Has anyone figured this out?

Hi @mikeaskew4

Could you tell me what part of the process laid out in the VirtualEnvironmentProbeComponent documentation isn't working for you?

Also, here are a couple of things to keep in mind:

  • When loading an EnvironmentResource image, place the image in a folder whose name ends in .skybox and load it using that folder's name (e.g. if your image is in a folder named "MyEnvironment.skybox", load it with try await EnvironmentResource(named: "MyEnvironment")); see the first sketch after this list.
  • Make sure you're in an immersive space with its ImmersionStyle set to full in order to see the VirtualEnvironmentProbeComponent take effect; the second sketch below shows that setup.
  • Consider pairing an ImageBasedLightComponent with an ImageBasedLightReceiverComponent to customize the lighting in your virtual scene as well; see the last sketch below.
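
For reference, here's a minimal sketch of that loading-and-attaching flow. The folder name "MyEnvironment.skybox" is just a placeholder, and this assumes the single-probe form of the component's source:

```swift
import RealityKit

// Assumes an image inside a bundled folder named "MyEnvironment.skybox";
// the resource is loaded by the folder's name without the extension.
func applyProbe(to entity: Entity) async throws {
    let environment = try await EnvironmentResource(named: "MyEnvironment")

    // Drive reflections on this entity from the loaded environment.
    var probe = VirtualEnvironmentProbeComponent()
    probe.source = .single(.init(environment: environment))
    entity.components.set(probe)
}
```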
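
For the immersive-space requirement, a minimal visionOS sketch looks like this (the app and space names are hypothetical):

```swift
import SwiftUI
import RealityKit

@main
struct ProbeDemoApp: App {
    // The probe only takes visual effect under full immersion.
    @State private var immersionStyle: ImmersionStyle = .full

    var body: some Scene {
        ImmersiveSpace(id: "ProbeSpace") {
            RealityView { content in
                // Build the fully virtual scene here and attach the
                // VirtualEnvironmentProbeComponent to one of its entities.
            }
        }
        .immersionStyle(selection: $immersionStyle, in: .full)
    }
}
```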
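
And for the lighting side, pairing the two image-based-light components looks roughly like this (same placeholder resource name; intensityExponent: 1 is just an example value):

```swift
import RealityKit

// Light a model with the same environment resource used for reflections.
func applyImageBasedLight(to model: ModelEntity) async throws {
    let environment = try await EnvironmentResource(named: "MyEnvironment")

    // An entity that acts as the image-based light source.
    let lightSource = Entity()
    lightSource.components.set(
        ImageBasedLightComponent(source: .single(environment), intensityExponent: 1)
    )
    model.addChild(lightSource) // keep the light source in the scene

    // Have the model receive light from that source.
    model.components.set(ImageBasedLightReceiverComponent(imageBasedLight: lightSource))
}
```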

Let me know if you have further questions!

Thanks for the reply—I am actually trying to implement this first in iOS 18 (will get to visionOS soon though).

Ever since first seeing this component, I have been hoping that in a .nonAR ARView scene it could be used to generate an environment lighting resource from virtual objects, much like an AREnvironmentProbeAnchor captures a texture of the real-world surroundings. But I could be wrong…

Regardless, when I attach this component to an entity, it doesn't seem to provide any new reflections, textures, or lighting on that entity.
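
For reference, here's a simplified version of what I'm doing (the resource name is a placeholder):

```swift
import RealityKit
import UIKit

// Fully virtual scene in a .nonAR ARView on iOS 18.
let arView = ARView(frame: .zero, cameraMode: .nonAR, automaticallyConfigureSession: false)

let sphere = ModelEntity(
    mesh: .generateSphere(radius: 0.2),
    materials: [SimpleMaterial(color: .white, isMetallic: true)]
)
let anchor = AnchorEntity(world: .zero)
anchor.addChild(sphere)
arView.scene.addAnchor(anchor)

Task {
    let environment = try await EnvironmentResource(named: "MyEnvironment")
    var probe = VirtualEnvironmentProbeComponent()
    probe.source = .single(.init(environment: environment))
    sphere.components.set(probe) // no visible change on the sphere
}
```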

If there is a way to get some kind of virtual equirectangular image, via this component or not, I would love to know!

Hoping for some help on this one…
