I just need to prioritize rendering the collision spheres over the hands.
I remember an official tutorial showing that, for collision testing, the finger joints were given small spheres with physics properties, and that during the debugging phase a certain setting could make the spheres render in front of the hand so they aren't occluded by it.
I can't find that official tutorial anymore. Does this feature actually exist?
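I couldn't find that exact setting either. One approach (a sketch, not necessarily what the tutorial showed) is to hide the system's rendered hands entirely during debugging with SwiftUI's `upperLimbVisibility(_:)` modifier, so spheres anchored to hand joints are never occluded. The anchor location, sphere radius, and scene id below are illustrative assumptions:

```swift
import SwiftUI
import RealityKit

// Hypothetical app structure; the key piece is .upperLimbVisibility(.hidden),
// which hides the passthrough hands so debug spheres attached to hand
// anchors stay visible instead of being blocked by the hand.
@main
struct HandDebugApp: App {
    var body: some Scene {
        ImmersiveSpace(id: "debug") {
            RealityView { content in
                // Illustrative debug sphere anchored to the left index fingertip
                let sphere = ModelEntity(
                    mesh: .generateSphere(radius: 0.008),
                    materials: [SimpleMaterial(color: .red, isMetallic: false)]
                )
                sphere.components.set(
                    CollisionComponent(shapes: [.generateSphere(radius: 0.008)])
                )
                let anchor = AnchorEntity(.hand(.left, location: .indexFingerTip))
                anchor.addChild(sphere)
                content.add(anchor)
            }
        }
        // Hide the rendered hands while debugging collisions
        .upperLimbVisibility(.hidden)
    }
}
```

This hides the hands entirely rather than sorting the spheres in front of them, so it is only suitable for a debugging build.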
I have submitted a bug report: FB15118055
I know that bundling multiple .skybox folders in the project ahead of time lets me change the ImageBasedLight dynamically at runtime.
But my entire scene's usdz is downloaded from a URL, so my HDR image is also downloaded from a URL. How can I turn an HDR image downloaded from a URL into an EnvironmentResource and then create an ImageBasedLightComponent from it?
Thank you very much.
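A sketch of one possible pipeline, assuming the `EnvironmentResource(equirectangular:)` initializer introduced in visionOS 2 is available; the URL, function name, and entity wiring are placeholders, not a confirmed official recipe:

```swift
import Foundation
import RealityKit
import CoreGraphics
import ImageIO

// Sketch: download an HDR from a URL and use it as image-based lighting.
// Assumes visionOS 2's EnvironmentResource(equirectangular:) initializer.
func applyImageBasedLight(from url: URL, to entity: Entity) async throws {
    // 1. Download the raw HDR bytes
    let (data, _) = try await URLSession.shared.data(from: url)

    // 2. Decode to a CGImage (ImageIO can read Radiance .hdr and OpenEXR)
    guard let source = CGImageSourceCreateWithData(data as CFData, nil),
          let cgImage = CGImageSourceCreateImageAtIndex(source, 0, nil) else {
        throw URLError(.cannotDecodeContentData)
    }

    // 3. Build an EnvironmentResource from the equirectangular image
    let environment = try await EnvironmentResource(equirectangular: cgImage)

    // 4. Attach the IBL and make the entity (and its children) receive it
    entity.components.set(ImageBasedLightComponent(source: .single(environment)))
    entity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: entity))
}
```

On visionOS 1 this initializer does not exist, which is presumably why pre-bundled .skybox folders were the usual workaround.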
I will provide the developed app to my customers, who will be using it for the first time, so I want to minimize the learning steps as much as possible. That is why I chose not to rely on eye tracking.
I want to move the volumetric window directly with a very simple custom gesture, without the volume even needing to be selected first.
So, is there a way to programmatically move a WindowGroup with the volumetric window style in space?
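To my knowledge the volumetric window itself cannot be repositioned programmatically, so one common workaround (sketched below, with illustrative names and sizes) is to drag the root entity *inside* the volume with a custom gesture instead:

```swift
import SwiftUI
import RealityKit

// Sketch: move content inside a volumetric window with a drag gesture,
// since the window's own placement is controlled by the system.
struct VolumeContentView: View {
    var body: some View {
        RealityView { content in
            let model = ModelEntity(
                mesh: .generateBox(size: 0.2),
                materials: [SimpleMaterial(color: .blue, isMetallic: false)]
            )
            // Gestures only hit entities with input-target + collision components
            model.components.set(InputTargetComponent())
            model.components.set(
                CollisionComponent(shapes: [.generateBox(size: [0.2, 0.2, 0.2])])
            )
            content.add(model)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Convert the 3D gesture location into the entity's parent space
                    value.entity.position = value.convert(
                        value.location3D, from: .local, to: value.entity.parent!)
                }
        )
    }
}
```

Because the gesture is targeted to any entity, no prior selection step is needed; the user just pinches and drags the model.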
usdzip -x myModel.usdz -o myModelFolder
I found that the above command fails: there is no -x parameter.
Why? Here is the help output:
~ % usdzip -help
usage: usdzip [-h] [-r] [-a ASSET] [--arkitAsset ARKITASSET] [-c]
[-l [LISTTARGET]] [-d [DUMPTARGET]] [-v]
[usdzFile] [inputFiles ...]
Utility for creating a .usdz file containing USD assets and for inspecting
existing .usdz files.
I found that Meters Per Unit has been set to 1, but at runtime the actual size corresponding to scale = 1 is still not 1 meter; I measured it to be approximately 0.2 meters. Why is this happening? What else do I need to set?