With Scene Understanding in RealityKit, do entities have collision detection for hands?

I was just curious whether body parts such as hands can act as something that collides with entities in the scene when collision detection and physics are enabled.
I had a similar thought, but the “What’s New in RealityKit” video says that real-world objects should be treated as if they are static with infinite mass. So a moving hand would be treated more like it’s “teleporting” around every frame rather than as an object with velocity and momentum that can apply a force.
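For reference, enabling that collision/physics behavior for the reconstructed scene mesh looks roughly like this. This is a minimal sketch, assuming an `ARView` named `arView` on a LiDAR-capable device; the function name is just illustrative:

```swift
import ARKit
import RealityKit

/// Turn on scene reconstruction and let RealityKit use the resulting mesh
/// as a static collider with infinite mass.
func enableSceneUnderstanding(on arView: ARView) {
    let config = ARWorldTrackingConfiguration()
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        config.sceneReconstruction = .mesh
    }
    arView.session.run(config)

    // Dynamic entities can now rest on or bounce off the real-world mesh,
    // but the mesh itself never moves in response.
    arView.environment.sceneUnderstanding.options.insert(.collision)
    arView.environment.sceneUnderstanding.options.insert(.physics)
}
```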
Any people detected in the real world are removed from the scene geometry. If you have use cases for hand geometry, please consider submitting feedback using the Feedback Assistant. Details about your use case are helpful!

Note that you can detect hand pose with the Vision framework in iOS 14, which you may be able to use to apply your own physics to the RealityKit scene.
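A rough sketch of that approach is below. It assumes an `ARView`, a per-frame `ARFrame`, and an invisible kinematic collider entity standing in for the fingertip; the class name, the fixed `.right` orientation, and the raycast-based depth estimate (which hits the scene mesh near the finger rather than the finger itself) are all illustrative choices, not an official recipe:

```swift
import ARKit
import RealityKit
import Vision

final class HandCollisionDriver {
    /// Kinematic sphere that stands in for the fingertip; moving it each
    /// frame lets RealityKit physics push dynamic entities out of the way.
    let fingerCollider: ModelEntity
    private let handPoseRequest = VNDetectHumanHandPoseRequest()

    init(parent anchor: AnchorEntity) {
        fingerCollider = ModelEntity(
            mesh: .generateSphere(radius: 0.02),
            materials: [SimpleMaterial(color: .clear, isMetallic: false)]
        )
        fingerCollider.generateCollisionShapes(recursive: false)
        fingerCollider.physicsBody = PhysicsBodyComponent(
            massProperties: .default,
            material: .default,
            mode: .kinematic   // driven by tracking, not by forces
        )
        anchor.addChild(fingerCollider)
        handPoseRequest.maximumHandCount = 1
    }

    /// Call once per frame, e.g. from ARSessionDelegate.session(_:didUpdate:).
    func update(with frame: ARFrame, in view: ARView) {
        // Assumes portrait orientation; adjust for your device orientation.
        let handler = VNImageRequestHandler(cvPixelBuffer: frame.capturedImage,
                                            orientation: .right)
        try? handler.perform([handPoseRequest])

        guard let observation = handPoseRequest.results?.first as? VNHumanHandPoseObservation,
              let tip = try? observation.recognizedPoint(.indexTip),
              tip.confidence > 0.3 else { return }

        // Vision returns normalized image coordinates with a lower-left origin;
        // convert to a screen point, then raycast to estimate a world position.
        let screenPoint = CGPoint(x: tip.location.x * view.bounds.width,
                                  y: (1 - tip.location.y) * view.bounds.height)
        if let result = view.raycast(from: screenPoint,
                                     allowing: .estimatedPlane,
                                     alignment: .any).first {
            let t = result.worldTransform.columns.3
            fingerCollider.setPosition(SIMD3<Float>(t.x, t.y, t.z), relativeTo: nil)
        }
    }
}
```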