Can anyone provide, or point me to, example code for fading a spotlight in/out over one second?
I did not find anything on this topic in the docs:
https://developer.apple.com/documentation/realitykit/spotlight
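
The best I've managed so far is lerping the intensity myself from a custom System. A rough, untested sketch (the LightFade names are my own, not an Apple API):

import RealityKit

// Rough sketch: fade a spotlight by lerping SpotLightComponent.intensity
// from a custom System. The LightFade names are placeholders, not Apple API.
struct LightFadeComponent: Component {
    var elapsed: Float = 0
    var duration: Float = 1       // fade time in seconds
    var from: Float = 0           // start intensity (lumens)
    var to: Float = 6_000         // end intensity (lumens)
}

struct LightFadeSystem: System {
    static let query = EntityQuery(where: .has(LightFadeComponent.self))

    init(scene: Scene) {}

    func update(context: SceneUpdateContext) {
        for entity in context.entities(matching: Self.query, updatingSystemWhen: .rendering) {
            guard var fade = entity.components[LightFadeComponent.self],
                  var light = entity.components[SpotLightComponent.self] else { continue }
            fade.elapsed += Float(context.deltaTime)
            let t = min(fade.elapsed / fade.duration, 1)
            light.intensity = fade.from + (fade.to - fade.from) * t
            entity.components.set(light)
            if t < 1 {
                entity.components.set(fade)
            } else {
                entity.components.remove(LightFadeComponent.self)
            }
        }
    }
}

I register both types at app launch (LightFadeComponent.registerComponent() and LightFadeSystem.registerSystem()) and set a LightFadeComponent on the light entity to kick off a fade. Is there a built-in way to do this?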
What's the difference between an action and an animation, e.g. FromToByAnimation vs. FromToByAction?
The documentation for them is pretty similar, and I'm not understanding the difference exactly... :S
FromToByAnimation → https://developer.apple.com/documentation/realitykit/fromtobyanimation?changes=__2_2
FromToByAction → https://developer.apple.com/documentation/realitykit/fromtobyaction?changes=__2_2
As developers, when should we reach for an animation vs. an action? 🤔
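
For context, this is the kind of usage I mean on the animation side, a minimal sketch that moves an entity up 30 cm over one second:

import RealityKit

// Minimal sketch: animate an entity's transform with FromToByAnimation.
func playRiseAnimation(on entity: Entity) {
    let from = Transform(translation: [0, 0, 0])
    let to = Transform(translation: [0, 0.3, 0])
    let anim = FromToByAnimation(from: from, to: to, duration: 1.0, bindTarget: .transform)
    if let resource = try? AnimationResource.generate(with: anim) {
        entity.playAnimation(resource)
    }
}

Would the FromToByAction equivalent be preferable here, and if so, why?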
What's the difference between project dependencies and target dependencies?
Project dependencies can be added here: [screenshot of the project editor]
Target dependencies can be added here: [screenshot of the target editor]
In the case of external dependencies (added via URL), do we need to add them to both the project and the target, or is one of those enough?
In the case of local dependencies (added via the workspace), do we need to add them to both the project and the target, or is one of those enough?
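
For what it's worth, the same two-level split exists in a Package.swift, which is how I currently think about it (a sketch; the package URL is just an example):

// swift-tools-version: 5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // "Project level": declares the package and where to fetch it from.
        .package(url: "https://github.com/apple/swift-algorithms", from: "1.2.0"),
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                // "Target level": links one of that package's products into this target.
                .product(name: "Algorithms", package: "swift-algorithms"),
            ]
        ),
    ]
)

Is the Xcode UI doing the same thing, i.e. the project entry declares the package and the target entry links one of its products?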
I have an immersive environment with a skybox that uses a PNG image inside a sphere. I added an IBL, but I am not sure what the best format / prep method is for the IBL image.
I have tried several different images for my IBL, and they all give a very different vibe from what I have in Blender.
My question is: how can I create an IBL that comes closest to Blender's Cycles rendering engine?
However, that's a rather difficult question to answer, so I want to ask some smaller questions first.
1) Does the IBL image need to be black-and-white, or will colour work?
From my tests: colour works just as well. But why do Apple's samples only show black-and-white ones? Should we use black-and-white?
2) What is the best file format for the IBL image? Any pros/cons? Or should we just test each format and compare visually?
From my tests: PNG, OpenEXR (.exr), and Radiance HDR (.hdr) all work. But which format is recommended?
3) Will an IBL on visionOS create shadows for us? In Blender, an HDRI gives shadows.
From my tests: no, the IBL does not produce shadows on the loaded environment/meshes. Is shadow baking the only solution for the time being?
4) Looking at a Blender scene that uses an HDRI as global lighting, how can we best prep the IBL image so the light comes closest to Blender's Cycles rendering engine?
From my tests, I tried the approaches below (A–D); the sketch after this list shows how I apply the IBL:
A) Make a render of just the Blender HDRI (no meshes) via a 360-degree panoramic camera.
→ Used as the IBL, everything becomes too bright.
B) Make a render of the entire Blender scene via a 360-degree panoramic camera.
→ Used as the IBL, everything looks washed out and yellowish.
C) Use the Sunlight.png from the sample project.
→ With this IBL the scene is too dark.
D) Use the SystemIBL.exr from another sample project.
→ With this IBL the scene looks very flat and not realistic at all.
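
For reference, this is roughly how I'm applying the IBL in each test (a sketch; "SkyboxIBL" is a placeholder asset name):

import RealityKit

// Rough sketch: load an IBL image bundled with the app and apply it to the
// environment root so child meshes receive the lighting.
func applyIBL(to rootEntity: Entity) async {
    guard let ibl = try? await EnvironmentResource(named: "SkyboxIBL") else { return }
    rootEntity.components.set(ImageBasedLightComponent(source: .single(ibl), intensityExponent: 1.0))
    rootEntity.components.set(ImageBasedLightReceiverComponent(imageBasedLight: rootEntity))
}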
Here are the IBL images A–D described above, each with sample screenshots from the simulator:
[IBL images and simulator screenshots for A), B), C), and D) were attached here]
The atmosphere I'm aiming for, as per Blender's Cycles rendering engine:
[reference render was attached here]
Can anyone help me with questions 1–4 above?
It would give me some insight into how to create immersive environments with realistic lighting & shadows. :)
Much appreciated!
— Luca
SharePlay & Group Activities
I was able to implement entity position synchronisation via SharePlay (Group Activities) in my visionOS app by following the SharePlay tutorials for the "Draw Together" app from these WWDC sessions:
https://developer.apple.com/wwdc21/10187
https://developer.apple.com/wwdc22/10140
While referencing the sample code at: https://developer.apple.com/documentation/groupactivities/drawing_content_in_a_group_session
MultipeerConnectivityService
However, it seems that RealityKit has something called MultipeerConnectivityService for entity synchronisation, and it looks like a pretty robust solution that syncs not only positions but also other things, like Codable components. 🤔
See docs at: https://developer.apple.com/documentation/realitykit/multipeerconnectivityservice
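
From the docs, the basic wiring looks something like this (a sketch of the iOS ARView pattern, with peer discovery omitted; I haven't verified any of it on visionOS):

import MultipeerConnectivity
import RealityKit
import UIKit

// Sketch: attach a MultipeerConnectivityService to the scene so entities
// that have a SynchronizationComponent sync automatically. Setting up
// MCNearbyServiceAdvertiser/Browser for peer discovery is omitted.
func enableSync(for arView: ARView) throws {
    let peerID = MCPeerID(displayName: UIDevice.current.name)
    let session = MCSession(peer: peerID,
                            securityIdentity: nil,
                            encryptionPreference: .required)
    arView.scene.synchronizationService = try MultipeerConnectivityService(session: session)
}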
Call for help
Can anyone share example code that implements MultipeerConnectivityService?
I wonder if this is the approach recommended by Apple?
I must say, writing custom messages to sync the entity positions via Group Activities was very hard 😅 and now I'm wondering what to do about all the other entity components...
The error I get with the visionOS simulator:
cannot migrate AudioUnit assets for current process
code:
guard let resource = try? AudioFileGroupResource.load(
    named: "/Root/AudioGroupDropStone",
    from: "Scene.usda",
    in: realityKitContentBundle
) else { return }  // bail out if the resource fails to load
Any ideas how to debug this?
The audio files seem to work fine in Reality Composer Pro.
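
In case it's relevant, this is roughly how I start playback after loading (a sketch; the entity variable is a placeholder):

// Sketch: start playback on the entity the sound should come from.
let playback = entity.playAudio(resource)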
'init(make:update:attachments:)' is unavailable in visionOS
I get this error in Xcode 15 beta 8, but it compiles fine in beta 7.
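
I believe the attachments API changed shape in this beta: attachments are now built with Attachment(id:) and looked up by id. A minimal sketch of the newer shape (the view and id names are my own):

import SwiftUI
import RealityKit

// Sketch: build attachments with Attachment(id:) in the attachments builder,
// then look them up by id in the make/update closures.
struct ImmersiveView: View {
    var body: some View {
        RealityView { content, attachments in
            if let label = attachments.entity(for: "label") {
                content.add(label)
            }
        } update: { _, _ in
            // react to state changes here
        } attachments: {
            Attachment(id: "label") {
                Text("Hello")
            }
        }
    }
}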
So there's a "grounding shadow" component we can add to any entity in Reality Composer Pro.
My use case is Apple Vision Pro + RealityKit.
I'm wondering: by default, I'd think I want to add this to every entity in my scene that's a ModelEntity... right?
Should we just add this component once to the root transform?
Or should we add it to each model entity individually?
Or should we not add it at all? Will RealityKit do it for us?
Or does it also depend on whether we use a volume or a full space?
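
For now I've been setting it per model entity with a recursive helper, since I couldn't tell whether the component is inherited from the root (a sketch; the helper name is my own):

import RealityKit

// Sketch: walk the hierarchy and opt every ModelEntity into grounding shadows.
func addGroundingShadows(to entity: Entity) {
    if entity is ModelEntity {
        entity.components.set(GroundingShadowComponent(castsShadow: true))
    }
    for child in entity.children {
        addGroundingShadows(to: child)
    }
}

Is this the right approach, or is there a cheaper way?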
So if I drag an entity in a RealityView, I have to disable its PhysicsBodyComponent to make sure nothing fights dragging the entity around. This makes sense.
When I finish a drag, this closure gets executed:
.gesture(
    DragGesture()
        .targetedToAnyEntity()
        .onChanged { e in
            // ...
        }
        .onEnded { e in
            let velocity: CGSize = e.gestureValue.velocity
        }
)
If I now re-add a PhysicsBodyComponent to the entity I just dragged, with mode: .dynamic, it will lose all velocity and drop straight down under gravity.
Instead, the solution is to use mode: .kinematic and also add a PhysicsMotionComponent to the entity, which should retain the velocity after letting go of the object.
However, I need to instantiate it with PhysicsMotionComponent(linearVelocity: SIMD3<Float>, angularVelocity: SIMD3<Float>).
How can I calculate linearVelocity and angularVelocity when the e.gestureValue.velocity I get is just a CGSize?
Is there another property of gestureValue I should be looking at?
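
The only workaround I've found so far is to estimate the scene-space velocity myself by sampling positions in .onChanged, then hand it to PhysicsMotionComponent on release. A sketch (the state names are my own):

import SwiftUI
import RealityKit
import QuartzCore

// Sketch: gestureValue.velocity is 2D (view space), so estimate a 3D velocity
// by differencing scene-space positions between .onChanged samples.
struct DraggableView: View {
    @State private var lastPosition: SIMD3<Float>?
    @State private var lastTime: CFTimeInterval?
    @State private var sceneVelocity: SIMD3<Float> = .zero

    var body: some View {
        RealityView { content in
            // ... add entities ...
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { e in
                    let position = e.convert(e.location3D, from: .local, to: .scene)
                    let now = CACurrentMediaTime()
                    if let last = lastPosition, let t = lastTime, now > t {
                        sceneVelocity = (position - last) / Float(now - t)
                    }
                    lastPosition = position
                    lastTime = now
                    e.entity.position = position
                }
                .onEnded { e in
                    e.entity.components.set(
                        PhysicsMotionComponent(linearVelocity: sceneVelocity,
                                               angularVelocity: .zero)
                    )
                    lastPosition = nil
                    lastTime = nil
                }
        )
    }
}

I still don't know how to derive a sensible angularVelocity this way, so pointers are welcome.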