Post not yet marked as solved
Could you share the console crash log for more context?
One idea coming to my mind:
updateImage() might be called on a non-main thread, and AFAIK a ModelEntity should be created on the main thread. So you could try marking the function as @MainActor.
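A minimal sketch of what I mean — `makeImagePlane` is a hypothetical stand-in for the poster's `updateImage()`; the relevant part is only the `@MainActor` annotation:

```swift
import RealityKit

// Hypothetical sketch: creating the ModelEntity is forced onto the
// main actor, so callers from background threads must await it.
@MainActor
func makeImagePlane(texture: TextureResource? = nil) -> ModelEntity {
    var material = UnlitMaterial()
    if let texture {
        material.color = .init(texture: .init(texture))
    }
    return ModelEntity(
        mesh: .generatePlane(width: 1, height: 1),
        materials: [material]
    )
}
```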
Same exact issue here; I think this is a bug on Apple's side.
Any workarounds to fix this?
Instead of context.scene.performQuery did you try context.entities(matching: query, updatingSystemWhen: .rendering)?
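For context, a hedged sketch of what I mean inside a System's update — `MyComponent` and `MySystem` are placeholder names, only the iteration call differs from `performQuery`:

```swift
import RealityKit

// Placeholder component standing in for whatever the original query matches.
struct MyComponent: Component {}

struct MySystem: System {
    private static let query = EntityQuery(where: .has(MyComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        // Alternative to context.scene.performQuery(Self.query):
        for entity in context.entities(matching: Self.query,
                                       updatingSystemWhen: .rendering) {
            _ = entity // … per-entity work
        }
    }
}
```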
Feedback filed: FB13659588
Thank you!
Hi, thanks so much for the quick reply!
I was afraid that would be the case, seems kinda wasteful with the mesh copy :/
Will definitely file an enhancement request!
I got it working now: https://streamable.com/7nakkb
But instead of using the ShaderGraphMaterial I put a plane behind the portal with an occlusion material.
AFAIK ShaderGraphMaterial does not allow me to discard fragments though, correct?
And I should probably be able to still use PBR materials instead of Unlit if I give both models the same ImageBasedLightReceiverComponent?
Regarding ClippingPlane I'm curious: what would be a sample use case for the clipping inside of the portal world?
Hi, I've found that you can retrieve the RealityKit Scene from an entity once it has been added (subscribe to the event). Then you can use the Scene's raycast methods, e.g. https://developer.apple.com/documentation/realitykit/scene/raycast(from:to:query:mask:relativeto:)
For that to work though you would need to create CollisionShapes for either the detected planes or the reconstructed world mesh geometry.
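A small sketch of the idea, under the stated assumptions (the entity is already in a scene, and the scene contains entities with CollisionComponents generated from planes or the reconstructed mesh):

```swift
import RealityKit

// Returns the nearest collision hit along a ray, or nil if the entity
// has not been added to a scene yet.
func nearestHit(from origin: SIMD3<Float>,
                to target: SIMD3<Float>,
                in anyEntity: Entity) -> CollisionCastHit? {
    guard let scene = anyEntity.scene else { return nil }
    return scene.raycast(from: origin,
                         to: target,
                         query: .nearest,
                         mask: .all,
                         relativeTo: nil).first
}
```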
I just found this: https://developer.apple.com/documentation/arkit/worldtrackingprovider/4218774-querypose which sounds promising if I enable ARKit tracking.
Will give it a go. Can someone from the RealityKit team confirm that this would be the way to go?
Also there is https://developer.apple.com/documentation/realitykit/anchoringcomponent/target-swift.enum/head
Does this also only work when ARKit is enabled? So far I wasn't able to run it successfully in the Simulator.
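For reference, a hedged visionOS sketch: in current SDKs the pose query I linked surfaced as `queryDeviceAnchor(atTimestamp:)` on `WorldTrackingProvider`. The session must be running, and as noted above I couldn't get it to work in the Simulator:

```swift
import ARKit
import QuartzCore

let session = ARKitSession()
let worldTracking = WorldTrackingProvider()

// Returns the device transform in world space, or nil if tracking
// is unavailable (e.g. in the Simulator).
func currentDevicePose() async -> simd_float4x4? {
    if worldTracking.state != .running {
        try? await session.run([worldTracking])
    }
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return nil
    }
    return anchor.originFromAnchorTransform
}
```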
Hi, at first glance it looks to me like either you misspelled the function name of your geometry modifier or your Metal file isn't being compiled.
If you are using the default metal library make sure that you have your geometry modifier function available like so:
#include <metal_stdlib>
#include <RealityKit/RealityKit.h>

using namespace metal;

[[visible]] void nameOfYourGeometryModifier(realitykit::geometry_parameters params) {
    // … your shader
}
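On the Swift side, a minimal sketch of hooking that function up — assuming you're using CustomMaterial and the app's default Metal library; the function name string must match the `[[visible]]` function exactly:

```swift
import RealityKit
import Metal

enum ShaderError: Error { case metalUnavailable }

// Loads the geometry modifier from the default Metal library and
// attaches it to a CustomMaterial derived from a base material.
func makeMaterial() throws -> CustomMaterial {
    guard let device = MTLCreateSystemDefaultDevice(),
          let library = device.makeDefaultLibrary() else {
        throw ShaderError.metalUnavailable
    }
    let geometryModifier = CustomMaterial.GeometryModifier(
        named: "nameOfYourGeometryModifier", // must match the Metal function name
        in: library
    )
    return try CustomMaterial(from: SimpleMaterial(), geometryModifier: geometryModifier)
}
```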
You could use https://developer.apple.com/documentation/realitykit/arview/debugoptions-swift.struct/showfeaturepoints but that's aimed more at debugging purposes. The other option I see would be to draw the points manually on top of your rendered scene with a custom post-processing effect.
Looks to me like you haven't configured environmentTexturing in your ARConfiguration (https://developer.apple.com/documentation/arkit/arworldtrackingconfiguration/2977509-environmenttexturing) or you've set disableAREnvironmentLighting in the ARView's renderOptions: https://developer.apple.com/documentation/realitykit/arview/renderoptions-swift.struct/disablearenvironmentlighting
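A minimal sketch of both settings together (`makeConfiguration` and `startSession` are just illustrative names):

```swift
import ARKit
import RealityKit

// A world-tracking configuration with environment texturing enabled,
// so RealityKit can light reflective materials from the camera feed.
func makeConfiguration() -> ARWorldTrackingConfiguration {
    let configuration = ARWorldTrackingConfiguration()
    configuration.environmentTexturing = .automatic
    return configuration
}

// Also make sure the render option is not disabled before running.
func startSession(in arView: ARView) {
    arView.renderOptions.remove(.disableAREnvironmentLighting)
    arView.session.run(makeConfiguration())
}
```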
I think RealityKit is unfortunately not supported in the Simulator, so it can't find the initializer. If you have an Apple Silicon Mac you could run RealityKit there, though, via the »My Mac (Designed for iPad)« option.
Possible Solutions:
public extension Transform {
    // SceneKit detour
    var eulerAnglesQuickAndDirty: SIMD3<Float> {
        let node = SCNNode()
        node.simdTransform = matrix
        return node.simdEulerAngles
    }

    // From: https://stackoverflow.com/questions/50236214/arkit-eulerangles-of-transform-matrix-4x4
    var eulerAngles: SIMD3<Float> {
        let matrix = matrix
        return .init(
            x: asin(-matrix[2][1]),
            y: atan2(matrix[2][0], matrix[2][2]),
            z: atan2(matrix[0][1], matrix[1][1])
        )
    }
}
Alright, so my workaround for now: I store the initial materials for my model in a dedicated animation component, define a target opacity, and then always derive a new material for each animation step from the initial material and assign that to the model.
This way the texture persists. If I instead dynamically query the current material from the model itself and adjust that, the texture is purged after assigning a new blending value.
Nevertheless I feel like this is a bug that should be fixed.
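A sketch of the workaround described above — `FadeComponent` and `FadeSystem` are my own names; the point is deriving each step's material from the stored originals, never from the model's current materials:

```swift
import RealityKit

// Stores the materials as they were before any fading started.
struct FadeComponent: Component {
    var initialMaterials: [UnlitMaterial]
    var targetOpacity: Float
}

struct FadeSystem: System {
    private static let query = EntityQuery(where: .has(FadeComponent.self))

    init(scene: RealityKit.Scene) {}

    func update(context: SceneUpdateContext) {
        context.scene.performQuery(Self.query).forEach { entity in
            guard let fade = entity.components[FadeComponent.self],
                  var model = entity.components[ModelComponent.self] else { return }
            // Derive fresh materials from the stored originals so the
            // texture is never lost by re-reading the current material.
            model.materials = fade.initialMaterials.map { original in
                var material = original
                material.blending = .transparent(opacity: .init(floatLiteral: fade.targetOpacity))
                return material
            }
            entity.components[ModelComponent.self] = model
        }
    }
}
```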
Another observation: on both iOS 15 and 16, the order of assignments when setting blending is crucial.
Has no effect:
var unlitMaterial = UnlitMaterial()
unlitMaterial.blending = .transparent(opacity: 0.1)
unlitMaterial.color = .init(texture: .init(textureResource))
Works:
var unlitMaterial = UnlitMaterial()
unlitMaterial.color = .init(texture: .init(textureResource))
unlitMaterial.blending = .transparent(opacity: 0.1)
Hi,
thanks for the reply! I tried that, but unfortunately it doesn't really work in my case (or only once): when I create a new UnlitMaterial for an animation step, assign the opacity value, and set it on the model, the texture is already gone by the time I create the next UnlitMaterial for another animation step.
The only way that worked was if I permanently store the texture resource somewhere and assign it manually. But I would really like to avoid that, because then I need to make a lot of assumptions in my code about the material my fade system is working on. A more generic approach would be good.
Maybe you can check out the sample project from my bug report in case I overlooked something?
Appreciate the support!
I'm experiencing the same issues on Beta 4.
Disabling environment texturing prevents the memory from constantly growing.
Memory debugger still reports the same leaks as in the screenshots above. Whether they are false positives or not, it's at least irritating during the development process.
Additionally I found that if I run the default ARKit+RealityKit Xcode template project and enable sceneReconstruction = .mesh, there are lots of CFData leaks reported. Not a critical issue, but again I find it irritating during development and debugging. Screenshot attached.