Yes, it's now possible. I have written a proof of concept for pulling .usdz and .reality files directly from usdzshare.com into the Swift Playgrounds app on iPad. You can copy and paste the code straight into the Playgrounds app, and the Reality scene at https://usdzshare.com/?ug-gallery=photo-detail&photo_id=4148 will be imported into your playground from the web. I'm also working on an API for custom queries, so you could upload your own .usdz or .reality files to the usdzshare.com sharing platform and build your own test apps in Playgrounds on iPad. Code here: https://usdzshare.com/reality-kit-in-playgrounds-app-on-ipad/
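For anyone curious about the general technique, here's a minimal sketch of fetching a remote model and loading it into a RealityKit scene. The download URL and file name are placeholders (the gallery link above is an HTML page, so you'd need the direct link to the underlying .usdz), and this assumes you're running inside an app or playground that already has an `ARView`:

```swift
import RealityKit
import Foundation

// Download a remote .usdz to a temporary file, then load it as an Entity.
// `url` is assumed to be a direct link to a .usdz file, not a gallery page.
func loadRemoteModel(from url: URL,
                     into anchor: AnchorEntity,
                     arView: ARView) {
    URLSession.shared.downloadTask(with: url) { tempURL, _, error in
        guard let tempURL = tempURL, error == nil else { return }
        // Entity.load(contentsOf:) relies on the file extension to pick a
        // loader, so copy the download to a path ending in .usdz first.
        let destination = FileManager.default.temporaryDirectory
            .appendingPathComponent("model.usdz")
        try? FileManager.default.removeItem(at: destination)
        try? FileManager.default.copyItem(at: tempURL, to: destination)
        DispatchQueue.main.async {
            if let entity = try? Entity.load(contentsOf: destination) {
                anchor.addChild(entity)
                arView.scene.addAnchor(anchor)
            }
        }
    }.resume()
}
```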
It's most likely collision box interference. Collision boxes are in some cases larger than the visible object, so if you have two objects near each other, it's possible the object you are trying to tap is being shielded by the other object's collision box. One way to examine the collision box for your asset is the showPhysics debug option in RealityKit. You could use Swift Playgrounds to build a simple RealityKit app that views your collision box. Unfortunately Reality Composer doesn't show collision boxes when building a scene. It would be nice to see Reality Composer add an option for viewing collision boxes in a scene.
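A minimal sketch of what that debug view could look like, assuming a simple box entity as the asset under inspection:

```swift
import RealityKit

// Turn on RealityKit's physics debug overlay so collision shapes are
// drawn over the scene. Entities need collision shapes generated (or
// authored) for the overlay to show anything.
func makeDebugARView() -> ARView {
    let arView = ARView(frame: .zero)
    arView.debugOptions.insert(.showPhysics)   // draws collision shapes

    // Placeholder asset: a 10 cm box with auto-generated collision shapes.
    let box = ModelEntity(mesh: .generateBox(size: 0.1))
    box.generateCollisionShapes(recursive: true)

    let anchor = AnchorEntity(plane: .horizontal)
    anchor.addChild(box)
    arView.scene.addAnchor(anchor)
    return arView
}
```

With the overlay on, you can see exactly how far each collision shape extends past the visible geometry and spot where two shapes overlap.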
Great question, and I totally agree. As AR creators we need video materials incorporated into the Reality Composer app and included in the USDZ and .reality file formats. I've been experimenting with video materials in RealityKit and already see huge potential in the feature for creating more captivating AR content.
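For reference, here's roughly what experimenting with RealityKit's VideoMaterial (iOS 14+) looks like. The file name "clip.mp4" is a placeholder for any video in your app bundle:

```swift
import RealityKit
import AVFoundation

// Build a plane whose surface plays a looping-capable video via
// RealityKit's VideoMaterial, driven by an AVPlayer.
func makeVideoPlane() -> ModelEntity? {
    guard let videoURL = Bundle.main.url(forResource: "clip",
                                         withExtension: "mp4") else {
        return nil  // no such asset in the bundle
    }
    let player = AVPlayer(url: videoURL)
    let material = VideoMaterial(avPlayer: player)

    // A 16:9 plane, roughly 40 cm wide, textured with the video.
    let plane = ModelEntity(mesh: .generatePlane(width: 0.4, height: 0.225),
                            materials: [material])
    player.play()
    return plane
}
```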
AdamMax, you can import images (including transparent PNGs) into the Reality Composer app. When working on the iPad, open the Photos app or Safari alongside Reality Composer in a Split View configuration, then drag a photo across from the Photos app or a Safari web page into Reality Composer.
There are some Apple docs on memory constraints for AR Quick Look.
https://developer.apple.com/documentation/arkit/adding_visual_effects_in_ar_quick_look_and_realitykit
A GitHub doc on glTF compatibility issues is also a useful reference for certain constraints:
https://github.com/google/usd_from_gltf#compatibility
I have found the best reference for animation capability in AR Quick Look is Apple's Reality Composer app, which exports both the supported .reality and .usdz file formats.
Visualising edges/lines/curves/points etc. is not the use case AR Quick Look is intended for; generally this kind of detail is only necessary for CAD tools. AR Quick Look is about augmenting virtual content into real physical spaces. That's why physically based rendering is supported: Apple wants the virtual assets to look as realistic as possible so they blend into the environment with dynamic shadows, real-time environmental reflections, object occlusion, etc.
Yes. A SwiftUI implementation of RoomPlan is available as a GitHub repo: https://github.com/deurell/roomscanner
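The core of any such implementation is bridging RoomPlan's UIKit-based capture view into SwiftUI. The linked repo does considerably more; this is just a minimal sketch of that bridge (iOS 16+, LiDAR devices only):

```swift
import SwiftUI
import RoomPlan

// Wrap RoomPlan's RoomCaptureView in UIViewRepresentable so it can be
// used directly inside a SwiftUI view hierarchy.
struct RoomScannerView: UIViewRepresentable {
    func makeUIView(context: Context) -> RoomCaptureView {
        let captureView = RoomCaptureView(frame: .zero)
        // Start scanning immediately with a default configuration.
        captureView.captureSession.run(
            configuration: RoomCaptureSession.Configuration())
        return captureView
    }

    func updateUIView(_ uiView: RoomCaptureView, context: Context) {}
}
```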
I'm getting the same problem.