Hello,

I'm having a blast learning SwiftUI, but as a designer and brand-new developer I'm really struggling with linking things together, and I don't quite have the vocabulary to follow Apple's documentation. With SwiftUI, I'd just love to have a button in the middle of the screen: I tap it, and an AR Quick Look USDZ preview pops up. Could somebody point me in the right direction for connecting Quick Look to a button? And pretend like you're talking to a baby?

https://developer.apple.com/documentation/arkit/previewing_a_model_with_ar_quick_look

Thank you!
-Dan
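In case it helps later readers: the usual pattern is to wrap `QLPreviewController` (from the QuickLook framework) in a `UIViewControllerRepresentable` so SwiftUI can present it from a button via `.sheet`. Here is a minimal sketch along those lines; the model name `"tower"` and the view names are placeholders, and this assumes a `.usdz` file is bundled with the app:

```swift
import SwiftUI
import QuickLook
import ARKit

// Wraps QLPreviewController so SwiftUI can present it.
struct ARQuickLookView: UIViewControllerRepresentable {
    let modelName: String  // base name of a .usdz file in your app bundle (assumed)

    func makeUIViewController(context: Context) -> QLPreviewController {
        let controller = QLPreviewController()
        controller.dataSource = context.coordinator
        return controller
    }

    func updateUIViewController(_ uiViewController: QLPreviewController, context: Context) {}

    func makeCoordinator() -> Coordinator { Coordinator(self) }

    class Coordinator: NSObject, QLPreviewControllerDataSource {
        let parent: ARQuickLookView
        init(_ parent: ARQuickLookView) { self.parent = parent }

        func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

        func previewController(_ controller: QLPreviewController,
                               previewItemAt index: Int) -> QLPreviewItem {
            // Load the .usdz from the app bundle and hand it to AR Quick Look.
            guard let url = Bundle.main.url(forResource: parent.modelName,
                                            withExtension: "usdz") else {
                fatalError("Couldn't find \(parent.modelName).usdz in the app bundle")
            }
            return ARQuickLookPreviewItem(fileAt: url)
        }
    }
}

// A button in the middle of the screen that opens the AR preview.
struct ContentView: View {
    @State private var showingPreview = false

    var body: some View {
        Button("View in AR") {
            showingPreview = true
        }
        .sheet(isPresented: $showingPreview) {
            ARQuickLookView(modelName: "tower")  // placeholder model name
        }
    }
}
```

Tapping the button flips the `@State` flag, the sheet presents the wrapped `QLPreviewController`, and Quick Look handles the AR session itself, so no ARKit session code is needed on your side.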
Hello,
I’m trying to bring a USDZ model of a radio tower with a red light on top (using the emissive channel in my USDZ) into Reality Composer.
As of right now, it looks like RC ignores emissive materials.
If I finish all the interaction I want in RC, is there an easy way to add a bloom post-processing effect in Xcode to make my red light glow?
I saw there is post-processing in RealityKit 2, but as a new developer I'm not sure how, or whether, it is possible to dig into my RC project with Xcode to add that post-processing. If anybody could provide a good beginner's tutorial or a path to start learning, I'd really appreciate it.
Thank you,
Daniel Jones
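For anyone following up: RealityKit 2 (iOS 15+) exposes a `postProcess` render callback on `ARView`, and one common approach is to run the rendered frame through Core Image's built-in `CIFilter.bloom()`. A rough sketch, assuming the Reality Composer scene is loaded elsewhere as usual and that the radius/intensity values are just starting points to tune:

```swift
import UIKit
import RealityKit
import CoreImage.CIFilterBuiltins

class ViewController: UIViewController {
    @IBOutlet var arView: ARView!

    // Creating a CIContext is expensive, so make one and reuse it.
    lazy var ciContext = CIContext(mtlDevice: MTLCreateSystemDefaultDevice()!)

    override func viewDidLoad() {
        super.viewDidLoad()
        // (Load your Reality Composer scene into arView here as usual.)

        // Run a bloom filter over each rendered frame.
        arView.renderCallbacks.postProcess = { [weak self] context in
            guard let self = self,
                  let input = CIImage(mtlTexture: context.sourceColorTexture) else { return }

            let bloom = CIFilter.bloom()
            bloom.inputImage = input
            bloom.radius = 8        // assumed value; tune for your scene
            bloom.intensity = 1.2   // assumed value; tune for your scene
            guard let output = bloom.outputImage else { return }

            // Write the filtered image into the frame RealityKit will display.
            let destination = CIRenderDestination(mtlTexture: context.targetColorTexture,
                                                  commandBuffer: context.commandBuffer)
            _ = try? self.ciContext.startTask(toRender: output, to: destination)
        }
    }
}
```

Note that bloom applied this way affects all bright areas of the frame, not just the emissive material, so the emissive channel mainly serves to make the light bright enough to bloom.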
While it seems like this is being aimed at indoor seated use initially, is there going to be anything stopping us from making an outdoor art experience? I'd love to make an installation that a user could experience in a soccer field, etc.
I’m trying to make a pass-through experience featuring an emissive/glow-y/bloom-y object. From everything I’ve tried and researched so far, it looks like that won’t be possible with RealityKit rendering, and I’ll have to use Unity with its URP. But will it be possible to use that renderer while still having pass-through video to ground the experience in the real world? Or is that only possible with fully immersive experiences?
If there’s a completely different approach that is better, I’m willing and wanting to learn.
Thank you in advance!
-Dan