
How to create Lunar Rover USDZ Animated Sample File
Hello! I’ve got USDZ exports from my Maya pipeline working with animation, and they load up nicely in the Vision Pro. I’ve been checking out the animated sample files on the Augmented Reality/Quick Look sample page, specifically the first three at the top of the page, and I would like to know how they were created.

I’m a 3D modeler and animator, not a programmer, so I’m dipping my toe into RCP and Xcode/SwiftUI, but I could use some informative tutorials on the proper workflow. For example, in the Lunar Rover sample, lines emanate from the model, then text windows appear. Would I need to create all these extras inside Reality Composer Pro? I’d like to start creating immersive, narrative experiences (both in a volume and fully immersive), but for prototyping I want to learn the proper way to add this type of functionality. I think I remember seeing something about “schemas” being involved, and I’m assuming there might be some coding to set up in RCP so that when items are selected, an associated animation is triggered.

Can anyone point me towards the relevant documentation to help me get started? Remember, I don’t code. ;)

Here are my recent Vision Pro experimentations: https://youtube.com/playlist?list=PLCH753rZ9r6eqXxpIemaSlcyYxjFgR210&si=P_7AY2aL97Upm61i

I’m also proficient with Unreal Engine, but getting content packaged and over to AVP is still not ready for prime time, so I’m exploring the native approach. Thanks for helping point me in the right direction!
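From what I’ve pieced together so far, the tap-to-trigger part may only need a little Swift glue around the RCP scene. Here’s a rough sketch of my current understanding (the "Scene" name and realityKitContentBundle come from the default visionOS app template, not the sample, and I gather the entity needs collision and input-target components authored in RCP for the tap to register):

```swift
import SwiftUI
import RealityKit
import RealityKitContent

struct RoverView: View {
    var body: some View {
        RealityView { content in
            // Load the scene authored in Reality Composer Pro.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Play the first animation authored on whichever entity was tapped.
                    if let animation = value.entity.availableAnimations.first {
                        value.entity.playAnimation(animation, transitionDuration: 0.25)
                    }
                }
        )
    }
}
```

Does that match how the sample does it, or is it all driven by schemas/components inside RCP?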
0 replies · 0 boosts · 282 views · Jul ’24
Export USDZ With Unlit Shader From Maya
Hello! I have the great fortune to be working on a joint Apple/(unnamed brand) app. I am teamed up with a photogrammetry vendor and will be assembling a scene in Maya, using a UDIM workflow with retouched assets from Substance Painter. This will be for an immersive environment, similar to the Joshua Tree and Hawaiian environments on AVP, which are great.

We did a first pass, bringing USDZ files into Reality Composer Pro and sending them to the headset - very cool. However, the models have a specular component that is washing everything out. After further research, I learned that I should not be using a Physically Based shader, but an Unlit shader instead. I had to create one manually in RCP, and then the model viewed properly. The issue is, every time I need to add a new asset, will I now have to manually create numerous custom shaders? Ideally, I want to figure out how to specify an Unlit shader in Substance Painter that will get exported out properly, in addition to getting it to export from Maya.

I tried using a Maya SurfaceShader - which is essentially unlit - but that does not export properly. I found that a Maya StandardSurface will export properly; however, it still imports as a Physically Based shader. The conundrum is that I'll be working with a UDIM workflow and would like to avoid having to manually create and hook up what could be 10-40 textures per USDZ file.

I guess what I'm trying to ask is: what is the preferred shader to use in Maya when exporting to USDZ so that it imports as Unlit? Or, is there a way to easily switch from Physically Based to Unlit inside RCP? And NOT by doing it in Xcode/Swift, because the files I need to deliver must be USDZ that the developer will be assembling themselves.

I'm using Maya because I need to work on my PC for the heavy GPU lifting, and also, it's just easier to assemble everything there vs. Reality Composer Pro, which is like the iMovie of game engines. ;) (Please take that constructively; it really needs to be more industry standard.) I just need to make sure I can use Maya with the proper shaders to export my pieces, quickly send them to RCP (with the proper Unlit specification), then over to the headset so I can check stuff in real time with my photogrammetry vendor.

Any help or advice is greatly appreciated! I'm really excited to be working on my first Vision Pro app. Thx!
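For completeness: I know the runtime route is off the table for my deliverables, but if a developer did want to flip materials in RealityKit after loading, my understanding is it would look roughly like this (a sketch on my part; makeUnlit is my own made-up helper name, and I'm only carrying over the base color texture):

```swift
import RealityKit

// Recursively replace every material on an entity tree with an
// UnlitMaterial that reuses the original base color texture.
func makeUnlit(_ entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { material in
            var unlit = UnlitMaterial()
            if let pbr = material as? PhysicallyBasedMaterial,
               let texture = pbr.baseColor.texture {
                unlit.color = .init(texture: texture)
            }
            return unlit
        }
        entity.components.set(model)
    }
    for child in entity.children {
        makeUnlit(child)
    }
}
```

But again, what I really need is for the USDZ itself to carry the Unlit specification so RCP picks it up on import.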
4 replies · 0 boosts · 515 views · Aug ’24
Need to increase 10' x 10' safe area for VR
Hello! I am working on some cool project reconstructions for a client. They design lobby-sized installations with LED walls, and I am being tasked with converting these over to AR at scale. I've got my first test in the headset and it looks great! However, the desire to walk around more than the 10' x 10' safe-area zone totally takes one out of the immersive VR experience, which is pretty counterintuitive.

Is there any way for us developers to bypass this hard limit, so that clients who are requesting more room-scale options can actually enjoy this in VR? Alternatively, is there a way to hook up a PS5 controller to a "player start" so I can navigate inside the VR volume?

I'm really trying to embrace Reality Composer Pro, but it seems extremely limiting as I wait for Unreal Engine to get its act together. Sigh. Thanks for any help or suggestions.
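On the controller idea, my rough understanding is that the GameController framework can feed thumbstick input into RealityKit, and that translating the environment's root entity opposite the stick reads as moving the viewer. A sketch of what I mean (worldRoot is a placeholder for whatever entity holds the environment; I haven't verified this in the headset):

```swift
import GameController
import RealityKit

// Listen for a gamepad connecting, then translate the environment's
// root entity whenever the left thumbstick is deflected. Moving the
// world opposite the stick feels like flying the camera through it.
func startNavigation(worldRoot: Entity) {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let controller = note.object as? GCController,
              let gamepad = controller.extendedGamepad else { return }
        gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
            let speed: Float = 0.05
            // Stick up (+y) should move the viewer forward (-z).
            worldRoot.position -= SIMD3<Float>(x, 0, -y) * speed
        }
    }
}
```

Would something like that be acceptable, or does the system clamp movement to the safe area regardless?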
2 replies · 0 boosts · 385 views · Sep ’24
How to set up a PS5 controller to move a character with animation, and a VR camera
Hello! I would like to do exactly this: https://youtu.be/Cun8K7ctKp0?si=TgWvtdw-VdlBVL0R

I can't seem to find any documentation on getting a PS5 controller hooked up properly in Reality Composer Pro and Xcode, driving a character with animation (or moving an object around freely), and then sending it over to the Vision Pro. Additionally, I would also like to learn how to use a controller to move a VR camera around a scene, so that we can navigate custom-built spaces, similar to Meta virtual environments or Steam VR home environments.

Typically, I would do this with Unreal Engine, but unfortunately, AVP support there is still in its infancy. So I figured, why not try to do it natively? Any help with concrete tutorials or documentation would be greatly appreciated. Thx!
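To make it concrete, here's the kind of thing I imagine on the Swift side, pieced together from what I've read (CharacterDriver is my own made-up name; the character entity and its walk animation would come from assets authored in RCP, and I haven't tested this):

```swift
import GameController
import RealityKit

// Drive a character entity with the left thumbstick: translate it
// while the stick is deflected, and loop its authored walk animation
// only while it is moving.
final class CharacterDriver {
    let character: Entity
    private var walk: AnimationPlaybackController?

    init(character: Entity) { self.character = character }

    func bind(to gamepad: GCExtendedGamepad) {
        gamepad.leftThumbstick.valueChangedHandler = { [weak self] _, x, y in
            guard let self else { return }
            let moving = abs(x) > 0.1 || abs(y) > 0.1
            if moving {
                // Stick up (+y) walks the character forward (-z).
                self.character.position += SIMD3<Float>(x, 0, -y) * 0.02
                if self.walk == nil,
                   let clip = self.character.availableAnimations.first {
                    self.walk = self.character.playAnimation(
                        clip.repeat(), transitionDuration: 0.2)
                }
            } else {
                self.walk?.stop()
                self.walk = nil
            }
        }
    }
}
```

Is that roughly the right shape, or is there a higher-level way to wire this up in RCP without code?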
1 reply · 0 boosts · 212 views · Oct ’24