Post · Replies · Boosts · Views · Activity

Program finger/hand gesture for snap turn/teleport navigation
Hello! I'm currently watching the "Envision the Future: Build great apps for visionOS" webinar, and lots of questions are coming up. Thx for offering this online! For those of us with "VR legs", how can we go about setting up custom hand/finger gestures that would let us add teleporting and navigation within our fully immersive environments? Both smooth and snap turn/teleport options would be great, thx! This is adjacent to my previous question on how to set up a PS5 controller to do something similar. Think Half-Life: Alyx as the gold standard for VR navigation.
Replies: 0 · Boosts: 0 · Views: 55 · Activity: 13h
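Not an official answer, but one way the snap-turn half of this question could be sketched natively: run an ARKitSession with a HandTrackingProvider, watch for a thumb-to-index pinch on one hand, and rotate a world-parent entity in 30° steps. Names here are assumptions: `navigationRoot` stands for whatever entity parents your whole immersive scene, and the 1.5 cm pinch threshold is just a starting value. This also assumes an open immersive space and hand-tracking permission (NSHandsTrackingUsageDescription).

```swift
import ARKit
import RealityKit

// Hedged sketch: `navigationRoot` is a hypothetical entity that parents the whole
// immersive scene, so rotating it snap-turns the world around the viewer.
@MainActor
final class SnapTurnController {
    private let session = ARKitSession()
    private let handTracking = HandTrackingProvider()
    private let navigationRoot: Entity
    private var pinchWasActive = false

    init(navigationRoot: Entity) {
        self.navigationRoot = navigationRoot
    }

    func start() async throws {
        // Requires an open immersive space and NSHandsTrackingUsageDescription.
        try await session.run([handTracking])

        for await update in handTracking.anchorUpdates {
            guard update.anchor.chirality == .right,
                  let skeleton = update.anchor.handSkeleton else { continue }

            // Distance between thumb tip and index tip, in the hand anchor's space.
            let thumb = skeleton.joint(.thumbTip).anchorFromJointTransform.columns.3
            let index = skeleton.joint(.indexFingerTip).anchorFromJointTransform.columns.3
            let pinchDistance = simd_distance(SIMD3(thumb.x, thumb.y, thumb.z),
                                              SIMD3(index.x, index.y, index.z))
            let pinchIsActive = pinchDistance < 0.015   // ~1.5 cm, tune to taste

            // Fire one 30° snap turn per pinch, not one per frame.
            if pinchIsActive && !pinchWasActive {
                navigationRoot.transform.rotation *= simd_quatf(angle: .pi / 6,
                                                                axis: [0, 1, 0])
            }
            pinchWasActive = pinchIsActive
        }
    }
}
```

Teleport follows the same pattern: offset navigationRoot's position instead of rotating it. One caveat: a bare pinch is also the system tap gesture, so in practice a less common pose (or the left hand only) avoids conflicts.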
Control Player Camera with PS5 Controller on Vision Pro
I recently completed a freelance project where I was tasked with creating room-scale environments that could be used as AR elements. As a bonus, I suggested that these could be done to scale and repurposed for eventual viewing in Vision Pro. To illustrate, I was able to quickly create a simple immersive project in Xcode, add the USDZ file (authored in Maya, with baked lighting from Arnold) to Reality Composer Pro, and compile for quick sending to the headset. I would then do screen recordings inside the immersive space, which the client loved to see. However, I am unable to walk around due to the boundary limitations.

My next obvious thought is: how can I set up the "player" camera so that I can control it with a PS5 controller inside AVP? In addition to Maya, I'm an Unreal Engine artist, and have been waiting patiently to get any projects compiled for AVP. With the 5.5 release, I was able to get a VR Template test over to AVP, where I have rudimentary navigation control via the PS5 controller. Ideally, I'd also love to learn how to set this up natively, so I can take simple USDZ scenes created in Maya, import them into RCP, set up a simple camera controller, and then use this to navigate my VR immersive spaces on Vision Pro. How can we go about doing this?

Part two of this question/suggestion is: how would I go about controlling a rigged, animated character in AR/passthrough mode in a similar fashion? Thx!
Replies: 0 · Boosts: 0 · Views: 56 · Activity: 14h
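For what it's worth, a minimal sketch of the "player camera" idea. visionOS doesn't expose a movable camera in an immersive RealityView, so the usual trick is to move the world instead: parent the whole USDZ scene under a navigation entity and translate/rotate it from the controller's thumbsticks each frame. Assumptions here: a DualSense is already paired to the Vision Pro, `navigationRoot` is a hypothetical entity parenting the scene, and the function is called once per frame (for example from a custom RealityKit System or the RealityView update closure).

```swift
import GameController
import RealityKit

// Hedged sketch: call once per frame with that frame's delta time.
// `navigationRoot` is a hypothetical entity that parents the entire immersive scene.
@MainActor
func updateLocomotion(navigationRoot: Entity, deltaTime: Float) {
    guard let pad = GCController.current?.extendedGamepad else { return }

    let moveSpeed: Float = 1.5   // metres per second
    let turnSpeed: Float = 1.5   // radians per second

    // Left stick: strafe and walk. Pull the world toward the viewer to step "forward".
    let x = pad.leftThumbstick.xValue
    let y = pad.leftThumbstick.yValue
    navigationRoot.position.x -= x * moveSpeed * deltaTime
    navigationRoot.position.z += y * moveSpeed * deltaTime

    // Right stick: smooth turn around the vertical axis.
    let yaw = pad.rightThumbstick.xValue * turnSpeed * deltaTime
    navigationRoot.transform.rotation *= simd_quatf(angle: -yaw, axis: [0, 1, 0])
}
```

A more complete version would rotate the movement vector by navigationRoot's current yaw so "forward" follows your turns, but the structure stays the same.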
How to set up a PS5 controller to move a character with animation, and a VR camera
Hello! I would like to do exactly this: https://youtu.be/Cun8K7ctKp0?si=TgWvtdw-VdlBVL0R

I can't seem to find any documentation on getting a PS5 controller hooked up properly in Reality Composer Pro and Xcode, driving a character with animation (or moving an object around freely), and then getting it over to the Vision Pro. Additionally, I would also like to learn how to use a controller to move a VR camera around a scene, so that we can navigate custom-built spaces, similar to Meta virtual environments or Steam VR home environments.

Typically, I would do this with Unreal Engine, but unfortunately, AVP support is still in its infancy there. So I figured, why not try to do it natively? Any help with concrete tutorials or documentation would be greatly appreciated. Thx!
Replies: 1 · Boosts: 0 · Views: 252 · Activity: Oct ’24
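There's no Reality Composer Pro checkbox for this today, so the controller side has to be wired in Swift via the GameController framework. A hedged sketch of the hookup: listen for the controller connecting, then have a face button play whatever animation is baked into the Maya-exported USDZ. `characterEntity` is a hypothetical, already-loaded entity (for example via Entity(named:)).

```swift
import GameController
import RealityKit

// Hedged sketch: wire a just-connected gamepad's cross/A button to the character's
// baked-in USDZ animation. `characterEntity` is a hypothetical, already-loaded entity.
@MainActor
func wireUpController(to characterEntity: Entity) {
    NotificationCenter.default.addObserver(
        forName: .GCControllerDidConnect, object: nil, queue: .main
    ) { note in
        guard let controller = note.object as? GCController,
              let pad = controller.extendedGamepad else { return }

        pad.buttonA.pressedChangedHandler = { _, _, isPressed in
            guard isPressed,
                  let animation = characterEntity.availableAnimations.first else { return }
            // Loop the animation that came over from Maya inside the USDZ.
            characterEntity.playAnimation(animation.repeat(),
                                          transitionDuration: 0.25)
        }
    }
}
```

Movement would then come from the thumbsticks, as in the locomotion sketch above, driving characterEntity.position instead of a world root.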
Need to increase 10' x 10' safe area for VR
Hello! I am working on some cool project reconstructions for a client. They design lobby-sized installations with LED walls, and I am being tasked with converting these over to AR at scale. I've got my first test in headset and it looks great! However, the desire to walk around beyond the 10' x 10' safe-area zone totally takes one out of the immersive VR experience, which is pretty counterintuitive.

Is there any way for us developers to bypass this hard limit, so that clients who are requesting more room-scale options can actually enjoy this in VR? Alternatively, is there a way to hook up a PS5 controller to a "player start" so I can navigate inside the VR volume?

I'm really trying to embrace Reality Composer Pro, but it seems extremely limiting as I wait for Unreal Engine to get its act together. Sigh. Thanks for any help or suggestions.
Replies: 2 · Boosts: 0 · Views: 403 · Activity: Sep ’24
Export USDZ With Unlit Shader From Maya
Hello! I have the great fortune to be working on a joint Apple/(unnamed brand) app. I am teamed up with a photogrammetry vendor and will be assembling a scene in Maya, using a UDIM workflow with retouched assets from Substance Painter. This will be for an immersive environment, similar to the Joshua Tree and Hawaii environments on AVP, which are great.

We did a first pass, bringing USDZ files into Reality Composer Pro and sending them to the headset, which was very cool. However, the models have a specular component that is washing everything out. After further research, I learned that I should not be using a Physically Based shader, but an Unlit shader instead. I had to manually create this in RCP, and then it viewed properly. The issue is, every time I need to add a new asset, will I now have to manually create numerous custom shaders?

Ideally, I want to figure out how to specify an Unlit shader in Substance Painter that exports properly, and also how to export one from Maya. I tried using a Maya SurfaceShader (which is essentially unlit), but that does not export properly. I found that a Maya StandardSurface exports properly; however, it still imports as a Physically Based shader. The conundrum is that I'll be working with a UDIM workflow and would like to avoid having to manually create and hook up what could be between 10 and 40 textures per USDZ file.

I guess what I'm trying to ask is: what is the preferred shader to use in Maya when exporting to USDZ so that it imports as Unlit? Or, is there a way to easily switch from Physically Based to Unlit inside RCP? And NOT by doing it in Xcode/Swift, because the files I need to deliver must be USDZ that the developer will be assembling themselves.

I'm using Maya because I need to work on my PC for the heavy GPU lifting, and also, it's just easier to assemble everything there vs. Reality Composer Pro, which is like the iMovie of game engines. ;) (Please take that constructively, it really needs to be more industry standard.) I just need to make sure I can use Maya with the proper shaders to export my pieces, quickly send them to RCP (with the proper Unlit specification), and then over to the headset so I can check things in real time with my photogrammetry vendor.

Any help or advice is greatly appreciated! I'm really excited to be working on my first Vision Pro app. Thx!
Replies: 4 · Boosts: 0 · Views: 569 · Activity: Aug ’24
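The post explicitly asks for a Maya/RCP answer rather than code, so purely for reference for the developer assembling the final scene: RealityKit can swap a loaded USDZ's Physically Based materials for Unlit ones at load time while keeping the baked base-color textures. This is a hedged sketch of that developer-side step, not the asked-for pipeline fix, and it assumes the USDZ's materials come in as PhysicallyBasedMaterial.

```swift
import RealityKit

// Hedged sketch: walk an entity hierarchy loaded from USDZ and replace each
// PhysicallyBasedMaterial with an UnlitMaterial that reuses the same base-color
// texture, so the Arnold-baked lighting is displayed as-is.
@MainActor
func convertToUnlit(_ entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = model.materials.map { material -> any Material in
            guard let pbr = material as? PhysicallyBasedMaterial else { return material }
            var unlit = UnlitMaterial()
            unlit.color = .init(tint: pbr.baseColor.tint, texture: pbr.baseColor.texture)
            return unlit
        }
        entity.components.set(model)
    }
    for child in entity.children {
        convertToUnlit(child)
    }
}
```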
How to create Lunar Rover USDZ Animated Sample File
Hello! I've got a USDZ export pipeline from Maya working with animation, and the files load up nicely on the Vision Pro. I've been checking out the animated sample files on the Augmented Reality/Quick Look sample page, specifically the first three at the top of the page, and I would like to know how they are created. I'm a 3D modeler and animator, not a programmer, so I'm dipping my toe into RCP and Xcode/SwiftUI, but I could use some informative tutorials on proper workflow.

For example, in the Lunar Rover sample, there are lines emanating from the model, and then text windows appear. Would I need to create all these extras inside Reality Composer Pro? I'd like to start creating immersive, narrative experiences (both in a volume and fully immersive), but for prototyping, I want to learn the proper way to add this type of functionality. I think I remember seeing something about "schemas" being involved. I'm assuming there might be some coding to set up in RCP so that when items are selected, an associated animation is triggered. Can anyone point me towards the relevant documentation to help me get started? Remember, I don't code. ;)

Here are my recent Vision Pro experimentations: https://youtube.com/playlist?list=PLCH753rZ9r6eqXxpIemaSlcyYxjFgR210&si=P_7AY2aL97Upm61i

I'm also proficient with Unreal Engine, but getting content packaged and over to AVP is still not ready for prime time, so I'm exploring the native approach. Thanks for helping point me in the right direction!
Replies: 0 · Boosts: 0 · Views: 292 · Activity: Jul ’24
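Following up on the "select an item, trigger an animation" part: samples like the Lunar Rover are generally RCP content plus a small amount of SwiftUI/RealityKit glue. Below is a hedged sketch of that minimal glue, assuming the default visionOS Xcode template, where the RCP package exposes realityKitContentBundle and a scene named "Immersive" (both names come from the template; adjust to your project). Tapping the model plays the animation baked into the tapped entity's USDZ.

```swift
import SwiftUI
import RealityKit
import RealityKitContent   // the Reality Composer Pro package from the default template

// Hedged sketch: load an RCP scene, make it tappable, and play the tapped entity's
// baked USDZ animation. "Immersive" is the default template's scene name.
struct RoverView: View {
    var body: some View {
        RealityView { content in
            if let scene = try? await Entity(named: "Immersive", in: realityKitContentBundle) {
                // Entities need collision shapes plus an input target to receive taps.
                scene.generateCollisionShapes(recursive: true)
                scene.components.set(InputTargetComponent())
                content.add(scene)
            }
        }
        .gesture(
            TapGesture()
                .targetedToAnyEntity()
                .onEnded { value in
                    // Note: the animation often lives on the USDZ's root entity rather
                    // than the tapped mesh, so you may need to walk up the ancestors.
                    if let animation = value.entity.availableAnimations.first {
                        value.entity.playAnimation(animation, transitionDuration: 0.2)
                    }
                }
        )
    }
}
```

The callout lines and text windows in the sample are typically done with SwiftUI attachments in a RealityView; the tap-to-animate glue above is the coding piece this post asks about.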