USDZ is a 3D file format used to deliver AR content, for example through AR Quick Look on a website.

USDZ Documentation

Posts under USDZ tag

71 Posts
Post not yet marked as solved
3 Replies
817 Views
I have a USDZ model called 'GooseNModel' in my visionOS app project. I'm sure the model contains an animation, so I wrote the following code to display it with the animation:

import SwiftUI
import RealityKit

RealityView { content in
    if let GooseNModel = try? await Entity(named: "GooseNModel"),
       let animation = GooseNModel.availableAnimations.first {
        GooseNModel.playAnimation(animation)
        content.add(GooseNModel)
    }
}

But when I ran it, the model appeared as a static model without the animation I expected. How can I solve this?
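One diagnostic worth trying (a sketch, not a confirmed fix, using the same asset name as above): log availableAnimations right after loading. If the count is 0, the USDZ itself carries no animation clips RealityKit can see and the problem is in the export pipeline rather than in the RealityView code; playing the clip with .repeat() also rules out the animation simply finishing before the model becomes visible.

import SwiftUI
import RealityKit

struct AnimationDebugView: View {
    var body: some View {
        RealityView { content in
            if let model = try? await Entity(named: "GooseNModel") {
                // If this prints 0, the animation was lost before it reached RealityKit.
                print("available animations:", model.availableAnimations.count)
                if let animation = model.availableAnimations.first {
                    // .repeat() keeps the clip looping instead of playing once.
                    model.playAnimation(animation.repeat())
                }
                content.add(model)
            }
        }
    }
}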
Posted by lijiaxu. Last updated.
Post not yet marked as solved
1 Reply
639 Views
Hey everyone, I'm running into an issue where my USDZ model doesn't show up in Reality Composer Pro. It was exported from Blender as a USD and converted in Reality Converter. See the attached image. It's strange, because the USDZ model appears fine in Preview, but once it is brought into RCP I receive this pop-up and the model does not appear. I'm not sure how to resolve this multiple-root-level issue. If anyone can point me in the right direction or offer any feedback, it would be much appreciated. Thank you!
Posted. Last updated.
Post not yet marked as solved
1 Reply
776 Views
In Reality Composer Pro, when I import a USDZ model and insert it into the scene, Reality Composer Pro removes the model's own material by default, but I don't want that. How can I keep Reality Composer Pro from removing the model's original material?
Posted by lijiaxu. Last updated.
Post not yet marked as solved
1 Reply
427 Views
Hello fellow developers, I am currently exploring the functionality of the UsdPrimvarReader node in the Shader Graph editor and would appreciate some clarification on its operating principles. Despite my efforts to understand it, I find myself in need of some guidance. Specifically, I would appreciate insights into how the UsdPrimvarReader node should ideally operate, what data should be specified in its Varname field, and which primvars can be read from a USD file. I am also curious about the correct way to declare a primvar in a USD file so that it can be referenced successfully. If anyone could share their expertise or point me toward relevant documentation, I would be immensely grateful. Thank you in advance for your time and consideration; I look forward to any insights or recommendations you may have.
Posted by TimFoster. Last updated.
Post not yet marked as solved
1 Reply
456 Views
I am trying to use my animated model in Xcode with SceneKit. I exported the model from Maya with animation data in .usd format, then converted it to .usdz with Reality Converter. When I open it in the Xcode viewer it is animated and everything is fine, but when I use it in my app it doesn't animate. On the other hand, Apple's example robot_walk_idle model does animate. Maybe I am missing an option in the export settings. Thanks for any help.

import SwiftUI
import SceneKit

struct ModelView: View {
    var body: some View {
        VStack {
            SceneView(scene: SCNScene(named: "robot_walk_idle.usdz"))
        }
    }
}
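One diagnostic worth trying (a sketch, not a confirmed fix; the function name is illustrative): walk the loaded scene's node tree and explicitly start every SCNAnimationPlayer that SceneKit attached. If no animation keys are printed at all, the clips were lost in the Maya-to-USDZ conversion rather than in playback.

import SceneKit

// Report and start every animation player attached to nodes in the scene.
func playAllAnimations(in scene: SCNScene) {
    scene.rootNode.enumerateHierarchy { node, _ in
        for key in node.animationKeys {
            print("animation key:", key, "on node:", node.name ?? "unnamed")
            node.animationPlayer(forKey: key)?.play()
        }
    }
}

Calling this on the SCNScene before handing it to SceneView should start any embedded clips.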
Posted. Last updated.
Post not yet marked as solved
2 Replies
430 Views
I have an Entity that loads a USDZ asset:

let modelName = "example.usdz"
guard let modelURL = Bundle.main.url(forResource: modelName, withExtension: nil) else {
    fatalError("Failed to find model file: \(modelName)")
}
let videoEntity = try! Entity.load(contentsOf: modelURL)

Now I am creating a video material using AVPlayer and VideoMaterial:

let asset = AVURLAsset(url: url)
let playerItem = AVPlayerItem(asset: asset)
let player = AVPlayer()
let material = VideoMaterial(avPlayer: player)

and applying the video material to a sphere:

videoEntity.components.set(ModelComponent(mesh: .generateSphere(radius: 1E3), materials: [material]))

But I want the video to use the mesh of my 3D model as its shape. Is there any way to achieve this?
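One possible approach (a sketch, assuming the loaded entity has at least one descendant with a ModelComponent): keep the meshes that came with the USDZ and only swap their materials for the video material, instead of generating a new sphere.

import RealityKit

// Recursively replace the materials on the entity's existing meshes with the video
// material, leaving the USDZ's own geometry untouched.
func apply(_ material: VideoMaterial, to entity: Entity) {
    if var model = entity.components[ModelComponent.self] {
        model.materials = Array(repeating: material, count: max(model.materials.count, 1))
        entity.components.set(model)
    }
    for child in entity.children {
        apply(material, to: child)
    }
}

Calling apply(material, to: videoEntity) after loading maps the video onto the model's own surfaces, subject to the UV layout authored in the USDZ.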
Posted by varxcx. Last updated.
Post marked as solved
2 Replies
598 Views
In previous versions of Xcode there was a simple option, Editor > Convert to .scn, available when a .usdz file was selected. I can no longer find this option; all I see when selecting a .usdz file is "Previous Page", "Next Page", "Play", and "Pause". Please provide some insight on how to convert a .usdz file to .scn in Xcode 15. Thanks!
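A possible fallback while the menu item is unavailable (a sketch, not the official workflow; the paths are placeholders): do the conversion programmatically, since SceneKit can read .usdz and write .scn.

import SceneKit

// Load a .usdz with SceneKit and write it back out as .scn.
// The output format is inferred from the destination file's extension.
let sourceURL = URL(fileURLWithPath: "/path/to/Model.usdz")
let destinationURL = URL(fileURLWithPath: "/path/to/Model.scn")

if let scene = try? SCNScene(url: sourceURL, options: nil) {
    let succeeded = scene.write(to: destinationURL, options: nil, delegate: nil, progressHandler: nil)
    print("conversion succeeded:", succeeded)
}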
Posted by cec5550. Last updated.
Post not yet marked as solved
2 Replies
658 Views
Hello Apple community, I hope this message finds you well. I'm writing to report an issue that I've encountered after upgrading my iPad to iPadOS 17. The problem seems to be related to the Quick Look AR application, which I use extensively for 3D modeling and visualization. Prior to the upgrade, everything was working perfectly fine. I create 3D models in Reality Composer and export them as USDZ files for use with Quick Look AR. However, after the upgrade to iPadOS 17, I've noticed a rather troubling issue.

Problem Description: When I view my 3D models using Quick Look AR on iPadOS 17, some of the 3D models exhibit a peculiar problem. Instead of displaying the correct textures, they show a bright pink texture in their place. This issue occurs only when I have subsequent scenes added to the initial scene. Strangely, the very first scene in the sequence displays the textures correctly.

Steps to Reproduce:
1. Create a 3D model in Reality Composer.
2. Export the model as a USDZ file.
3. Open the USDZ file using Quick Look AR.
4. Observe that the textures appear correctly on the initial scene.
5. Add additional scenes to the model.
6. Navigate to the subsequent scenes.
7. Notice that some of the 3D models display a pink texture instead of the correct textures (see picture).

Expected Behavior: The 3D models should consistently display their textures, even when multiple scenes are added to the scene sequence.

Workaround: As of now, there doesn't seem to be a viable workaround for this issue, which is quite problematic for my work in 3D modeling and visualization.

I would greatly appreciate any insights, solutions, or workarounds that the community might have for this problem. Additionally, I would like to know if others are experiencing the same issue after upgrading to iPadOS 17; this information could be helpful for both users and Apple in addressing the problem. Thank you for your attention to this matter, and I look forward to hearing from the community and hopefully finding a resolution to this Quick Look AR issue. Best regards
Posted by delcasda. Last updated.
Post not yet marked as solved
5 Replies
1k Views
Hi friends, my desired outcome is to import a USDZ file into a visionOS project in Xcode and see how it looks in the visionOS Simulator. I've imported a USDZ file and I can see it in Xcode, but when I click on ContentView all I see is the default Apple living room. I have taken the following steps:

- The file is included in the "Copy Bundle Resources" phase of my target.
- Selected my project in the navigator.
- Selected my app's target.
- Went to the "Build Phases" tab.
- Expanded the "Copy Bundle Resources" section, and the USDZ file is there.

If I click on the actual file I can see the full USDZ model floating on the grid. It's currently under Assets > ContentView > filename.usdz. Any suggestions or help are greatly appreciated.
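For reference, a minimal sketch of one way to display a bundled USDZ instead of the template's default content ("filename" stands in for the actual asset name; the file must be in Copy Bundle Resources, as described above):

import SwiftUI
import RealityKit

struct BundledModelView: View {
    var body: some View {
        RealityView { content in
            // Loads the USDZ from the app's main bundle.
            // If loading fails, try including the ".usdz" extension in the name.
            if let model = try? await Entity(named: "filename") {
                content.add(model)
            }
        }
    }
}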
Posted. Last updated.
Post not yet marked as solved
0 Replies
591 Views
Creating an Entity and then changing between different animations (walk, run, jump, etc.) is pretty straightforward when you have individual USDZ files in your Xcode project and simply create an array of AnimationResource. However, I'm trying to do it via the new Reality Composer Pro app, because the docs state it's much more efficient than individual files, but I'm having a heck of a time figuring out exactly how. Do you have one scene per USDZ file (and does that erase any advantage over just loading individual files)? One scene with multiple entities? Something else altogether? If I try one scene with multiple entities in it, when I try to change animations I always get "Cannot find a BindPoint for any bind path" logged in the console, and the animation never actually occurs. This is with the same files that animate perfectly when I just create an array of AnimationResource manually from individual/raw USDZ files. Anyone have any experience doing this?
Posted. Last updated.
Post marked as solved
2 Replies
613 Views
I can only download usdpython from the following website: https://developer.apple.com/augmented-reality/tools/ Where can I get other versions of usdpython? Is Apple's usdzconvert (usdpython) open source? I want to learn how Apple achieves the conversion from glTF to USDZ, because I'm currently using version 0.66 and I feel that its coverage of glTF features is not quite sufficient.
Posted by ClayQAQ. Last updated.
Post not yet marked as solved
2 Replies
1.4k Views
It seems that something must have changed in Reality Composer's USDZ export feature. Importing any animated .usdz file into Reality Composer and then exporting it reduces the playback frame rate to about 30% of the original. The same file imported and then exported as a .reality file plays back just fine. Is anyone else experiencing this issue? It's happening for every USDZ file I import, and across two different Apple laptops running the software.
Posted. Last updated.
Post not yet marked as solved
0 Replies
480 Views
Hello, I am posting this in the hope that you can give me some advice on what I would like to achieve: downloading a USDZ 3D model from a web server within a visionOS app and displaying it in a Shared Space volume (volumetric window) whose size fits the downloaded model.

Currently, after downloading the USDZ and creating a ModelEntity from it, I open a volumetric WindowGroup with openWindow, and the ModelEntity is added to the RealityViewContent of the RealityView in the view presented by openWindow. USDZ models downloaded this way appear in the volume on visionOS without any problems. However, the size of the downloaded USDZ models is not uniform, so a model may not fit in the volume. I am trying to create the WindowGroup with an appropriate value bound to defaultSize, but I am not sure which property of ModelEntity I can use to derive that value. The position in the attached image is also not correct; I would like to place the model lower if possible.

I would appreciate advice on sizing and positioning the downloaded USDZ so that it fits in the volume. Incidentally, I tried a plane-style window and found that the USDZ ModelEntity was displayed at a much larger scale compared to the volume, so I have decided not to support a plane-style window. If there is any information on how to properly set the position and size of USDZ files in visionOS and RealityKit, I would appreciate it if you could provide that as well.

Best regards.
Sadao Tokuyama
https://twitter.com/tokufxug
https://1planet.co.jp/tech-blog/category/applevisionpro
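One property that may help (a sketch, assuming the entity is fully loaded before measuring): Entity.visualBounds(relativeTo:) returns a bounding box whose extents can be turned into a Size3D for the volume and whose center can be used to recenter the model.

import RealityKit
import Spatial

// Measure a loaded entity so a volumetric window can be sized to fit it,
// and shift the entity so its bounding box is centered at the origin.
func fit(_ entity: Entity) -> Size3D {
    let bounds = entity.visualBounds(relativeTo: nil)   // bounds in world space
    entity.position -= bounds.center                    // recenter the model in the volume
    let extents = bounds.extents                        // width, height, depth in meters
    return Size3D(width: Double(extents.x),
                  height: Double(extents.y),
                  depth: Double(extents.z))
}

The returned Size3D could then drive the volumetric WindowGroup's defaultSize(_:in:), keeping in mind that defaultSize is only read when the window is created, so the size has to be known before calling openWindow.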
Posted. Last updated.
Post not yet marked as solved
0 Replies
329 Views
The Apple documentation seems to say RealityKit should obey the autoplay metadata, but it doesn't seem to work. Is the problem with my (hand-coded) USDA files, the Swift, or something else? Thanks in advance. I can make the animations run with an explicit call, but what have I done wrong that keeps the one cube from autoplaying? https://github.com/carlynorama/ExploreVisionPro_AnimationTests

import SwiftUI
import RealityKit
import RealityKitContent

struct ContentView: View {
    @State var enlarge = false

    var body: some View {
        VStack {
            // A ModelEntity, not expected to autoplay
            Model3D(named: "cube_purple_autoplay", bundle: realityKitContentBundle)

            // An Entity; actually expected this to autoplay
            RealityView { content in
                if let cube = try? await Entity(named: "cube_purple_autoplay", in: realityKitContentBundle) {
                    print(cube.components)
                    content.add(cube)
                }
            }

            // Scene has one cube that should autoplay, one that should not.
            // Neither does, but both will start (as expected) with a click.
            RealityView { content in
                // Add the initial RealityKit content
                if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                    content.add(scene)
                }
            } update: { content in
                // Update the RealityKit content when SwiftUI state changes
                if let scene = content.entities.first {
                    if enlarge {
                        for animation in scene.availableAnimations {
                            scene.playAnimation(animation.repeat())
                        }
                    } else {
                        scene.stopAllAnimations()
                    }
                    let uniformScale: Float = enlarge ? 1.4 : 1.0
                    scene.transform.scale = [uniformScale, uniformScale, uniformScale]
                }
            }
            .gesture(TapGesture().targetedToAnyEntity().onEnded { _ in
                enlarge.toggle()
            })

            VStack {
                Toggle("Enlarge RealityView Content", isOn: $enlarge)
                    .toggleStyle(.button)
            }.padding().glassBackgroundEffect()
        }
    }
}

No autoplay metadata:

#usda 1.0
(
    defaultPrim = "transformAnimation"
    endTimeCode = 89
    startTimeCode = 0
    timeCodesPerSecond = 24
    upAxis = "Y"
)

def Xform "transformAnimation" ()
{
    def Scope "Geom"
    {
        def Xform "xform1"
        {
            float xformOp:rotateY.timeSamples = { ... }
            double3 xformOp:translate = (0, 0, 0)
            uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:rotateY"]

            over "cube_1" (
                prepend references = @./cube_base_with_purple_linked.usd@
            )
            {
                double3 xformOp:translate = (0, 0, 0)
                uniform token[] xformOpOrder = ["xformOp:translate"]
            }
        }
    }
}

With autoplay metadata:

#usda 1.0
(
    defaultPrim = "autoAnimation"
    endTimeCode = 89
    startTimeCode = 0
    timeCodesPerSecond = 24
    autoPlay = true
    playbackMode = "loop"
    upAxis = "Y"
)

def Xform "autoAnimation"
{
    def Scope "Geom"
    {
        def Xform "xform1"
        {
            float xformOp:rotateY.timeSamples = { ... }
            double3 xformOp:translate = (0, 0, 0)
            uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:rotateY"]

            over "cube_1" (
                prepend references = @./cube_base_with_purple_linked.usd@
            )
            {
                quatf xformOp:orient = (1, 0, 0, 0)
                float3 xformOp:scale = (2, 2, 2)
                double3 xformOp:translate = (0, 0, 0)
                uniform token[] xformOpOrder = ["xformOp:translate", "xformOp:orient", "xformOp:scale"]
            }
        }
    }
}
Posted. Last updated.
Post not yet marked as solved
0 Replies
341 Views
I want to scan a person's face with the TrueDepth camera, generate a 3D face model, and display it in a scene, but I haven't found any solutions. Has anybody integrated this type of functionality? Please help me out or provide sample code to scan a face and preview the result in a SceneView or ARView.
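For reference, a minimal sketch of the standard ARKit face-tracking setup, which previews the live TrueDepth face mesh in an ARSCNView (it assumes a device with a TrueDepth camera and does not cover exporting the scan to a file):

import ARKit
import SceneKit
import UIKit

final class FaceMeshViewController: UIViewController, ARSCNViewDelegate {
    private let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        sceneView.delegate = self
        view.addSubview(sceneView)
        // ARFaceTrackingConfiguration.isSupported should be checked on real hardware.
        sceneView.session.run(ARFaceTrackingConfiguration())
    }

    // Attach a face-geometry node when ARKit detects a face.
    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard anchor is ARFaceAnchor,
              let device = sceneView.device,
              let geometry = ARSCNFaceGeometry(device: device) else { return nil }
        geometry.firstMaterial?.fillMode = .lines   // wireframe so the mesh is visible
        return SCNNode(geometry: geometry)
    }

    // Keep the mesh in sync as the face moves.
    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let faceAnchor = anchor as? ARFaceAnchor,
              let geometry = node.geometry as? ARSCNFaceGeometry else { return }
        geometry.update(from: faceAnchor.geometry)
    }
}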
Posted by Ashish20. Last updated.
Post not yet marked as solved
8 Replies
671 Views
We are trying to save a USDZ file through FileManager. Sometimes it saves, but sometimes we get an error like this:

path.absoluteURL file:///var/mobile/Containers/Data/Application/6D14A430-47B4-45F2-9D0D-6C31588A6A03/Documents/2896837C-C7E0-4FA8-BFE2-21A59B26D801.usdz
Warning: in SdfPath at line 151 of sdf/path.cpp -- Ill-formed SdfPath </2896837CC7E04FA8BFE221A59B26D801>: syntax error
Coding Error: in _IsValidPathForCreatingPrim at line 3338 of usd/stage.cpp -- Path must be an absolute path: <>
cannotCreateNode(path: "/2896837CC7E04FA8BFE221A59B26D801")

func saveFileLocal() {
    if let finalResult {
        let fm = FileManager.default
        var path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
        let fileName = "\(UUID().uuidString).usdz"
        path.appendPathComponent(fileName)
        do {
            try finalResult.export(to: path.absoluteURL)
        } catch {
            print(error)
        }
    }
}

func removeFiles() {
    var filePath = ""
    let fm = FileManager.default
    let path = fm.urls(for: .documentDirectory, in: .userDomainMask).first!
    do {
        let content = try fm.contentsOfDirectory(atPath: path.path)
        for c in content {
            filePath = path.appendingPathComponent(c).absoluteString
            if let url = URL(string: filePath) {
                try fm.removeItem(at: url)
            }
        }
    } catch {
        print(error)
    }
}
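A possible explanation (an assumption based on the error text, not a confirmed diagnosis): USD prim names cannot begin with a digit, and the exporter appears to derive the root prim name from the file name, so UUIDs that happen to start with a digit would fail while those starting with a letter succeed. A small change worth trying:

// Hypothetical tweak: make sure the file name (and therefore the derived
// USD prim name) starts with a letter rather than a digit.
let fileName = "model_\(UUID().uuidString).usdz"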
Posted. Last updated.
Post not yet marked as solved
0 Replies
329 Views
With the latest software upgrades such as Sonoma 14.0, Xcode 15.0 beta 8, and Reality Composer Pro, what is the alternative to Behaviors for a web-based AR app on iOS 17? Is there any example, documentation, or tutorial on how to use custom components to provide user interactivity, such as click events, and still be able to export the scene as USDZ?
Posted by delcasda. Last updated.
Post not yet marked as solved
0 Replies
458 Views
I am using the RoomPlanExampleApp to export a USDZ. When I open the USDZ in Reality Converter, I get this error:

Invalid Structure in USDZ file
The root layer of the package must be a usdc file and must not include any external dependencies that participate in stage composition.
Posted by Mcberri. Last updated.