I have an AVPlayer that loads a video and displays it on a screen ModelEntity in the immersive view using a VideoMaterial. This also makes the video untappable, since it is just a VideoMaterial on the entity.
Here's the code:
import AVFoundation
import RealityKit

func makeVideoPlayer() -> AVPlayer {
    // Reuse the existing mesh of the "screen" entity in the scene.
    let screenModelEntity = model.garageScreenEntity as! ModelEntity
    let modelEntityMesh = screenModelEntity.model!.mesh

    // Load the video file from the app bundle.
    let url = Bundle.main.url(forResource: "<URL>", withExtension: "mp4")!
    let asset = AVURLAsset(url: url)
    let playerItem = AVPlayerItem(asset: asset)

    // Drive a VideoMaterial with the player and reapply it to the screen's mesh.
    let player = AVPlayer()
    let material = VideoMaterial(avPlayer: player)
    screenModelEntity.components[ModelComponent.self] = .init(mesh: modelEntityMesh,
                                                              materials: [material])
    player.replaceCurrentItem(with: playerItem)
    return player
}
I was able to load and play the video. However, I cannot figure out how to show the player controls (AVPlayerViewController) to the user, similar to the DestinationVideo sample app.
How can I add the video player controls in this case?
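For context, one direction I've been considering, though it's only a minimal, untested sketch: instead of AVPlayerViewController, host AVKit's SwiftUI VideoPlayer (which draws the standard transport controls) in a RealityView attachment next to the screen entity. The attachment id, frame size, and placement below are placeholder guesses:

import SwiftUI
import RealityKit
import AVKit

struct ImmersiveScreenView: View {
    let player: AVPlayer  // the same player that drives the VideoMaterial

    var body: some View {
        RealityView { content, attachments in
            // ... add the screen ModelEntity to `content` here ...
            if let controls = attachments.entity(for: "controls") {
                controls.position = [0, -0.5, 0]  // hypothetical: float the controls below the screen
                content.add(controls)
            }
        } attachments: {
            Attachment(id: "controls") {
                // AVKit's SwiftUI player view renders the standard playback controls.
                VideoPlayer(player: player)
                    .frame(width: 600, height: 100)
            }
        }
    }
}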
Reality Composer
Prototype and produce content for AR experiences using Reality Composer.
Posts under Reality Composer tag
I'm a newbie to Reality Composer and have constructed a couple of simple projects.
I added text, and the "A a" placeholder is displayed, but as soon as I select it to edit its properties, Xcode crashes.
I'm running Ventura 13.3.1, Reality Composer 1.5, and Xcode 14.3.1.
I'd appreciate guidance on how to fix this.
Thanks!
Hi everyone, is it possible to use a 3D USDZ file to train a model in Create ML? I see there is an image option, but it would be good to use these files from Reality Composer's Object Capture. Or is this in the works for forthcoming Xcode updates? Many thanks, Stuart
I've got a couple of 2D PNG assets that I want to add to a scene made of a couple of other usdz files in RCP (picture adding a couple of 2D videogame characters to a simple 3D diorama).
When I try to drag the PNGs to the workspace or the file tree…nothing happens.
I found a walkthrough on Medium (called "Importing and Exporting Personalized Objects for Augmented Reality: Reality Composer and SwiftUI" for those curious as I can't link to Medium posts here) that makes it look like users could do this with simple drag-and-drop. The Medium post is from June 2023, and in the screenshots RCP visually looks a lot more like Reality Composer on iPad, so I'm assuming it's changed a lot since then?
Is there still a way to do this? I've tried adding the 2D elements to a scene with Blender's "import images as planes" feature, but I'm getting weird halos around them and was hoping RCP could make the process a bit easier/cleaner.
I have a custom material in Reality Composer.
When I attach it to a cube and try loading the scene in Xcode, the material cannot be cast to a ShaderGraphMaterial because it has been changed to a PhysicallyBasedMaterial.
The material was always a Custom material, I did not change the type in Reality Composer.
Does anyone know how to fix this?
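For reference, a minimal sketch of a workaround I've been considering: instead of casting the material off the loaded entity, load the ShaderGraphMaterial directly by its path in the scene. The scene file name, material path, and the RealityKitContent bundle are assumptions about my setup, not confirmed values:

import RealityKit
import RealityKitContent

func loadCustomMaterial() async throws -> ShaderGraphMaterial {
    // Hypothetical paths: "Scene.usda" and "/Root/MyCustomMaterial" stand in
    // for the actual scene file and the material's location in its hierarchy.
    try await ShaderGraphMaterial(named: "/Root/MyCustomMaterial",
                                  from: "Scene.usda",
                                  in: realityKitContentBundle)
}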
I'm following the Meet Reality Composer Pro walkthrough and ran into something that didn't function as expected.
When I got to the step where I add five "Bird_With_Audio.usda" references to the scene, I found they did not play audio. After some trial and error, I found that Preview > Resource in each of their Spatial Audio items was set to "None." If I click the dropdown menu, I see several "Bird_Calls" groups to pick from.
I checked the original Bird_With_Audio.usda that I had created, and the "Bird_Calls" audio group was correctly assigned and worked. I tried dragging a sixth Bird_With_Audio into the scene and confirmed that the Spatial Audio item suddenly empties, rendering the bird silent.
I was able to go through each of the five birds and set their Spatial Audio Resource to Bird_Calls, and the group worked like the video demonstrates.
While this fixed the issue, as a beginner I'd like to know why it happened. It doesn't seem right that I would build an item and then have to re-attach its sounds every time I place it in the main scene. So…where did I mess up?
I have been digging into learning shader graphs by watching Unity shader graph content, cause lots of the same concepts apply.
One thing I noticed was that in Unity, each node in the shader graph has a little preview. I don't think this exists in Reality Composer Pro, but is there any way to mimic it (like, can I hook up a node that allows me to debug the graph at that point)?
If not, I'm happy to just file a feedback about it, but just thought I'd ask!
I am very new to shaders, never used one of the large systems like Unity. However I have started exploring visionOS programming and that led me to create some effects for materials in Reality Composer Pro.
I have been overwhelmed with the possibilities, but also kind of lost. I understand that RCP's shaders are based on MaterialX, so maybe there are tutorials on the web that cover how to create procedural effects (fire, wind, water, etc.)? I've stumbled through…but it's slow going. Are there any good resources that talk about how to use the various nodes to create procedural effects?
For example, it took me a while to figure out that using the “time” node allows me to animate cool color changes, especially when combined with various math and remap nodes.
Just looking for some basic resources, I think. Would shader graph tutorials for Unity apply to using RCP? Are the node types similar enough?
I'm working on a project wherein RealityKit for iOS will be used to display 3D files (USDZ) in a real-world environment. The model will also need to animate differently depending on which button is pressed. When using models that are downloaded from various websites or via Apple QuickLook, the code functions well. I can hold the animation in place and click a button to play it.
Unfortunately, although the model my team provided (exported through Blender) animates in SceneKit, it does not play at all when placed in the real world, not even when a button is pressed.
I checked the file with the RealityKit USDZ tool and found that the usdz file is not valid, but couldn't figure out what's wrong.
Could you please help me figure out what's wrong with my USDZ file?
Working USDZ: https://developer.apple.com/augmented-reality/quick-look/models/drummertoy/toy_drummer_idle.usdz
My file: https://drive.google.com/file/d/1UibIKBy2fx4q0XxSNodOwQZMLgktKiKF/view?usp=sharing
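For reference, this is roughly how the animation is triggered on the models that do work (a minimal sketch; the function name is a placeholder):

import RealityKit

func playFirstAnimation(on entity: Entity) {
    // If availableAnimations comes back empty here, the animation
    // likely didn't survive the Blender export.
    if let animation = entity.availableAnimations.first {
        entity.playAnimation(animation.repeat(),
                             transitionDuration: 0.3,
                             startsPaused: false)
    }
}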
Dear Apple Developer Forum Community,
I hope this message finds you well. I am writing to seek assistance regarding an error I encountered while attempting to create a "Hello World" application using Xcode.
Upon launching Xcode and starting a new project, I followed the standard procedure for creating a simple iOS application. However, during the process, I encountered an unexpected error that halted my progress. The error message I received was [insert error message here].
I have attempted to troubleshoot the issue (see the two attached images), but unfortunately I have been unsuccessful in resolving it.
I am reaching out to the community in the hope that someone might have encountered a similar issue or have expertise in troubleshooting Xcode errors. Any guidance, suggestions, or solutions would be greatly appreciated.
Thank you very much for your time and assistance.
Sincerely,
Zipzy games
Is it possible to render an SF Symbol on a plane in RealityKit, for example as a flat 2D image without any depth?
I thought of MeshResource's generateText, but you cannot interpolate SF Symbols into the string (their literal text representation displays instead).
Perhaps rendering it as a texture on a plane? Any thoughts?
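To make the texture idea concrete, here's an untested sketch of what I had in mind: rasterize the symbol with UIKit, then apply it as an unlit texture on a generated plane. The symbol point size and plane dimensions are arbitrary:

import RealityKit
import UIKit

enum SymbolTextureError: Error { case imageUnavailable }

func makeSymbolPlane(systemName: String) throws -> ModelEntity {
    // Rasterize the SF Symbol into a CGImage at a reasonably high resolution.
    let config = UIImage.SymbolConfiguration(pointSize: 512)
    guard let image = UIImage(systemName: systemName, withConfiguration: config),
          let cgImage = image.cgImage else {
        throw SymbolTextureError.imageUnavailable
    }
    // Turn the bitmap into a texture and put it on an unlit plane.
    let texture = try TextureResource.generate(from: cgImage,
                                               options: .init(semantic: .color))
    var material = UnlitMaterial()
    material.color = .init(texture: .init(texture))
    return ModelEntity(mesh: .generatePlane(width: 0.2, height: 0.2),
                       materials: [material])
}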
I'm trying to import the USDZ file of a model with multiple textures attached to each part of the model. When I preview the file by double-clicking on the USDZ, it views fine.
However, when I import it into Reality Composer Pro, it only shows the pink striped model.
I also get the message - "Multiple root level objects exist for HU_EVO_SPY-8.usdc".
There are so many components in the model that binding each texture to each component manually would be very difficult.
How can I fix the file such that when I import to Reality Composer Pro, textures are attached to the model?
Activity Monitor reports that Reality Composer Pro uses 150% CPU and is always the number one energy user on my M3 Mac. Unfortunately, the high CPU usage continues when the app is hidden or minimized. I can understand high usage when a scene is visible and being interacted with, but this appears to be a bug. Can anyone else confirm this, or does anyone have a workaround?
Can the scene processing at least be paused when the app is hidden?
Or better yet, find out why the CPU usage is so high when the scene is not changing.
Reality Composer Pro Version 1.0 (409.60.6) on Sonoma 14.3
Thanks
I would like to apply different textures to the front and back faces of a 3D model. Specifically, when applying a texture that cuts the object in half through opacity, I want to be able to see the back face of the object and give it a different color than the front face.
In Unity, there is an 'isFrontFace' boolean node that allows applying different colors to the front and rear faces. However, I am unsure how to achieve the same effect in Reality Composer Pro!
The 3D model is already two-sided.
I created a simple primitive shape and I want to add a color to each face of the cube. I was thinking of using Shader Graph, but I have no idea how to target each face with a different color. Any lead or help would be great. This tech is new, so documentation is still sparse.
I just updated to Xcode 15.2 and I want to try Reality Composer Pro. I saw in the Apple developer video that it should be under Xcode -> Developer Tool -> Reality Composer Pro, but when I open that menu, Composer Pro isn't there.
On the Apple webpage for Reality Composer it says, "Reality Composer for macOS is bundled with Xcode, which is available on the Mac App Store."
Where can I find Reality Composer Pro?
Thanks
iOS 17 Safari will not successfully open .reality (Reality Composer) files from a weblink.
For example, this code works fine on iOS 16:
The same code generates this message on iOS 17 after clicking on the link:
"Object requires a newer version of iOS." QuickLook fails to render anything.
I validated that the .reality file works fine when opened from iOS 17 Files app, so it's not a damaged file.
I have various .reality files published on a website as part of a learning product, which I deployed Feb. 2023 using the latest Reality Composer at the time.
Users informed me that none of the .reality files will open on iOS 17, which I have confirmed. They still open fine on iOS 16.
On iOS 17 the QuickLook viewer says "Object requires a newer version of iOS."
What gives? Did Apple deprecate .reality files, or are these designed to work on only one version of iOS?
Hello, I am very new here in the forum (and to iOS dev as well). I am trying to build an app that uses 3D face filters, and I want to use Reality Composer. I knew Xcode 15 did not have it, so I downloaded the beta 8 version (as suggested in another post). This one actually has Reality Composer Pro (Xcode -> Developer Tools -> Reality Composer Pro), but the Experience.rcproject still does not appear. Is there a way to create one? When I use Reality Composer Pro, it seems only able to create standalone projects, and it does not seem to be bundled with Xcode in any way. Thanks for your time, people!
In my Reality Composer scene, I have added spatial audio. How do I play this from my Swift code?
I loaded the scene the following way:
myEntity = try await Entity(named: "grandScene", in: realityKitContentBundle)
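In case it helps to show what I'm after, here's an untested sketch of the kind of call I'm imagining; the audio resource path and the emitter entity name are placeholders that would depend on the scene's hierarchy:

import RealityKit
import RealityKitContent

func playSceneAudio(on myEntity: Entity) async throws {
    // Hypothetical resource path: the name assigned to the audio file in the scene.
    let resource = try await AudioFileResource(named: "/Root/BirdCall_wav",
                                               from: "grandScene.usda",
                                               in: realityKitContentBundle)
    // Play from the entity that carries the Spatial Audio component,
    // so the sound is spatialized at its position.
    if let emitter = myEntity.findEntity(named: "SpatialAudio") {
        emitter.playAudio(resource)
    }
}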