My version of Reality Composer Pro is 1.0, from Xcode 15 beta 2. Every time I click the 'Particle Emitter' button, it crashes. I also can't open Diorama, the demo from the documentation.
Reality Composer Pro
Leverage the all-new Reality Composer Pro, designed to make it easy to preview and prepare 3D content for your visionOS apps.
Posts under Reality Composer Pro tag
200 Posts
Hi,
I would like to learn how to create custom materials using Shader Graph in Reality Composer Pro. I would like to know more about Shader Graph in general, including node descriptions and how the material's display changes when nodes are connected. However, I cannot find a manual for Shader Graph in Reality Composer Pro. This leaves me totally clueless on how to create custom materials.
Thanks.
Sadao Tokuyama
https://1planet.co.jp/
https://twitter.com/tokufxug
When I create a USDZ file in the original Reality Composer (non-Pro) and view it in the visionOS simulator, the transforms and rotations don't match.
For example, a simple Tap and Flip behavior does not rotate the same way in visionOS.
Should we regard Reality Composer as discontinued software and work only with Reality Composer Pro?
Hopefully Apple will combine the features from the original RC into the new RC Pro!
I've been playing around with visionOS and am currently trying to work with audio, but somehow following the steps from the WWDC session isn't working for me.
What I'm trying to do is play audio when you navigate to a certain view.
In the WWDC session, they set up audio programmatically like this:
let audioSource = Entity()
audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))
audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])
if let audio = try? await AudioFileResource(named: "audioFile",
                                            configuration: .init(shouldLoop: true)) {
}
But I'm getting this error and cannot proceed:
'AudioFileResource' cannot be constructed because it has no accessible initializers
I couldn't quite follow how to do it from Reality Composer Pro either, so I'm stuck.
Any help is really appreciated.
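In case it helps anyone hitting the same error: one possible workaround, sketched under the assumption that the older static factory API is still available in your SDK, is to load the resource with `AudioFileResource.load(named:)` instead of the newer initializer. The resource name "audioFile" is the placeholder from the snippet above.

```swift
import RealityKit

// Hedged workaround sketch: use the older static load API, which predates
// the initializer shown in the WWDC session. "audioFile" is a placeholder
// resource name; the entity setup mirrors the session code.
func makeAudioSource() throws -> Entity {
    let audioSource = Entity()
    audioSource.spatialAudio = SpatialAudioComponent(directivity: .beam(focus: 0.75))
    audioSource.orientation = .init(angle: .pi, axis: [0, 1, 0])
    let resource = try AudioFileResource.load(named: "audioFile", shouldLoop: true)
    // Start playback on the entity once the resource is loaded.
    audioSource.playAudio(resource)
    return audioSource
}
```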
@Apple Really excited by the updates on the Reality Composer and Object Capture side. Is it possible to publish the sample apps and project material referenced in both the Reality Composer Pro sessions and the "Meet Object Capture for iOS" session? It looks like they haven't yet been posted to the Sample Code site.
Thank you!
https://developer.apple.com/videos/play/wwdc2023/10273/
https://developer.apple.com/videos/play/wwdc2023/10191/
Hi folks, I have two questions, if you can help me. First: as far as I've tested, none of the environment trackers work in the simulator. I know RealityKit needs data from the LiDAR sensor or camera to detect the environment and depth. Is the environment in the Xcode simulator just an HDR image? Does that mean we can't make any app for Vision Pro until we get a device? In all the Apple videos, world tracking is shown on a real device, never in the Xcode simulator.
Second question: are we able to add lights to, or remove the default light from, a scene? Whatever I did had no effect on my scene. I'm a 3D modeler too (in Blender), so I can understand the graphs in Composer Pro, but many of them don't produce any visible effect.
Thank you so much!
Hi folks,
I have an M1 iMac (a new Mac) that takes a long time to render a model during hot reload. I also tried an i5 quad-core Mac, which takes even longer than the M1. Can anyone suggest the recommended Mac configuration for developing Vision Pro apps?
Very often when I try to load an Entity in RealityView, the system gives me an error:
The operation couldn’t be completed. (Swift.CancellationError error 1.)
RealityView { content in
    do {
        let scene = try await Entity(named: "Test_Scene", in: realityKitContentBundle)
        content.add(scene)
    } catch {
        debugPrint("error loading scene", error)
    }
}
The scene I created contains a basic square.
This is the main file:
struct first_app_by_appleApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .windowStyle(.volumetric)
    }
}
and inside the Info.plist file I set UIApplicationPreferredDefaultSceneSessionRole to UIWindowSceneSessionRoleVolumetricApplication.
I think this could be a system bug, and I'm not surprised since it's the first beta. If so, do you know any workarounds?
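One possible workaround, sketched under the assumption that the cancellation comes from the view being torn down and re-created while the load is in flight, is to treat `CancellationError` as benign and let the next instantiation of the view retry the load:

```swift
import SwiftUI
import RealityKit
import RealityKitContent

// Sketch only: catch the cancellation separately from real load failures.
// "Test_Scene" is the scene name from the question above.
struct SceneView: View {
    var body: some View {
        RealityView { content in
            do {
                let scene = try await Entity(named: "Test_Scene", in: realityKitContentBundle)
                content.add(scene)
            } catch is CancellationError {
                // The enclosing view was torn down while loading; the replacement
                // view's RealityView closure will retry, so this can be ignored.
            } catch {
                debugPrint("error loading scene", error)
            }
        }
    }
}
```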
If you have a scene with a simple custom .usda material applied to a primitive like a cube, the exported (.usdz) material definition is unknown to tools like Reality Converter Version 1.0 (53) or Blender Version 3.6.1.
Reality Converter shows warnings: "Missing references in USD file", "Invalid USD shader node in USD file".
Even Reality Composer Pro is unable to recreate the material correctly from its own exported .usdz files.
Feedback: FB12699421
I'm trying to load a 200 MB model with more than 200,000 triangles. Could anyone tell me whether the triangle count is what causes the issue loading the model in Reality Composer Pro?
Hello,
I am writing to you today to inquire about my application for the Apple Vision Pro Developer Kit. I submitted my application three times, but each time the application page displayed the text "%%USE_I18N_STRING_ERROR%%". After I submitted my application, the page I was redirected to said "We'll be back soon". I am not sure if my application was successful. I have not received any email confirmation.
Would you be able to check the status of my application and let me know if there is anything else I need to do?
Thank you for your time and consideration.
Sincerely,
Sadao Tokuyama
https://twitter.com/tokufxug
https://www.linkedin.com/in/sadao-tokuyama/
Hello community, I'm trying to open a window that is larger than the default initial dimensions; the native Freeform app manages wider-than-usual window dimensions just fine.
But the documentation says that "In visionOS, all windows open with the same initial dimensions." I tried to set a greater width on the window frame, but there does indeed seem to be a hard-fixed maximum: I can make the window smaller programmatically, but not bigger.
Am I doing something wrong or missing anything?
What are your thoughts on how Freeform's larger window even gets implemented? It might not be a window at all; for example, it might be layered on a large RealityView.
Any input would be appreciated!
When using Xcode 15 Beta 5, a project created with Xcode 15 Beta 3 seemingly fails to open any scene within the RealityKitContentBundle, while a newly created project works flawlessly.
How can I reregister my bundle?
Was going through the "Meet Reality Composer Pro" video (thanks Eric!) and, while working through the Particle Emitter section, found that the presets section was missing.
When clicking the presets entry point (the icon with 6 squares), a strange UI element shoots up near the Position X-axis box (near the mouse cursor in the screenshot).
Am using Version 1.0 (372.2.11) of Reality Composer Pro. Thanks so much!
Hello,
I have created a material in RCP and applied it via "Material Binding" to 3D geometry within my scene; however, when I export it as a USDZ file and open it in Xcode, the material changes have not persisted.
Any suggestions on why this is so?
Hello, how does one add the capability for a user to rotate an object in visionOS?
I have tried to add a DragGesture to no avail; it will not register.
There must be something native we can use to allow a user to manipulate a 3D model.
This is how I am putting the model into the scene:
let loadedEntity = try await Entity.load(named: itemName, in: RealityKitContent.realityKitContentBundle)
loadedEntity.generateCollisionShapes(recursive: true)
content.add(loadedEntity)
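For what it's worth, a sketch of one possible approach, assuming the entity has collision shapes and an `InputTargetComponent` so gestures can target it, is to map the drag translation to a yaw rotation. The entity name and the 0.01 radians-per-point scale are placeholder assumptions:

```swift
import SwiftUI
import RealityKit

// Sketch: rotate a targeted entity around the Y axis as the user drags.
struct RotatableModelView: View {
    @State private var baseYaw: Float = 0

    var body: some View {
        RealityView { content in
            if let entity = try? await Entity(named: "itemName") {
                // Collision shapes and an input target are needed for the
                // entity to receive gestures.
                entity.generateCollisionShapes(recursive: true)
                entity.components.set(InputTargetComponent())
                content.add(entity)
            }
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Map horizontal drag distance to a yaw angle (assumed scale).
                    let angle = baseYaw + Float(value.translation.width) * 0.01
                    value.entity.orientation = simd_quatf(angle: angle, axis: [0, 1, 0])
                }
                .onEnded { value in
                    baseYaw += Float(value.translation.width) * 0.01
                }
        )
    }
}
```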
I have a .stl file in "Standard Tessellated Geometry File Format". How do I get this into a format that visionOS can use?
Notes:
double-clicking the .stl launches Xcode and shows the scene graph
I've added it to a template visionOS app. It comes in with a gray icon that is otherwise identical to the blue icon that the template creates (named "Scene"), but I cannot figure out how to get the model contained in the .stl recognized.
Reality Composer Pro will not open the .stl file
Many thanks!
Is there support for Blend Shapes in RealityKit? I don't see controls for them in the API, and the Blend Shapes that should be on my model aren't appearing in Reality Composer Pro.
Hi, in this video, at approximately 11:24, a checkbox called "Collisions" appears on the selected ParticleEmitter (right at the bottom).
I can't seem to find it in my build of Reality Composer Pro, and I can't find another way to enable collisions on the particles of the ParticleEmitter.
I'm on Version 1.0 (385.2) of RCP.
Any help on this would be great.
Thanks.
Hi. Given an RC entity
var physicsBody = tappedObject.entity.components[PhysicsBodyComponent.self]!
how does one call
func applyImpulse(_ impulse: SIMD3<Float>, at position: SIMD3<Float>, relativeTo referenceEntity: Entity?)
to impose an impulse on the object?
Thanks!
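A sketch of how this might be called, assuming the tapped entity is a `ModelEntity` (which conforms to `HasPhysicsBody`): the impulse methods are called on the entity itself rather than on the copied-out component. The impulse vector here is an arbitrary example value:

```swift
import RealityKit

// Sketch only: impulses act on the entity, not on a copy of its
// PhysicsBodyComponent, and only when the body is in .dynamic mode.
func push(_ entity: Entity) {
    guard let body = entity as? ModelEntity,
          body.physicsBody != nil else { return }
    // The physics body must be dynamic for impulses to have an effect.
    body.physicsBody?.mode = .dynamic
    body.applyImpulse([0, 2, 0],                       // example upward impulse
                      at: body.position(relativeTo: nil), // world-space point
                      relativeTo: nil)                   // world space
}
```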