Hello!
I would like to do exactly this:
https://youtu.be/Cun8K7ctKp0?si=TgWvtdw-VdlBVL0R
I can't seem to find any documentation on getting a PS5 controller hooked up properly in Reality Composer Pro and Xcode, driving a character with animation (or moving an object around freely), and then bringing that over to the Vision Pro.
Additionally, I would like to learn how to use a controller to move a VR camera around a scene, so that we can navigate custom-built spaces - similar to Meta virtual environments or Steam VR home environments.
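For reference, here is a minimal sketch of the first step - detecting a connected DualSense (or any extended gamepad) with the GameController framework and reading its left thumbstick. The class name and movement handling are illustrative, not from any official sample:

import GameController

final class ControllerInput {
    /// Latest left-thumbstick value, read by whatever code drives the character or camera rig.
    private(set) var moveVector = SIMD2<Float>.zero
    private var connectObserver: NSObjectProtocol?

    init() {
        // Fires when a controller (e.g. a DualSense paired over Bluetooth) connects.
        connectObserver = NotificationCenter.default.addObserver(
            forName: .GCControllerDidConnect, object: nil, queue: .main
        ) { [weak self] note in
            guard let controller = note.object as? GCController,
                  let gamepad = controller.extendedGamepad else { return }
            gamepad.leftThumbstick.valueChangedHandler = { _, x, y in
                self?.moveVector = SIMD2<Float>(x, y)
            }
        }
    }
}

As far as I know, the app also needs to declare controller support via the GCSupportsControllerUserInteraction Info.plist key for the input to be delivered on visionOS.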
Typically, I would do this with Unreal Engine, but unfortunately, AVP support is still in its infancy there. So I figured, why not try to do it natively?
Any help with concrete tutorials or documentation would be greatly appreciated.
Thx!
If I create a visionOS app project, it automatically creates a RealityKitContent package. However, if I create a Multiplatform project (for visionOS, macOS, and iOS), the package is not added. Is it possible to add it manually? How? (using Xcode 16.1 beta 2)
I saw an OnNotification trigger in the Behaviors configuration of Reality Composer Pro, which asks me to enter a notification name - that is, it expects Swift code in Xcode to post a notification carrying that name. I hope you can show me how to use Swift in Xcode to send a notification containing that notification name. (There is an answer in https://developer.apple.com/forums/thread/756978, but it doesn't work.)
Also, in the timeline in Reality Composer Pro there is a Notification action, which is used to send messages to Swift. How can I have Swift detect whether the Notification action has sent its message? (There is an answer in https://developer.apple.com/videos/play/wwdc2024/10102/, but it doesn't work.)
I have asked this question before (https://developer.apple.com/forums/thread/756978). Those answers used to work, but they are all invalid on the latest system. I hope you can help me. Thank you.
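For reference, a sketch of both directions, assembled from the snippets that appear later in this section; the scene name "Scene" and the identifier "StartAnimation" are placeholders for whatever is entered in Reality Composer Pro:

import SwiftUI
import RealityKit
import RealityKitContent

struct NotificationBridgeView: View {
    @State private var root: Entity?

    // A Notification action in an RCP timeline posts this same notification name.
    private let timelineNotifications = NotificationCenter.default.publisher(
        for: Notification.Name("RealityKit.NotificationTrigger"))

    var body: some View {
        RealityView { content in
            // "Scene" is a placeholder for your RCP scene name.
            if let scene = try? await Entity(named: "Scene", in: realityKitContentBundle) {
                content.add(scene)
                root = scene
            }
        }
        .onTapGesture {
            // Send: fire an OnNotification behavior named "StartAnimation" (placeholder).
            guard let root else { return }
            NotificationCenter.default.post(
                name: Notification.Name("RealityKit.NotificationTrigger"),
                object: nil,
                userInfo: [
                    "RealityKit.NotificationTrigger.Scene": root.scene as Any,
                    "RealityKit.NotificationTrigger.Identifier": "StartAnimation"
                ])
        }
        .onReceive(timelineNotifications) { output in
            // Receive: detect a Notification action fired from an RCP timeline.
            guard let name = output.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String else { return }
            print("Timeline notification received: \(name)")
        }
    }
}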
Given my limited knowledge of physics, I would appreciate it if individuals with a solid understanding of the subject could provide insights into this matter. I have added a physics component to an entity in Reality Composer Pro, but I am seeking guidance on how to achieve the following:
Make an object float in the air (with a slight downward motion reminiscent of the moon’s surface)
Enable the object to move at a slow pace
Implement a strong rebound force
I would be grateful if you could provide appropriate values for these parameters. Thank you for your assistance.
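For reference, the same behavior can be approximated from code; here is a rough sketch under stated assumptions - the values are guesses to be tuned, and the low-gravity feel is faked by applying a small upward force every frame rather than changing gravity itself:

import RealityKit

// `model` is a ModelEntity that already has a collision shape (placeholder).
func configureMoonPhysics(for model: ModelEntity) {
    // High restitution gives a strong rebound; low friction keeps movement slow and floaty.
    let material = PhysicsMaterialResource.generate(friction: 0.1, restitution: 0.9)
    let body = PhysicsBodyComponent(massProperties: .default, material: material, mode: .dynamic)
    model.components.set(body)
}

// Call every frame (for example from a custom System). With a mass of about 1 kg this
// cancels roughly 80% of gravity, so the entity drifts downward slowly, moon-style.
func applyMoonGravity(to model: ModelEntity) {
    model.addForce([0, 9.8 * 0.8, 0], relativeTo: nil)
}

In the Reality Composer Pro inspector the equivalent knobs are the physics material's friction and restitution plus the body's mass; the exact numbers are something to tune by eye.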
Hello, I am getting the following error in the console and my app crashes. The screen goes dark, then the Apple logo appears, and the app crashes:
apply fence tx failed (client=0x61dbbfd7) [0xfffffecc (ipc/mig) server died]
[C:3] Error received: Connection interrupted.
Failed to commit transaction (client=0x94097449) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xe9684b50) [0x10000003 (ipc/send) invalid destination port]
[C:3-1] Error received: Connection interrupted.
Failed to commit transaction (client=0xbcac17e9) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x52392119) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xff841d17) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xdef5c915) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xefdc8bf3) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xd50c1eff) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x15690a46) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0xf296f56b) [0x10000003 (ipc/send) invalid destination port]
Failed to commit transaction (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_read_request_report [C1] Receive failed with error "No message available on STREAM"
nw_protocol_socket_reset_linger [C1:2] setsockopt SO_LINGER failed [22: Invalid argument]
apply fence tx failed (client=0x61dbbfd7) [0x10000003 (ipc/send) invalid destination port]
Failed to set override status for bind point component member.
Failed to set override status for bind point component member.
Failed to set override status for bind point component member.
Message from debugger: Terminated due to signal 9
Could you please tell me what the reason is and how I can resolve this? After I launch the app two or three times it works fine from that point onwards, but this happens from time to time while debugging.
Hello! I am working on some cool project reconstructions for a client. They design lobby-sized installations with LED walls, and I am being tasked with converting these over to AR at scale. I've got my first test in headset and it looks great! However, the desire to walk around beyond the 10' x 10' safe-area zone totally takes one out of the immersive VR experience - which is pretty counterintuitive. Is there any way for us developers to bypass this hard limit, so that clients who are requesting more room-scale options can actually enjoy this in VR?
Alternatively, is there a way to hook up a PS5 controller to a "player start" so I can navigate inside the VR volume? I'm really trying to embrace Reality Composer Pro, but it seems extremely limiting while I wait for Unreal Engine to get its act together. Sigh. Thanks for any help or suggestions.
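For reference, one hedged take on the "player start" idea is to keep the whole scene under a single root entity and translate that root opposite to the thumbstick input, so the viewer appears to move; the entity name, speed, and update hook below are illustrative:

import RealityKit
import GameController

// Call once per frame with the frame's delta time; `worldRoot` is the entity that
// contains the entire RCP scene (placeholder name).
func updateLocomotion(worldRoot: Entity, deltaTime: Float) {
    guard let pad = GCController.current?.extendedGamepad else { return }
    let x = pad.leftThumbstick.xAxis.value
    let y = pad.leftThumbstick.yAxis.value
    let speed: Float = 1.5 // meters per second, tune to taste
    // Pushing the stick forward should pull the world toward the viewer (-z),
    // which reads as walking forward.
    worldRoot.position += SIMD3<Float>(-x, 0, y) * speed * deltaTime
}

Since this is artificial locomotion, it may need comfort treatment (vignetting, snap turns), but that's a separate design decision.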
In RealityKit, I know that an HDR image is pre-computed, and that through the ImageBasedLight component a specular object can be made to reflect the content of that HDR image.
If the mirror-like object is very large - for example a long, continuous glass door - then after assigning an IBL image to it, the reflection is visibly distorted as the object moves through space, because the IBL is a picture of the surroundings captured at a single point, while the glass door is an extended surface.
Is there a truly real-time specular-reflection setup in RealityKit that can reflect the model on the opposite side of the glass door?
I have no clue at all. Any suggestions are welcome. Thank you.
How do I create a fire effect in Reality Composer Pro? Should I use a particle emitter?
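Reality Composer Pro's Particle Emitter component is the usual tool for this. For reference, the same component can also be configured in code; the sketch below is a starting point only - the values are guesses to tune by eye, and the property names should be checked against the current ParticleEmitterComponent API:

import RealityKit
import UIKit

// Attach a simple orange-to-red emitter to an entity as a rough fire starting point;
// refine the look in Reality Composer Pro's inspector afterwards.
func addFireEmitter(to entity: Entity) {
    var emitter = ParticleEmitterComponent()
    emitter.mainEmitter.birthRate = 500
    emitter.mainEmitter.lifeSpan = 0.8
    emitter.mainEmitter.color = .evolving(
        start: .single(UIColor.orange),
        end: .single(UIColor.red.withAlphaComponent(0.0)))
    emitter.speed = 0.3 // initial particle speed along the emission direction
    entity.components.set(emitter)
}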
Behavior:
The Orbit animation doesn't play, for either the OnTap trigger or the OnAddedToScene trigger. It is not an issue with my code, because I tested with an Emphasize (float) animation and it works perfectly.
Environment:
ARKit + RealityKit, iOS18
My animation timeline settings:
A simple Orbit animation block with a target and a pivot entity: 1 s duration, clockwise orbit direction, axis (0, 1, 0), 1 revolution, and blend layer 300.
My Behavior setting:
OnTap -> play the animation
TLDR: Timeline does not play animation when Repeat Forever is checked.
Hi! I have created a timeline for my model that does a built-in emphasize animation. Then I added a behavior to my model and set OnAddedToScene with an action to run that timeline. It works perfectly well on my device, but I want the timeline to loop. I realized that there's no loop option in the timeline itself, but I noticed that I can loop it if I insert it into another timeline (the Loop checkbox shows up). So I did that and set my model's behavior to run that outer timeline - but then the model doesn't play the animation as intended.
Note: I am not making a VisionPro app, but an iOS app leveraging ARKit and RealityKit
Environment: iPhone 13 Pro Max with iOS18.0
Code:
import SwiftUI
import RealityKit
import ARKit

struct ARViewContainer: UIViewRepresentable {
    func makeUIView(context: Context) -> ARView {
        let arView = ARView(frame: .zero)
        arView.session.run(ARWorldTrackingConfiguration())
        Task {
            do {
                let anchor = AnchorEntity(plane: .horizontal)
                let emojiScene = try await Entity(named: "SunglassesScene", in: bubbleAR)
                anchor.addChild(emojiScene)
                arView.scene.addAnchor(anchor)
            } catch {
                print("Failed to load models: \(error)")
            }
        }
        return arView
    }

    func updateUIView(_ uiView: ARView, context: Context) {}
}
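For what it's worth, a possible workaround (an assumption on my part, not an official fix) is to skip the timeline's Repeat Forever checkbox and loop the animation from code, if the emphasize animation shows up in the entity's availableAnimations:

// Hypothetical workaround: play the first available animation in an endless loop.
if let animation = emojiScene.availableAnimations.first {
    emojiScene.playAnimation(animation.repeat(), transitionDuration: 0.0, startsPaused: false)
}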
Thank you!
The 3D furniture model I built uses some smooth, specular-reflective materials. I would like them to reflect only the HDR image of the ImageBasedLight component I set myself, without reflecting the light of the real AR environment. How can I achieve this in the following scenarios?
How to avoid being affected by the real AR environment's lighting when using PBR materials.
When using Shader Graph, how can Environment Radiance not be affected by the real AR environment's lighting?
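For reference, a hedged sketch of one direction to try for the PBR case: pair your own ImageBasedLightComponent and ImageBasedLightReceiverComponent with EnvironmentLightingConfigurationComponent, whose environmentLightingWeight (as I understand it, available on iOS 18 / visionOS 2) scales down the real-world environment lighting for that entity. The resource name "studio_hdr" is a placeholder:

import RealityKit

func applyCustomIBL(to furniture: Entity) async throws {
    let environment = try await EnvironmentResource(named: "studio_hdr")
    // Light and reflect from the custom HDR only.
    furniture.components.set(ImageBasedLightComponent(source: .single(environment)))
    furniture.components.set(ImageBasedLightReceiverComponent(imageBasedLight: furniture))
    // Reduce the contribution of real-world environment lighting toward zero for this entity.
    furniture.components.set(EnvironmentLightingConfigurationComponent(environmentLightingWeight: 0))
}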
A ShaderGraphMaterial with an Occlusion Surface output generated with Reality Composer Pro 2 fails to load on iOS 18 and macOS 15 with the following error:
RealityFoundation.ShaderGraphMaterial.LoadError.invalidTypeFound (https://developer.apple.com/documentation/realitykit/shadergraphmaterial/loaderror/invalidtypefound)
This happens with both https://developer.apple.com/documentation/shadergraph/realitykit/occlusion-surface-(realitykit) and https://developer.apple.com/documentation/shadergraph/realitykit/shadow-receiving-occlusion-surface-(realitykit)
RealityView { content in
    do {
        let bgEntity = ModelEntity(mesh: .generateCone(height: 0.5, radius: 0.1), materials: [SimpleMaterial(color: .red, isMetallic: true)])
        bgEntity.position.z = -0.2
        content.add(bgEntity)

        let occlusionMaterial = try await ShaderGraphMaterial(named: "/Root/OcclusionMaterial", from: "OcclusionMaterial")
        let testEntity = ModelEntity(mesh: .generateSphere(radius: 0.4), materials: [occlusionMaterial])
        content.add(testEntity)
        content.cameraTarget = testEntity
    } catch {
        print("Shader Graph Load Error:")
        dump(error)
    }
}
.realityViewCameraControls(.orbit)
.edgesIgnoringSafeArea(.all)
Feedback ID: FB15081296
The Shader Graph node Blurred Background (RealityKit) - https://developer.apple.com/documentation/shadergraph/realitykit/blurred-background-(realitykit) - works fine within the Reality Composer Pro 2 editor but isn't working on iOS 18 or macOS 15. Instead of the blurred content, it just renders as opaque in a single color (Screenshot 2).
Interestingly, it also fails to render within Reality Composer Pro when no other entities are in the scene, e.g. when only a background skybox is set.
Expected Behavior: It would be great if this node worked the same way as it does on visionOS since this would allow for really interesting and nice effects for scenes.
Feedback ID: FB15081190
Is there a good way to have an app open scenes from Reality Composer Pro, without that scene being part of the app? Kind of like you would browse any other file.
I have a business where I provide walk-throughs of building plans. I do this by building an app containing a Reality Composer Pro scene of the customer's building.
For each customer I duplicate the app and change the scene content (one app per customer).
In order to scale my business I would love to be able to distribute the scenes to the customers so they could just open them in the app, but I don't see how to do that.
If you have any idea as to how this can be done, that would be great!
I have very little experience in coding, so please assume I don't know what you are talking about when explaining.
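For reference, a hedged sketch of the usual direction for this kind of setup: export each customer's scene from Reality Composer Pro as a .usdz (or .reality) file, let the app download or open it like an ordinary file, and load it at runtime with Entity(contentsOf:) instead of compiling it into the app. The file name and location below are placeholders:

import RealityKit

// Load a scene that was delivered as a separate file (e.g. saved into the app's
// Documents folder) rather than bundled into the app itself.
func loadCustomerScene(fileName: String = "CustomerBuilding.usdz") async throws -> Entity {
    let documents = try FileManager.default.url(
        for: .documentDirectory, in: .userDomainMask, appropriateFor: nil, create: true)
    let sceneURL = documents.appendingPathComponent(fileName)
    return try await Entity(contentsOf: sceneURL)
}

One caveat: Reality Composer Pro-specific features such as custom components and timelines may not survive a plain USDZ export, so it is worth testing what actually carries over.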
I am trying to use OnNotification in a BehaviorComponent to fire up my composed timeline, which consists of one TransformTo action and one Hide action, followed by a Notification action indicating that the other two actions have finished.
With this post, I successfully send a notification to RCP to fire up my timeline with an identifier:
NotificationCenter.default.post(
    name: NSNotification.Name("RealityKit.NotificationTrigger"),
    object: nil,
    userInfo: [
        "RealityKit.NotificationTrigger.Scene": scene,
        "RealityKit.NotificationTrigger.Identifier": "onSomethingStart"
    ]
)
On the other hand, to subscribe to that Notification action, I attach an onReceive below my RealityView and successfully receive my notification:
private let notificationTrigger = NotificationCenter.default.publisher(
    for: Notification.Name("RealityKit.NotificationTrigger"))

// attached below the RealityView:
.onReceive(notificationTrigger) { out in
    guard let entity = out.userInfo?["RealityKit.NotificationTrigger.SourceEntity"] as? Entity,
          let notificationName = out.userInfo?["RealityKit.NotificationTrigger.Identifier"] as? String else { return }
    debugPrint("Received notification: \(notificationName), entity name: \(entity.name)")
}
This means my timeline is fired, because I receive the notification posted by the Notification action in my timeline.
But the other two actions just don't appear to be working. When I play the timeline in RCP it works fine.
Did I miss anything to make it tick?
Xcode 16.1 beta
visionOS beta 9
I want to use SIMD values with a design-time component.
public struct SomeComponent: Component, Codable {
    public var Magnitude: SIMD3<Float> = .zero
}
Is extra work required? I had understood that serialization of simple values including SIMD would be handled by Reality Composer Pro.
At run time I get the error:
decodeComponent: Unexpected error: keyNotFound(CodingKeys(stringValue: "Magnitude", intValue: nil), Swift.DecodingError.Context(codingPath: [], debugDescription: "No value associated with key CodingKeys(stringValue: "Magnitude", intValue: nil) ("Magnitude").", underlyingError: nil))
Asset deserialization failed. Asset type "SceneAsset". Details: Failed to deserialize "/container/@shared/17/object". Reason: Failed to deserialize Swift Codable component of type RealityKitContent.SomeComponent.
I am integrating a SwiftUI interface into Unity on visionOS, but the system drag bar at the bottom of the window is too far away from the interface. How can I solve this problem?
My visionOS app has over 150 3D models that work flawlessly via Firebase URLs. I also have some RealityKitContent scenes (stored locally) that are getting pretty large, so I'm looking to move those into Firebase as well and call on them as needed. I can't find any documentation that has worked for this, and I can't find anyone who has talked about it. Does anyone know if this is possible?
I tried exporting the Reality Composer Pro scenes as USDZs and then importing them, but kept getting material errors. Maybe there's a way to call each 3D model separately and have RealityKit build them into the scene? I'm not really sure. Any help would be much appreciated!
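For reference, a hedged sketch of the pattern that works for individual models hosted at a URL, which should in principle also apply to an exported scene file; the remote URL is a placeholder, and whether RCP-specific materials and behaviors survive the export is exactly the open question here. As far as I know RealityKit only loads from local file URLs, hence the download step:

import Foundation
import RealityKit

// Download a .usdz (or .reality) file to a local URL, then load it with RealityKit.
func loadRemoteModel(from remoteURL: URL) async throws -> Entity {
    let (tempURL, _) = try await URLSession.shared.download(from: remoteURL)
    // Give the file its proper extension so RealityKit recognizes the format.
    let localURL = FileManager.default.temporaryDirectory
        .appendingPathComponent(remoteURL.lastPathComponent)
    try? FileManager.default.removeItem(at: localURL)
    try FileManager.default.moveItem(at: tempURL, to: localURL)
    return try await Entity(contentsOf: localURL)
}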
RealityKitContent bundle resource issue
Recently I keep encountering weird loading bugs with the RealityKitContent bundle. I am trying to load an audio resource as an AudioFileResource or AudioFileGroupResource from a *.usda in the RealityKitContent bundle. My code is nothing complicated, simply this:
let primPath: String = "/SampleAudios/SE_bounce_audio"
guard let resource = try? AudioFileGroupResource.load(named: primPath, from: "MyScene.usda", in: realityKitContentBundle) else {
    return
}
At runtime the program "sometimes" reports that it "Cannot find MyScene.usda:/SampleAudios/SE_bounce_audio in RealityKitContent.bundle" - whenever I change something in RCP it sometimes works again, but the behavior is unpredictable.
I put MyScene.usda under the root folder of the RealityKitContent package because I found that RealityKit cannot find any *.usda scene unless it is at the root level (which could be a bug in the way it indexes its files).
I even double-checked my .usda file with usdview; the primPath is absolutely correct. I think there are some unknown issues with how RealityKitContent copies resources and builds the package. I tried playing with the Package.swift file a bit to see if I could manually copy my resources (everything) and let the package carry them, but it just didn't work. So right now I keep the file untouched as below (only upgrading swift-tools-version to 6.0, since that is required for .visionOS(.v2)):
// swift-tools-version:6.0
// The swift-tools-version declares the minimum version of Swift required to build this package.

import PackageDescription

let package = Package(
    name: "RealityKitContent",
    platforms: [
        .visionOS(.v2)
    ],
    products: [
        // Products define the executables and libraries a package produces, and make them visible to other packages.
        .library(
            name: "RealityKitContent",
            targets: ["RealityKitContent"]),
    ],
    dependencies: [
        // Dependencies declare other packages that this package depends on.
        // .package(url: /* package url */, from: "1.0.0"),
    ],
    targets: [
        // Targets are the basic building blocks of a package. A target can define a module or a test suite.
        // Targets can depend on other targets in this package, and on products in packages this package depends on.
        .target(
            name: "RealityKitContent"
        ),
    ]
)
That is just issue one, the RealityKitContent package build issue.
Audio file format issue
The other issue is about the audio file formats RCP supports. I remember a session (WWDC?) saying that .wav and .mp4 are supported as audio sources. But when I try to set up Spatial Audio, I find that *.wav and *.mp3 files can sometimes also be imported as an AudioSourceFile - yet the behavior is unpredictable. With two *.wav files, SE_ball_hit_01.wav and SE_ball_hit_02.wav, only SE_ball_hit_01.wav is accepted; 02 is reported as having an unsupported format. Check out my screenshots for the details of the two files; they have almost the same format (same sample rate and channel count).
I understand there might be different requirements for a source file to be used as Spatial versus Ambient audio, but I haven't figured them out, and I can't find anything helpful in the Apple documentation. So what are the rules?
Thanks for reading, and any thoughts are welcome.