I suggest you use DispatchQueue to control when the command starts, and methods such as .move(to:relativeTo:duration:) to control the entity's motion.
In this example code, when the user meets some specific condition (I have not written the condition itself), the "Goose" entity (assuming your scene has a Goose) in Reality Composer Pro will move over 1 second, starting after a 3-second delay:
RealityView { content in
    if let model = try? await Entity(named: "Scene_Name_HERE", in: realityKitContentBundle) {
        content.add(model) // Show your scene
        if isUserCondition { // I have not written the condition itself
            // When the user meets the specific condition, do this:
            DispatchQueue.main.asyncAfter(deadline: .now() + 3) { // Wait 3 seconds
                if let goose = model.findEntity(named: "Goose") {
                    // Move the goose to the target transform over 1 second
                    goose.move(to: Transform_Position_Data_HERE, relativeTo: goose.parent, duration: 1.0)
                }
            }
        }
    }
}
Hope it helps!
Note: This is just an example, so the code may not run directly and may contain errors.
Rest assured: users will not perceive distant entities unless they walk a very long way in a large, open space, and ordinary users will not do that. 👀
Furthermore, for safety, the immersive environment of visionOS may automatically fade the scene to transparent once the user walks a certain distance, though this behavior should adapt to the user's surroundings.
You can try to use Room Tracking:
https://developer.apple.com/documentation/visionos/building_local_experiences_with_room_tracking
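If it helps, here is a minimal sketch of starting room tracking with ARKit (assuming visionOS 2+ and an open immersive space; the function name and print statements are my own illustration, not code from the linked article):

import ARKit

let session = ARKitSession()
let roomTracking = RoomTrackingProvider()

func startRoomTracking() async {
    do {
        // Run the session with the room tracking data provider
        try await session.run([roomTracking])
        for await update in roomTracking.anchorUpdates {
            // Each update delivers a RoomAnchor describing the room the user is in
            print("Room anchor \(update.anchor.id) was \(update.event)")
        }
    } catch {
        print("Failed to start room tracking: \(error)")
    }
}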
The position of a window cannot be modified programmatically in visionOS. For your requirements, I suggest you consider using an immersive space to achieve the desired effect; this approach lets the content move in sync with the user's movements. To track the user's head, you can use an AnchorEntity in RealityView.
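For example, a minimal sketch of head tracking with AnchorEntity (the box and its position are placeholder content of my own):

import SwiftUI
import RealityKit

struct HeadTrackedView: View {
    var body: some View {
        RealityView { content in
            // An anchor that continuously follows the user's head
            let headAnchor = AnchorEntity(.head)
            // Placeholder content, positioned 1 meter in front of the head
            let panel = ModelEntity(mesh: .generateBox(size: 0.3),
                                    materials: [SimpleMaterial(color: .white, isMetallic: false)])
            panel.position = [0, 0, -1]
            headAnchor.addChild(panel)
            content.add(headAnchor)
        }
    }
}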
In normal times, your info won’t be gone, but it’s a good idea to save it just in case. The Beta system is a bit wonky, and if it crashes, you might lose some stuff you can’t get back.
In addition: I found the problem. There is no download option for Predictive Code Completion under Components in my Xcode settings. A text hint tells me that my country does not support it. I am in China, and the machine was also bought in China. Is this an Xcode error, or is the feature really unavailable in China? If so, can I work around it with a VPN or some other method?
I asked Apple a similar question a few days ago: if an anchored real object is moved, can the anchor follow it? Their reply was that it can, as long as there is no large occlusion and the object does not move too fast. So for your problem, as long as the user's hand does not cover too much of the globe and the rotation speed is not too fast, it should be able to follow normally.
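As a rough sketch of what that anchoring looks like with ARKit's object tracking on visionOS 2 (the "Globe" reference object name is a placeholder of mine; you would train a .referenceobject file for your own globe):

import ARKit

let session = ARKitSession()

func trackGlobe() async {
    do {
        // Load a reference object trained for the globe (placeholder resource name)
        guard let url = Bundle.main.url(forResource: "Globe", withExtension: "referenceobject") else { return }
        let globeObject = try await ReferenceObject(from: url)
        let objectTracking = ObjectTrackingProvider(referenceObjects: [globeObject])
        try await session.run([objectTracking])
        for await update in objectTracking.anchorUpdates {
            // The anchor's transform follows the physical globe as it moves
            print("Globe transform:", update.anchor.originFromAnchorTransform)
        }
    } catch {
        print("Object tracking failed: \(error)")
    }
}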
@sadaotokuyama
I found that this problem is not with the code but with the USDZ model: when I use the code you gave me to display the USDZ model you provided on GitHub, the animation plays normally, but when I use the same code to display my own model, it does not play the animation. I hope you can tell me how to solve it, thank you!
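In case it's useful to anyone else, here's a small diagnostic sketch I would try (the function name is mine): if availableAnimations comes back empty, the animation was lost in the model export rather than in the playback code.

import RealityKit

// Check whether the loaded USDZ actually carries any animations
func playFirstAnimation(of entity: Entity) {
    if let animation = entity.availableAnimations.first {
        // Loop the first animation found in the file
        entity.playAnimation(animation.repeat(), transitionDuration: 0.3)
    } else {
        print("No animations found — the USDZ was likely exported without them.")
    }
}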