In RealityView, when I use AnchorEntity to lock to a wall, it runs normally:
AnchorEntity(.plane(.vertical, classification: .wall, minimumBounds: .one))
But when I use a plane classification other than wall, the view doesn't display. Why is that?
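For comparison, a minimal sketch of anchoring to a horizontal classification (`modelEntity` is a placeholder name; note that minimumBounds: .one demands a detected plane of at least 1 m × 1 m, which smaller surfaces such as tables may never satisfy, so smaller bounds are used here):

RealityView { content in
    // Hypothetical sketch: anchor to a table; the alignment must suit the classification.
    let tableAnchor = AnchorEntity(.plane(.horizontal,
                                          classification: .table,
                                          minimumBounds: [0.2, 0.2]))
    tableAnchor.addChild(modelEntity)
    content.add(tableAnchor)
}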
In a visionOS project, how can a model in RealityView track a certain part of the body, such as the left foot, right leg, or left arm, and how can I store the model's position as it follows that body part in a variable (in order to transmit body-part information during a call)?
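As far as I can tell, visionOS ARKit exposes hand tracking but not full-body (foot/leg/arm) tracking; a minimal sketch under that assumption, storing a hand's world-space position in a variable (`latestLeftHandPosition` and `trackLeftHand` are hypothetical names):

import ARKit

let session = ARKitSession()
let handTracking = HandTrackingProvider()
var latestLeftHandPosition: SIMD3<Float> = .zero   // hypothetical storage variable

func trackLeftHand() async throws {
    try await session.run([handTracking])
    for await update in handTracking.anchorUpdates where update.anchor.chirality == .left {
        // originFromAnchorTransform is the anchor's pose in world space;
        // column 3 of the matrix holds the translation.
        let t = update.anchor.originFromAnchorTransform.columns.3
        latestLeftHandPosition = SIMD3<Float>(t.x, t.y, t.z)
    }
}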
Please let me know whether Apple has published a document about the Persona virtual camera.
import SwiftUI
import TipKit

struct ChatRoomView: View {
    @StateObject private var socketManager = SocketIOManager()
    @State private var inputText: String = ""
    @StateObject var viewModel = SignInWithAppleViewModel()
    @Binding var isCall: Bool
    @State private var isSheet = false
    @State private var showView = false
    var learnListTip = KeyTip()
    @Binding var showShareSheet: Bool
    @Binding var codeshar: String

    var body: some View {
        NavigationStack {
            VStack {
                if let roomCode = socketManager.roomCode {
                    ZStack {
                        VStack {
                            HStack {
                                Text("Room Key: \(roomCode)")
                                    .font(.title)
                                    .onAppear {
                                        codeshar = roomCode
                                        self.isCall = true
                                    }
                                Button(action: {
                                    self.showShareSheet = true
                                }, label: {
                                    Image(systemName: "square.and.arrow.up.fill")
                                        .accessibilityLabel("Share")
                                })
                            }
                            .padding(20)
                            TipView(learnListTip, arrowEdge: .top)
                                .glassBackgroundEffect()
                                .offset(z: 20)
                            Spacer()
                        }
                        List(socketManager.messages, id: \.self) { message in
                            Text(message)
                        }
                        TextField("input", text: $inputText)
                        Button("send") {
                            socketManager.sendMessage(roomCode: roomCode, message: inputText)
                            inputText = ""
                        }
                    }
                    .sheet(isPresented: $showShareSheet) {
                        let shareContent = "Open SpatialCall, Join this Room, Key is: \(codeshar)"
                        ActivityView(activityItems: [shareContent])
                    }
                } else {
                    HStack {
                        Button(action: {
                            withAnimation {
                                socketManager.createRoom()
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.circle.fill")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !showView)
                                    .font(.largeTitle)
                                Text("Add Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(60)
                        Button(action: {
                            withAnimation {
                                self.isSheet = true
                            }
                        }, label: {
                            VStack {
                                Image(systemName: "phone.badge.checkmark")
                                    .symbolRenderingMode(.multicolor)
                                    .symbolEffect(.appear, isActive: !showView)
                                    .font(.largeTitle)
                                Text("Join Room")
                                    .font(.title3)
                            }
                        })
                        .buttonStyle(.borderless)
                        .buttonBorderShape(.roundedRectangle)
                        .padding(.horizontal, 30)
                        .glassBackgroundEffect()
                        .offset(z: 20)
                        .scaleEffect(1.5)
                        .padding(70)
                    }
                }
            }
            .onAppear {
                // Reveal the symbol effects shortly after the view appears.
                DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
                    withAnimation {
                        self.showView = true
                    }
                }
            }
            .sheet(isPresented: $isSheet) {
                VStack {
                    Text("Join Room")
                        .font(.largeTitle)
                    Text("You need to get the key to the room.")
                    TextField("Key", text: $inputText)
                        .padding(30)
                        .textFieldStyle(.roundedBorder)
                    Button(action: {
                        socketManager.joinRoom(roomCode: inputText)
                        self.isSheet = false
                    }, label: {
                        Text("Join Room")
                            .font(.title3)
                    })
                    .padding(50)
                }
                .padding()
            }
            .sheet(isPresented: $socketManager.showRoomNotFoundAlert) {
                Text("The room does not exist. Please check whether the Key you entered is correct.")
                    .font(.title)
                    .frame(width: 500)
                    .padding()
                Button(action: {
                    self.socketManager.showRoomNotFoundAlert = false
                }, label: {
                    Text("OK")
                        .font(.title3)
                })
                .padding()
            }
        }
    }
}
In the above code (this is a visionOS project), when I click Share, the sheet doesn't display normally, and the TipView isn't displayed either. Why?
Just now, we learned the exciting news that Vision Pro is about to be released! I would like to ask whether there are any services for developers, such as discounts or rentals. If so, can I use them normally in China?
I found that my visionOS Simulator is very strange: many functions and features are missing. For example, I learned online that the immersive Environments scenes can be opened in the visionOS Simulator, but when I click them, nothing happens. And it's not only that: many other system features that I've seen other developers have are missing for me. I'm worried this will affect the testing of my app. Why is this happening?
Some information:
Xcode version: the latest Xcode 15.1 beta.
Device: iMac (2021)
Simulator build number: 21N305
I didn't find any problems in my program, and Xcode didn't report any errors in the code, but when I ran it, it inexplicably failed with:
Command CompileAssetCatalog failed with a nonzero exit code
What should I do?
I hope to display the USDA model from Reality Composer Pro and play its Spatial Audio. I used RealityView to implement this:
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        content.add(entity)
        // Find the child entity that carries the spatial audio source, then load
        // the audio resource from the same USDA file.
        guard let audioEntity = entity.findEntity(named: "SpatialAudio"),
              let resource = try? await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                                          from: "isWateringBasin.usda",
                                                          in: RealityKitContent.realityKitContentBundle) else { return }
        let audioPlaybackController = audioEntity.prepareAudio(resource)
        audioPlaybackController.play()
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
But when I ran it, I found that although the model displayed normally, the Spatial Audio failed to play. I hope to get some guidance, thank you!
Is Persona open to developers? If it is open, which document describes the information about Persona?
Once a visionOS app is developed, when can it be submitted for review?
I implemented multiple languages through Localizable.strings in my visionOS app. Now I want to test whether the localizations work properly, so I wanted to change the system language in Settings before testing, but I can't find the relevant page in Settings. What should I do?
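One possible workaround (my assumption, not a documented answer): override the app's language per scheme instead of changing the system language, then verify at runtime:

// In Xcode: Product > Scheme > Edit Scheme > Run > Arguments Passed On Launch, add e.g.
//   -AppleLanguages (zh-Hans)
//   -AppleLocale zh_CN
// Then confirm which localization is active at runtime:
print(Locale.preferredLanguages.first ?? "unknown")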
I implemented plane detection using ARKit in a visionOS app:
import SwiftUI
import RealityKit
import ARKit

struct ContentView: View {
    @State private var ok = false
    let session = ARKitSession()
    let planeData = PlaneDetectionProvider(alignments: [.horizontal, .vertical])

    var body: some View {
        Group {
            if ok {
                TableView()
            } else {
                SwiftUIView()
            }
        }
        .onAppear {
            Task {
                try await session.run([planeData])
                for await update in planeData.anchorUpdates {
                    // Only react to planes classified as tables.
                    guard update.anchor.classification == .table else { continue }
                    switch update.event {
                    case .added, .updated:
                        ok = true
                    case .removed:
                        ok = false
                    }
                }
            }
        }
    }
}
When I ran it, Xcode told me that this is not supported in the Simulator, so how can I test this program? If there is no way other than applying for the Vision Pro Developer Kit or attending a lab, I hope you can tell me whether this View can realize the following behavior:
When there is no table in the user's sight, SwiftUIView() is displayed at the coordinates x: 0, y: 0, z: 0. If there is a table in the user's sight, TableView() is displayed on the table.
If the above can't be realized, I hope you can give me some advice. Thank you!
In a visionOS app project, how can I fix a view to the user's hand through ARKit?
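A minimal sketch of one approach (this uses RealityKit's hand anchoring rather than raw ARKit joint data; `handMenuEntity` is a hypothetical name for the content to display):

RealityView { content in
    // Anchor content to the user's left palm; it follows the hand automatically.
    let palmAnchor = AnchorEntity(.hand(.left, location: .palm))
    palmAnchor.addChild(handMenuEntity)
    content.add(palmAnchor)
}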
In a visionOS app project, you can lock onto a wall / floor / ceiling / table / seat / window / door through plane classifications. How can I lock a View to one of them and change a Bool variable to true after it locks successfully?
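One possible sketch under my own assumptions (`isLocked` and `lockToWall` are hypothetical names): run ARKit plane detection so you can both position content on the classified plane and flip a flag once it is actually detected:

let session = ARKitSession()
let planeData = PlaneDetectionProvider(alignments: [.vertical])
var isLocked = false   // hypothetical flag, flipped once a wall plane is detected

func lockToWall(_ root: Entity) async throws {
    try await session.run([planeData])
    for await update in planeData.anchorUpdates where update.anchor.classification == .wall {
        if update.event == .added {
            // Move the content to the detected wall's pose in world space.
            root.transform = Transform(matrix: update.anchor.originFromAnchorTransform)
            isLocked = true
        }
    }
}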
I want to play a RealityKitContent USDA model's Spatial Audio. I use this code:
RealityView { content in
    do {
        let entity = try await Entity(named: "isWateringBasin", in: RealityKitContent.realityKitContentBundle)
        let audio = entity.spatialAudio
        entity.playAudio(audio)
        content.add(entity)
    } catch {
        print("Entity encountered an error while loading the model.")
        return
    }
}
Xcode says that entity.playAudio(audio) needs an 'AudioResource' for the audio argument. Excuse me, what should the AudioResource be?
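For reference, a minimal sketch of what the compiler is asking for, reusing the resource names from the earlier post (an assumption that they match your project): playAudio(_:) takes an AudioResource, such as an AudioFileResource loaded from the USDA file:

// Load the audio baked into the USDA scene as an AudioFileResource,
// then hand that resource to playAudio(_:).
let resource = try await AudioFileResource(named: "/Root/isWateringBasinAudio_m4a",
                                           from: "isWateringBasin.usda",
                                           in: RealityKitContent.realityKitContentBundle)
entity.playAudio(resource)   // playAudio(_:) expects an AudioResource, not a component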
How can a visionOS app play Spatial Audio, Ambient Audio, and Channel Audio?
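A rough sketch of the pattern as I understand it (`entity` and `resource` are placeholders for a loaded entity and AudioResource): attach the component that matches the desired rendering, then play the resource on that entity:

// Pick one component per entity depending on how the audio should render:
entity.components.set(SpatialAudioComponent())    // positional audio placed in the scene
// entity.components.set(AmbientAudioComponent()) // fixed-direction ambience
// entity.components.set(ChannelAudioComponent()) // non-spatialized channel playback

let controller = entity.playAudio(resource)       // starts playback immediately
// Call controller.stop() to end playback.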