Posts

Post not yet marked as solved
1 Reply
102 Views
Today I have tried to add a second archive action for visionOS. I had added a visionOS destination to my app target a while back and can build and archive my app for visionOS in Xcode 15.3 locally, and also run it on the device. Xcode Cloud is giving me the following errors in the Archive - visionOS action (Archive - iOS works):

Invalid Info.plist value. The value for the key 'DTPlatformName' in bundle MyApp.app is invalid.
Invalid sdk value. The value provided for the sdk portion of LC_BUILD_VERSION in MyApp.app/MyApp is 17.4 which is greater than the maximum allowed value of 1.2.
This bundle is invalid. The value provided for the key MinimumOSVersion '17.0' is not acceptable.
Type Mismatch. The value for the Info.plist key CFBundleIcons.CFBundlePrimaryIcon is not of the required type for that key. See the Information Property List Key Reference at https://developer.apple.com/library/ios/documentation/general/Reference/InfoPlistKeyReference/Introduction/Introduction.html#//apple_ref/doc/uid/TP40009248-SW1

All 4 errors are annotated with "Prepare Build for App Store Connect" and I get them for both "TestFlight (Internal Testing Only)" and "TestFlight and App Store" deployment preparation options. I have tried to remove the visionOS destination and add it back, but this is not changing the project at all. Any ideas what I am missing?
Posted by RK123.
Post not yet marked as solved
1 Reply
484 Views
Hi! I was able to set up Xcode Cloud and build manually without problems. The GitHub integration with Xcode also works: I can commit and push changes to my GitHub repo. Unfortunately, I am not able to trigger an Xcode Cloud build by pushing changes to my repo. I tried git push via Xcode 15.2 and via the command line; the push itself always works. I was under the impression this should "just work" if the workflow includes a start condition set to Branch Changes, Any Branch, with no custom conditions. I have one single branch. It fails silently, with no visible activity, neither in Xcode nor in the App Store Connect portal. Any ideas? Am I missing something? Best regards
Posted by RK123.
Post marked as solved
1 Reply
1.4k Views
Hi! I am getting all debug output in text-only format, without structure or color. I only noticed this now, after watching the respective WWDC session; I am on Xcode 15 beta 8, but I am pretty sure this was the case in the previous betas as well. Is the new structured debug console activated by some setting I am overlooking?
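
One thing worth checking (this is my understanding of the feature, not a confirmed answer): the structured console in Xcode 15 applies metadata and log-level styling to messages routed through OSLog, while plain print() output remains unstructured text. A minimal sketch, assuming a hypothetical subsystem and category:

import OSLog

// Messages sent through Logger carry subsystem/category metadata and
// log-level styling in Xcode 15's structured console; print() output
// shows up as plain, uncolored text.
let logger = Logger(subsystem: "com.example.MyApp", category: "general")

func demoLogging() {
    logger.debug("Fetching data")              // structured, with debug styling
    logger.error("Request failed: timeout")    // structured, highlighted as error
    print("plain text output")                 // unstructured, no color
}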
Posted by RK123.
Post not yet marked as solved
1 Reply
709 Views
Hi! While waiting for scene understanding to work in the visionOS simulator, I am trying to bounce a virtual object off a real-world wall or other real-world object on iOS ;). I load my virtual objects from a Reality Composer file where I set them to participate in physics with dynamic motion type. With this I am able to have them collide with each other nicely, and occlusion also works, but they go right through walls and other real-world objects rather than bouncing off... I've tried a couple of variations of the following code:

func makeUIView(context: Context) -> ARGameView {
    arView.environment.sceneUnderstanding.options.insert(.occlusion)
    arView.environment.sceneUnderstanding.options.insert(.physics)
    arView.debugOptions.insert(.showSceneUnderstanding)
    arView.automaticallyConfigureSession = false

    let config = ARWorldTrackingConfiguration()
    config.planeDetection = [.horizontal, .vertical]
    config.sceneReconstruction = .meshWithClassification
    arView.session.run(config)

    if let myScene = try? Experience.loadMyScene() {
        ...
        arView.scene.anchors.append(myScene)
    }

    return arView
}

I have found several references that this should "just work", e.g. in https://developer.apple.com/videos/play/tech-talks/609. What am I missing? Testing on an iPhone 13 Pro Max with iOS 16.5.1 🤔
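
For reference, a sketch of what I would double-check on the entity side. The entity name "ball", the extra .collision option, and the explicit physics setup are assumptions on my part, not a confirmed fix: the reconstructed mesh can only stop objects that themselves carry a collision shape and a dynamic physics body.

// Sketch only: "ball" is a hypothetical ModelEntity inside the loaded scene.
arView.environment.sceneUnderstanding.options.insert(.collision)

if let ball = myScene.findEntity(named: "ball") as? ModelEntity {
    // Make sure the entity has a collision shape...
    ball.generateCollisionShapes(recursive: true)
    // ...and a dynamic physics body with some restitution so it can bounce.
    let bouncy = PhysicsMaterialResource.generate(friction: 0.5, restitution: 0.8)
    ball.components.set(PhysicsBodyComponent(massProperties: .default,
                                             material: bouncy,
                                             mode: .dynamic))
}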
Posted by RK123.
Post not yet marked as solved
2 Replies
1.6k Views
Hi! With the following code I am able to receive the location of taps in my RealityView. How do I find out which of my entities was tapped (in order to execute an animation or movement)? I was not able to find anything close to ARView's entity(at:) unfortunately. Am I missing something, or is this not possible in the current beta of visionOS?

struct ImmersiveView: View {
    var tap: some Gesture {
        SpatialTapGesture()
            .onEnded { event in
                print("Tapped at \(event.location)")
            }
    }

    var body: some View {
        RealityView { content in
            let anchor = AnchorEntity(.plane(.horizontal,
                                             classification: .table,
                                             minimumBounds: [0.3, 0.3]))
            // adding some entities here...
            content.add(anchor)
        }
        .gesture(tap.targetedToAnyEntity())
    }
}
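
A sketch of how the tapped entity can be read from the gesture value, assuming each tappable entity has a collision shape and an InputTargetComponent (both are needed for the gesture to target it); this is my understanding of the API rather than a confirmed answer:

// modelEntity is a placeholder for one of the entities added to the anchor.
modelEntity.generateCollisionShapes(recursive: true)
modelEntity.components.set(InputTargetComponent())

var tap: some Gesture {
    SpatialTapGesture()
        .targetedToAnyEntity()
        .onEnded { value in
            // value is an EntityTargetValue wrapping the SpatialTapGesture value;
            // value.entity is the RealityKit entity that received the tap.
            print("Tapped \(value.entity.name) at \(value.location)")
        }
}

With the targeting applied inside the gesture itself, the RealityView modifier becomes .gesture(tap) instead of .gesture(tap.targetedToAnyEntity()).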
Posted by RK123.
Post not yet marked as solved
4 Replies
977 Views
Hi! Despite the documentation saying otherwise, MultipeerConnectivityService appears to be unavailable in visionOS. Am I missing something, or is this an issue with the current beta or (hopefully not 😬) the documentation? https://developer.apple.com/documentation/realitykit/multipeerconnectivityservice

do {
    entity.scene?.synchronizationService = try MultipeerConnectivityService(session: MultipeerSession.shared.session)
    print("RealityKit synchronization started.")
} catch {
    fatalError("RealityKit synchronization could not be started. Error: \(error.localizedDescription)")
}

Xcode complains that 'MultipeerConnectivityService' is unavailable in visionOS, while the scene's synchronizationService property can be accessed...
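
For what it's worth, a sketch of how the iOS path can keep compiling while the class is unavailable on visionOS; this only sidesteps the compile error and does not provide multipeer sync on visionOS. It reuses the MultipeerSession wrapper from the snippet above.

#if !os(visionOS)
do {
    entity.scene?.synchronizationService =
        try MultipeerConnectivityService(session: MultipeerSession.shared.session)
    print("RealityKit synchronization started.")
} catch {
    print("RealityKit synchronization could not be started: \(error.localizedDescription)")
}
#else
// MultipeerConnectivityService is marked unavailable in the visionOS SDK,
// so some other transport would be needed to share entity state here.
#endif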
Posted by RK123.
Post marked as solved
2 Replies
1.5k Views
I am able to obtain daily forecasts via the REST API, which include the daytimeForecast and overnightForecast records with their respective properties like cloudCover. So far I have failed to access them via the Swift API: forecast.daytimeForecast.cloudCover results in "Value of type 'DayWeather' has no member 'daytimeForecast'". But if I print() the DayWeather objects, these members are actually printed with all their properties. Is there a way to get them that I am overlooking?
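
In the meantime, a possible workaround, sketched under the assumption that DayWeather does not expose daytimeForecast/overnightForecast in this SDK: fetch the hourly forecast, where HourWeather does expose cloudCover, and aggregate the daytime hours yourself. The 7:00-19:00 window is an arbitrary choice for illustration.

import WeatherKit
import CoreLocation

// Rough approximation of daytime cloud cover from the hourly forecast.
func daytimeCloudCover(for location: CLLocation) async throws -> Double {
    let hourly = try await WeatherService.shared.weather(for: location, including: .hourly)
    let calendar = Calendar.current
    // Keep today's hours between 7:00 and 19:00 local time as "daytime".
    let daytimeHours = hourly.forecast.filter { hour in
        calendar.isDateInToday(hour.date) &&
        (7..<19).contains(calendar.component(.hour, from: hour.date))
    }
    guard !daytimeHours.isEmpty else { return 0 }
    return daytimeHours.map(\.cloudCover).reduce(0, +) / Double(daytimeHours.count)
}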
Posted by RK123.