Posts

Post not yet marked as solved
7 Replies
21k Views
On Xcode 14.3.1, downloading support files for iPadOS 17 fails with the message: **Could not locate device support files. You may be able to resolve the issue by installing the latest version of Xcode from the Mac App Store or developer.apple.com. [missing string: 869a8e318f07f3e2f42e11d435502286094f76de]**

On Xcode 15.0 beta (15A5160n), the iPad does not show up at all. I have rebooted the Mac and the iPad, and done Settings > Clear Trusted Computers. The result: no iPad in Xcode 15, and the missing-support-files message in Xcode 14.3.1.

I am stuck. How do I get unstuck quickly?
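One workaround that sometimes unsticks this (a sketch, under the assumption that the Xcode 15 beta actually ships the iPadOS/iOS 17 support files; the paths below are the usual DeviceSupport locations and should be verified on your machine): copy the 17.0 device support folder from the beta into Xcode 14.3.1.

```shell
# Hypothetical paths -- verify against your installed Xcode versions.
BETA_SUPPORT="/Applications/Xcode-beta.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport"
STABLE_SUPPORT="/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/DeviceSupport"

if [ -d "$BETA_SUPPORT/17.0" ]; then
    # Copy the iPadOS/iOS 17 support files into the stable Xcode.
    sudo cp -R "$BETA_SUPPORT/17.0" "$STABLE_SUPPORT/"
else
    echo "17.0 support files not found at $BETA_SUPPORT -- adjust the path."
fi
```

Then restart Xcode 14.3.1 with the iPad attached and re-trust the device.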
Posted by Muse.
Post not yet marked as solved
1 Reply
336 Views
I get different results in Xcode beta 7 and beta 8 on the visionOS simulator.

Beta 7: the following compiles but crashes (a beta 7 binary on the beta 8 simulator?):

```swift
private func createPoseForTiming(timing: LayerRenderer.Frame.Timing) -> Pose? {
    ...
    if let outPose = worldTracking.queryPose(atTimestamp: t) {
        // will crash (beta 7 binary on beta 8 simulator?)
    }
}
```

Beta 8: compile error "Value of type 'LayerRenderer.Drawable' has no member 'pose'":

```swift
private func createPoseForTiming(timing: LayerRenderer.Frame.Timing) -> OS_ar_pose? {
    ...
    if let outPose = worldTracking.queryPose(atTimestamp: t) {
        // "Value of type 'LayerRenderer.Drawable' has no member 'pose'"
    }
}
```

Changing Pose to OS_ar_pose was recognized on beta 8, but produced a new compile error on queryPose. I did notice that the docs say pose prediction is expensive. I could have sworn there was a deprecation in the headers, but I could not find it. Is Pose deprecated for visionOS? What is the way forward?
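For what it's worth, in later visionOS SDKs the Swift spelling of this query moved onto the world-tracking provider itself, returning a DeviceAnchor rather than a Pose. A sketch of that shape (verify the exact signatures against your installed beta's headers before relying on it):

```swift
import ARKit
import QuartzCore

// Sketch of the release-era API shape: the pose query lives on
// WorldTrackingProvider and returns a DeviceAnchor, not a Pose.
// Verify against the current SDK headers.
func currentDeviceTransform(worldTracking: WorldTrackingProvider) -> simd_float4x4? {
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: CACurrentMediaTime()) else {
        return nil
    }
    // The anchor's transform maps anchor space to world origin space.
    return anchor.originFromAnchorTransform
}
```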
Posted by Muse.
Post not yet marked as solved
3 Replies
783 Views
Is there an example app? I am writing a visual music app. It uses custom SwiftUI menus over a UIKit (UIHostingController) canvas, and the Metal pipeline is custom. A fully functional example app would be super helpful. Nothing fancy; an immersive hello triangle would be a great start. Later, I will need to port UITouches for drawing and CMMotionManager to track the point of view within a cubemap. Meanwhile, baby steps. Thanks!
Posted by Muse.
Post marked as solved
3 Replies
589 Views
This is a spinoff from the forum post "xrOS + Metal example code here". I am porting Warren Moore's example here. I have most of the conversion to Swift complete, but am stuck at a couple of places:

```
ar_data_providers_create_with_data_providers(...)
cp_time_to_cf_time_interval(...)
```

Any hints on how to track these down? Thanks!
Posted by Muse.
Post marked as solved
2 Replies
1.3k Views
I'm implementing a SwiftUI control over a UIView's drawing surface. Often you are doing both at the same time. The problem is that starting with the SwiftUI gesture blocks the UIKit UITouch events; yet it works when you start with a UITouch first and then add a SwiftUI gesture. I also need UITouch's force and majorRadius, which aren't available in a SwiftUI gesture (I think). Here's an example:

```swift
import SwiftUI

struct ContentView: View {
    @GestureState private var touchXY: CGPoint = .zero

    var body: some View {
        ZStack {
            TouchRepresentable()
            Rectangle()
                .foregroundColor(.red)
                .frame(width: 128, height: 128)
                .gesture(DragGesture(minimumDistance: 0)
                    .updating($touchXY) { (value, touchXY, _) in
                        touchXY = value.location
                    })
                .onChange(of: touchXY) {
                    print(String(format: "SwiftUI(%.2g,%.2g)", $0.x, $0.y), terminator: " ")
                }
                //.allowsHitTesting(true) no difference
        }
    }
}

struct TouchRepresentable: UIViewRepresentable {
    typealias Context = UIViewRepresentableContext<TouchRepresentable>
    public func makeUIView(context: Context) -> TouchView { TouchView() }
    public func updateUIView(_ uiView: TouchView, context: Context) {}
}

class TouchView: UIView, UIGestureRecognizerDelegate {

    func updateTouches(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: nil)
            print(String(format: "UIKit(%g,%g)", round(location.x), round(location.y)), terminator: " ")
        }
    }
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
}
```

Because this is multitouch, you need to test on a real device.
Sequence A:
finger 1 starts dragging outside the red square (UIKit)
finger 2 simultaneously drags inside the red square (SwiftUI)
⟹ Console shows both UIKit and SwiftUI touch positions

Sequence B:
finger 1 starts dragging inside the red square (SwiftUI)
finger 2 simultaneously drags outside the red square (UIKit)
⟹ Console shows only SwiftUI touch positions

I would like to get both positions in Sequence B.
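One approach worth trying (a sketch, not a confirmed fix): instead of relying only on the raw touches* overrides, attach a custom UIGestureRecognizer to TouchView and have its delegate opt in to simultaneous recognition, so SwiftUI's internally installed gesture does not take exclusive ownership of the touch stream. The TouchCaptureRecognizer name and installCaptureRecognizer helper below are hypothetical.

```swift
import UIKit

// Hypothetical recognizer that forwards every touch it sees and never
// cancels touches, so other recognizers (including SwiftUI's) keep running.
final class TouchCaptureRecognizer: UIGestureRecognizer {
    var onTouches: ((Set<UITouch>) -> Void)?

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .began
        onTouches?(touches)
    }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .changed
        onTouches?(touches)
    }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent) {
        state = .ended
        onTouches?(touches)
    }
}

extension TouchView {
    func installCaptureRecognizer() {
        let recognizer = TouchCaptureRecognizer()
        recognizer.delegate = self             // TouchView already conforms
        recognizer.cancelsTouchesInView = false
        recognizer.onTouches = { [weak self] touches in
            self?.updateTouches(touches, with: nil)
        }
        addGestureRecognizer(recognizer)
    }

    // Allow this recognizer to run at the same time as any other,
    // including the ones SwiftUI installs on the hosting view.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith other: UIGestureRecognizer) -> Bool {
        true
    }
}
```

Call installCaptureRecognizer() from TouchView's initializer (or from makeUIView) and test Sequence B again on a device.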
Posted by Muse.
Post not yet marked as solved
1 Reply
995 Views
Since UIScreen isn't available, is there a drop-in replacement for the display link?

```swift
UIScreen.main.displayLink(withTarget: self, selector: #selector(nextFrames))
```

Error: UIScreen unavailable for xrOS.

Related WWDC video: https://developer.apple.com/videos/play/wwdc2023/111215/?time=176

Is there a link to documentation? I also used UIDeviceOrientation, but am hoping there is a snippet for synchronizing Metal frames.
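One thing worth checking (a sketch, assuming CADisplayLink's plain target/selector initializer is exposed on the target platform; that availability is exactly what needs verifying): CADisplayLink can be created directly, without going through UIScreen, and added to a run loop.

```swift
import QuartzCore

final class FrameDriver: NSObject {
    private var displayLink: CADisplayLink?

    @objc private func nextFrames(_ link: CADisplayLink) {
        // Render the next Metal frame here; link.targetTimestamp gives
        // the presentation deadline for frame pacing.
    }

    func start() {
        // Created without UIScreen; whether this initializer is
        // available on xrOS must be checked against the current SDK.
        let link = CADisplayLink(target: self, selector: #selector(nextFrames(_:)))
        link.add(to: .main, forMode: .common)
        displayLink = link
    }

    func stop() {
        displayLink?.invalidate()
        displayLink = nil
    }
}
```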
Posted by Muse.
Post not yet marked as solved
2 Replies
595 Views
I am referring to Apple's @State documentation: https://developer.apple.com/documentation/swiftui/state

How does an outside class or struct observe a change to @State within a SwiftUI View? For example, using Apple's Button example in the link above, how would a change to the @State isPlaying actually do something, like start or stop a media player? Do I attach a callback, use KVO, or use a special observer?
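A common pattern (a sketch, not the only answer): rather than observing @State from outside, move the flag into an ObservableObject so the side effect lives in the model, and let the view bind to it. The PlayerModel name and the start/stop stubs below are hypothetical.

```swift
import SwiftUI

// Hypothetical model: the view toggles isPlaying, and the didSet
// observer drives the media player directly.
final class PlayerModel: ObservableObject {
    @Published var isPlaying = false {
        didSet { isPlaying ? start() : stop() }
    }
    private func start() { /* start the media player */ }
    private func stop()  { /* stop the media player */ }
}

struct PlayerView: View {
    @StateObject private var model = PlayerModel()

    var body: some View {
        Button(model.isPlaying ? "Pause" : "Play") {
            model.isPlaying.toggle()
        }
    }
}
```

Alternatively, if the value must stay a local @State, .onChange(of:) inside the view can forward changes to outside code through a plain closure.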
Posted by Muse.
Post not yet marked as solved
1 Reply
944 Views
New error in Xcode 13 beta 4: "Generic parameter 'Success' could not be inferred". There was no error in the earlier Xcode 13 betas. What's the fix? The above line is from Apple's example code "DrawTogether".
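In general (a sketch of the usual cause, not specific to DrawTogether's line, since that line isn't shown here): this error appears when a generic context such as Task cannot infer its Success type from the closure body, and spelling the type out resolves it.

```swift
// If the compiler cannot infer the closure's result type, Task's
// generic Success parameter is ambiguous. Annotating it explicitly
// is the usual fix:
let work = Task<Void, Never> {
    // async work whose result type would otherwise be ambiguous
}

// Equivalently, state the closure's return type:
let answer = Task { () -> Int in
    42
}
```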
Posted by Muse.
Post not yet marked as solved
1 Reply
1.6k Views
How do I implement local speech recognition exclusively on watchOS? I think there was mention of local (on-device) ASR this year? The use case is a visually impaired person requesting information by raising their wrist and speaking a small number of words for a variety of intents. Because they would be walking with a white cane, the UI must be hands-free. This works for sighted folks as well. The intents are too diverse for creating individual Siri shortcuts. Ideally, the Watch app would call something akin to SFSpeechAudioBufferRecognitionRequest. Is this available? Where can I find the documentation?
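For reference, on iOS the on-device path looks like the sketch below (Speech framework with requiresOnDeviceRecognition); whether these classes are exposed on watchOS is exactly the open question here, so treat this as an iOS-side illustration rather than a watchOS answer.

```swift
import Speech

// iOS-style on-device recognition sketch; availability of the Speech
// framework on watchOS needs checking against the current SDK.
func startLocalRecognition() {
    guard let recognizer = SFSpeechRecognizer(),
          recognizer.supportsOnDeviceRecognition else { return }

    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true  // audio never leaves the device

    recognizer.recognitionTask(with: request) { result, error in
        if let result {
            print(result.bestTranscription.formattedString)
        }
    }
    // Feed microphone buffers (e.g. from AVAudioEngine's input tap)
    // into request.append(_:).
}
```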
Posted by Muse.
Post not yet marked as solved
0 Replies
741 Views
I'm trying to set up a new Gmail account in Mail.app.

1) Google requires completing authentication in Safari: "After authentication, set up will continue in Internet Accounts."
2) Authentication Failed: "After authentication, set up will continue in Internet Accounts."
Posted by Muse.
Post not yet marked as solved
0 Replies
905 Views
Could a user onboard an Apple Watch with watchOS 6 without an iPhone? This would be akin to how the iPod supported Windows, which resulted in an order of magnitude more growth for the iPod. We're looking at a social audio app in Indonesia and the US. In Indonesia, Android adoption is 93%, versus 5% for iPhone. Ostensibly, we could get the app working exclusively on the Apple Watch. But that implies onboarding with Android, Windows, a browser, or a Mac.
Posted by Muse.