Am developing an iOS app that uses a ZipFoundation wrapper around Compression. In Xcode, have exported a document type with the extension '.MU' in the Info.plist.
On iPhone, when attempting to open an archive called 'Snapshot-test.mu':
it OPENS as a Mail attachment
but FAILS via the Files app under "iCloud Drive/Desktop"
Here are the respective URLs:
"file:///private/var/mobile/Containers/Data/Application/<UniqueID>/Documents/Inbox/Snapshot-test.mu"
"file:///private/var/mobile/Library/Mobile%20Documents/com~apple~CloudDocs/Desktop/Snapshot-test1.mu"
Two questions:
Is it possible to grant access to files residing remotely in iCloud?
Is "iCloud Drive/Desktop" a special case, while other iCloud locations would be OK?
On Xcode 14.3.1, downloading device support files for iPadOS 17 fails with the message:
**Could not locate device support files
You may be able to resolve the issue by installing the latest version of Xcode from the Mac App Store or developer.apple.com.
[missing string: 869a8e318f07f3e2f42e11d435502286094f76de]**
On Xcode 15.0 beta (15A5160n), the iPad does not show up at all.
Have rebooted the Mac and the iPad, and used Settings > Clear Trusted Computers:
=> still no iPad in Xcode 15
=> still the missing support files message on Xcode 14.3.1
Am stuck. How do I get unstuck quickly?
I see the code snippets for wwdc2023-10111. Is there a project? It would be a great time-saver. Thanks!
Different results with Xcode beta 7 and beta 8 on the visionOS simulator:
Beta 7: the following compiles but crashes (a beta 7 binary on the beta 8 simulator?)
private func createPoseForTiming(timing: LayerRenderer.Frame.Timing) -> Pose? {
    ...
    if let outPose = worldTracking.queryPose(atTimestamp: t) {
        // will crash (beta 7 binary on beta 8 simulator?)
    }
}
Beta 8: compile error "Value of type 'LayerRenderer.Drawable' has no member 'pose'"
private func createPoseForTiming(timing: LayerRenderer.Frame.Timing) -> OS_ar_pose? {
    ...
    if let outPose = worldTracking.queryPose(atTimestamp: t) {
        // "Value of type 'LayerRenderer.Drawable' has no member 'pose'"
    }
}
Changing Pose to OS_ar_pose for beta 8 was recognized, but produced a new compile error regarding queryPose.
Did notice that the docs say pose prediction is expensive. Could have sworn there was a deprecation notice in the headers, but could not find it.
Is Pose deprecated for visionOS? What is the way forward?
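Not an authoritative answer, but a sketch of where the API appears to have landed, assuming queryPose was superseded by queryDeviceAnchor(atTimestamp:) on WorldTrackingProvider (returning a DeviceAnchor rather than a Pose). The timeInterval conversion follows the pattern in Apple's immersive Metal sample rather than an SDK API:
import ARKit
import CompositorServices
// Helper adapted from Apple's immersive Metal sample (not an SDK API):
// convert a LayerRenderer duration to a TimeInterval.
extension LayerRenderer.Clock.Instant.Duration {
    var timeInterval: TimeInterval {
        let nanoseconds = TimeInterval(components.attoseconds / 1_000_000_000)
        return TimeInterval(components.seconds) + nanoseconds / TimeInterval(NSEC_PER_SEC)
    }
}
// Sketch, assuming queryPose(atTimestamp:) became queryDeviceAnchor(atTimestamp:).
func deviceTransform(for timing: LayerRenderer.Frame.Timing,
                     worldTracking: WorldTrackingProvider) -> simd_float4x4? {
    let t = LayerRenderer.Clock.Instant.epoch.duration(to: timing.presentationTime).timeInterval
    guard let anchor = worldTracking.queryDeviceAnchor(atTimestamp: t) else { return nil }
    return anchor.originFromAnchorTransform
}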
Do I need a real device to get HandTrackingProvider to work?
Was hoping to map gestures to a new Spatial Menu/Palette here. Am kinda stuck.
Thanks!
Is there an example app?
Am writing a visual music app. It uses custom SwiftUI menus over a UIKit canvas (via UIHostingController). The Metal pipeline is custom.
A fully functional example app would be super helpful. Nothing fancy; an immersive hello triangle would be a great start.
Later, will need to port UITouch handling for drawing and CMMotionManager for tracking the point of view within a cubemap. Meanwhile, baby steps.
Thanks!
This is a spinoff from the forum post "xrOS + Metal example code" here.
Am porting Warren Moore's example here.
Have most of the conversion to Swift complete, but am stuck in a couple of places:
ar_data_providers_create_with_data_providers(...)
cp_time_to_cf_time_interval(...)
Any hints on how to track these down? Thanks!
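In case it helps, a hedged guess at the Swift-side equivalents: the C constructor that bundles data providers seems to have no direct Swift spelling, with the providers passed straight to ARKitSession.run(_:), and cp_time_to_cf_time_interval appears to map to arithmetic on LayerRenderer.Clock.Instant (as in the duration-to-TimeInterval conversion sketched earlier). A minimal sketch, assuming a visionOS target:
import ARKit
// Sketch: ar_data_providers_create_with_data_providers(...) seems to have no
// direct Swift counterpart; ARKitSession.run(_:) takes the providers directly.
let session = ARKitSession()
let worldTracking = WorldTrackingProvider()
func startTracking() async {
    do {
        try await session.run([worldTracking])
    } catch {
        print("Failed to start ARKit session:", error)
    }
}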
I'm implementing a SwiftUI control over a UIView's drawing surface. Often you are operating both at the same time. The problem: starting with the SwiftUI gesture blocks the UIKit UITouches, yet it works when you start with a UITouch first and then add a SwiftUI gesture. I also need UITouch's force and majorRadius, which aren't available in a SwiftUI gesture (I think).
Here's an example:
import SwiftUI

struct ContentView: View {
    @GestureState private var touchXY: CGPoint = .zero
    var body: some View {
        ZStack {
            TouchRepresentable()
            Rectangle()
                .foregroundColor(.red)
                .frame(width: 128, height: 128)
                .gesture(DragGesture(minimumDistance: 0)
                    .updating($touchXY) { (value, touchXY, _) in touchXY = value.location })
                .onChange(of: touchXY) {
                    print(String(format: "SwiftUI(%.2g,%.2g)", $0.x, $0.y), terminator: " ") }
                //.allowsHitTesting(true) makes no difference
        }
    }
}

struct TouchRepresentable: UIViewRepresentable {
    typealias Context = UIViewRepresentableContext<TouchRepresentable>
    public func makeUIView(context: Context) -> TouchView { return TouchView() }
    public func updateUIView(_ uiView: TouchView, context: Context) {}
}

class TouchView: UIView, UIGestureRecognizerDelegate {

    override init(frame: CGRect) {
        super.init(frame: frame)
        isMultipleTouchEnabled = true // needed to receive more than one concurrent UIKit touch
    }
    required init?(coder: NSCoder) { fatalError("init(coder:) has not been implemented") }

    func updateTouches(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            let location = touch.location(in: nil)
            print(String(format: "UIKit(%g,%g)", round(location.x), round(location.y)), terminator: " ")
        }
    }
    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesEnded(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
    override func touchesCancelled(_ touches: Set<UITouch>, with event: UIEvent?) { updateTouches(touches, with: event) }
}
Because this is multitouch, you need to test on a real device.
Sequence A:
finger 1 starts dragging outside of red square (UIKit)
finger 2 simultaneously drags inside of red square (SwiftUI)
⟹ Console shows both UIKit and SwiftUI touch positions
Sequence B:
finger 1 starts dragging inside of red square (SwiftUI)
finger 2 simultaneously drags outside of red square (UIKit)
⟹ Console shows only SwiftUI touch positions
Would like to get both positions in Sequence B.
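Not a fix for Sequence B, but for the force/majorRadius part: one option is to surface those UITouch-only values from the representable through a closure. A sketch; ForceTouchRepresentable, ForceTouchView, and onTouch are illustrative names, not Apple API:
import SwiftUI
// Sketch: bridge UITouch-only values (force, majorRadius) to SwiftUI
// through a callback closure.
struct ForceTouchRepresentable: UIViewRepresentable {
    var onTouch: (CGPoint, CGFloat, CGFloat) -> Void  // location, force, majorRadius
    func makeUIView(context: Context) -> ForceTouchView {
        let view = ForceTouchView()
        view.isMultipleTouchEnabled = true
        view.onTouch = onTouch
        return view
    }
    func updateUIView(_ uiView: ForceTouchView, context: Context) {}
}
class ForceTouchView: UIView {
    var onTouch: ((CGPoint, CGFloat, CGFloat) -> Void)?
    override func touchesMoved(_ touches: Set<UITouch>, with event: UIEvent?) {
        for touch in touches {
            onTouch?(touch.location(in: self), touch.force, touch.majorRadius)
        }
    }
}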
Since UIScreen isn't available, is there a drop-in replacement for the display link?
UIScreen.main.displayLink(withTarget: self, selector: #selector(nextFrames))
Error: UIScreen unavailable for xrOS
Related WWDC video: https://developer.apple.com/videos/play/wwdc2023/111215/?time=176
Is there a link to documentation? I have also used UIDeviceOrientation, but am hoping there is a snippet for synchronizing Metal frames.
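For reference, a sketch of what appears to replace display-link pacing in a Compositor Services app: the frame loop is driven by the LayerRenderer itself rather than by a screen. Assumes a layerRenderer already configured elsewhere (e.g. via CompositorLayer); setup is omitted:
import CompositorServices
// Sketch of a LayerRenderer-driven frame loop standing in for CADisplayLink.
func renderLoop(_ layerRenderer: LayerRenderer) {
    while layerRenderer.state != .invalidated {
        guard let frame = layerRenderer.queryNextFrame() else { continue }

        frame.startUpdate()
        // per-frame input and animation updates go here
        frame.endUpdate()

        guard let timing = frame.predictTiming() else { continue }
        LayerRenderer.Clock().wait(until: timing.optimalInputTime)

        frame.startSubmission()
        // encode Metal work against frame.queryDrawable() and present here
        frame.endSubmission()
    }
}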
Am referring to Apple's @State documentation: https://developer.apple.com/documentation/swiftui/state
How does an outside class or struct observe a change to @State within a SwiftUI View?
For example, using Apple's Button example in the link above, how would a change to the @State isPlaying actually do something, like start or stop a media player? Do I attach a callback, use KVO, or use some special observer?
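A sketch of the usual pattern, as I understand it: @State is private to the view and not observable from outside; moving the flag into an ObservableObject lets non-View code react in didSet. PlayerModel and the playback stubs are illustrative names, not from Apple's example:
import SwiftUI
// Sketch: move the flag out of @State into an ObservableObject so outside
// code can react to changes. PlayerModel and the stubs are made-up names.
final class PlayerModel: ObservableObject {
    @Published var isPlaying = false {
        didSet { isPlaying ? startPlayback() : stopPlayback() }
    }
    private func startPlayback() { print("start player") } // stand-in for a real player
    private func stopPlayback() { print("stop player") }
}
struct PlayButton: View {
    @ObservedObject var model: PlayerModel
    var body: some View {
        Button(model.isPlaying ? "Pause" : "Play") {
            model.isPlaying.toggle()
        }
    }
}
Alternatively, keeping @State and attaching .onChange(of: isPlaying) inside the view body achieves the same effect without an external object.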
Would like to play with the example project shown in session wwdc21-10187.
New error in Xcode 13 beta 4: "Generic parameter 'Success' could not be inferred"
There was no error in an earlier Xcode 13 beta.
What's the fix? The offending line is from Apple's example code, "DrawTogether".
On an MBP with Xcode on Desktop 1, swiping to Desktop 2 shows the same Xcode IDE overlaid on top. I want Xcode to stay only on Desktop 1 and not appear on Desktop 2. How do I fix this?
How do I implement local speech recognition exclusively on watchOS? I think there was mention of local ASR this year?
The use case is a visually impaired person requesting information by raising their wrist and speaking a small number of words, covering a variety of intents. Because they would be walking with a white cane, the UI must be hands-free. (This works for sighted folks as well.) The intents are too diverse to create individual Siri shortcuts.
Ideally, the Watch app would call something akin to SFSpeechAudioBufferRecognitionRequest. Is this available? Where can I find the documentation?
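For what it's worth, a sketch of the on-device path as it exists on iOS; whether the Speech framework is exposed on watchOS is exactly the open question here:
import Speech
// Sketch of on-device recognition as spelled on iOS. requiresOnDeviceRecognition
// keeps audio and recognition local; availability on watchOS is the open question.
func startLocalRecognition() -> SFSpeechRecognitionTask? {
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.supportsOnDeviceRecognition else { return nil }
    let request = SFSpeechAudioBufferRecognitionRequest()
    request.requiresOnDeviceRecognition = true
    // Audio buffers from an AVAudioEngine tap would be appended to `request`.
    return recognizer.recognitionTask(with: request) { result, _ in
        if let result {
            print(result.bestTranscription.formattedString)
        }
    }
}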