
Even slight main thread congestion causes Alert to miss dismissals
One frequent but hard-to-reproduce bug in SwiftUI is being unable to dismiss a basic Alert. I think I've managed to reproduce the issue here. If there is even a little congestion on the main thread, here in the form of a counter that updates every 0.1 seconds, the Alert fails to dismiss about 90% of the time. This level of activity should be nothing remarkable. The Alert does dismiss sometimes, but usually it does not. See the attached sample project for the Alert bug. Feedback case: https://feedbackassistant.apple.com/feedback/14719229
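For reference, a minimal sketch of the kind of setup described above (the view name and the 0.1-second Timer publisher are my own reconstruction, not the attached project's code):

```swift
import SwiftUI
import Combine

// Minimal repro sketch: a counter ticking every 0.1 s on the main thread
// plus a basic alert whose dismissal reportedly fails intermittently under this load.
struct CongestedAlertView: View {
    @State private var counter = 0
    @State private var showAlert = false
    private let timer = Timer.publish(every: 0.1, on: .main, in: .common).autoconnect()

    var body: some View {
        VStack(spacing: 20) {
            Text("Counter: \(counter)")
            Button("Show Alert") { showAlert = true }
        }
        .onReceive(timer) { _ in counter += 1 }
        .alert("Test Alert", isPresented: $showAlert) {
            Button("OK", role: .cancel) { }   // Tapping OK often fails to dismiss while the counter runs
        }
    }
}
```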
4 replies · 0 boosts · 334 views · Aug ’24
NavigationStack usage adding unwanted position animation
Several buttons with no position animations on them will begin to move around when placed inside a NavigationStack. If you comment out the NavigationStack, the buttons remain still and only the desired glow animation occurs. This also seems to happen only when the button animation toggles based on a @Published Bool, not a @State Bool. See the attached demonstration code.
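A rough sketch of the reported setup, with illustrative names rather than the actual demonstration code: a glow (shadow) animation driven by a @Published Bool on buttons inside a NavigationStack.

```swift
import SwiftUI

// Hypothetical reconstruction of the scenario: the glow toggles via an
// ObservableObject's @Published Bool, and the buttons sit in a NavigationStack.
final class GlowModel: ObservableObject {
    @Published var isGlowing = false
}

struct GlowingButtonsView: View {
    @StateObject private var model = GlowModel()

    var body: some View {
        NavigationStack {                      // Commenting this out reportedly stops the unwanted movement
            VStack(spacing: 24) {
                ForEach(0..<3) { index in
                    Button("Button \(index)") { }
                        .shadow(color: .blue, radius: model.isGlowing ? 12 : 0)
                        .animation(.easeInOut(duration: 1).repeatForever(), value: model.isGlowing)
                }
            }
            .onAppear { model.isGlowing = true }
        }
    }
}
```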
2 replies · 0 boosts · 273 views · Jul ’24
Is Tab key functionality meant to work in SwiftUI?
I'm trying to get my app to behave like the iOS Contacts app, where either the Tab key or the Enter key can be used to move from field to field. I have things working for the Enter key by changing the FocusState when onSubmit is called, but this doesn't work for the Tab key. This isn't anything to do with system settings, since iOS Contacts works as expected. How am I supposed to listen for the Tab key on iOS in order to change the focus?
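A sketch of the Enter-key approach described above, using @FocusState and onSubmit (field names are illustrative; Tab does not trigger onSubmit here, which is the question):

```swift
import SwiftUI

// Enter advances the focus via onSubmit; Tab has no effect on FocusState in this setup.
struct ContactFormView: View {
    enum Field: Hashable { case firstName, lastName, phone }

    @State private var firstName = ""
    @State private var lastName = ""
    @State private var phone = ""
    @FocusState private var focusedField: Field?

    var body: some View {
        Form {
            TextField("First name", text: $firstName)
                .focused($focusedField, equals: .firstName)
                .onSubmit { focusedField = .lastName }   // Enter advances focus; Tab does not
            TextField("Last name", text: $lastName)
                .focused($focusedField, equals: .lastName)
                .onSubmit { focusedField = .phone }
            TextField("Phone", text: $phone)
                .focused($focusedField, equals: .phone)
        }
    }
}
```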
0 replies · 0 boosts · 522 views · Oct ’22
What if they tap the submit button instead of hitting Done or Enter?
For https://developer.apple.com/videos/play/wwdc2021/10023/?time=411 I have a similar case in my app, but SwiftUI doesn't seem to handle the case where the user taps a submit button in the UI after typing, instead of hitting Return or the "Done" button as they do in the video. In my case I have a focus listener on the TextField and want to perform a custom action when the TextField loses focus. If the user taps the submit button in the UI instead of using the keyboard, the TextField doesn't lose focus, or at least the handler isn't called. Is there some way to make sure the lose-focus handler is called on the TextField when they tap the submit button, without a bunch of kludgey code creating dependencies between the components?
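A sketch of the situation described above, with hypothetical names: the lose-focus handler runs when focus changes via the keyboard, but tapping a separate submit button doesn't move focus by itself.

```swift
import SwiftUI

// Lose-focus handler on a TextField plus a separate submit button.
struct NicknameEditorView: View {
    @State private var nickname = ""
    @FocusState private var nicknameIsFocused: Bool

    var body: some View {
        VStack {
            TextField("Nickname", text: $nickname)
                .focused($nicknameIsFocused)
                .onChange(of: nicknameIsFocused) { isFocused in
                    if !isFocused { commitNickname() }   // Fires for Return/Done, but not for the button below
                }

            Button("Submit") {
                // Without explicitly clearing focus here (coupling the button to the field),
                // the lose-focus handler never runs.
                submitForm()
            }
        }
    }

    private func commitNickname() { /* custom action when the field loses focus */ }
    private func submitForm() { /* submit the whole form */ }
}
```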
0 replies · 0 boosts · 905 views · Sep ’22
CoreML image orientation
We're having problems with images being sent to our CoreML model in unsupported image orientations. I'm setting the orientation parameter on VNImageRequestHandler, and I'm also using the OCR model, so I know the parameter is being passed correctly. At https://developer.apple.com/documentation/vision/classifying_images_with_vision_and_core_ml it says: "Most models are trained on images that are already oriented correctly for display. To ensure proper handling of input images with arbitrary orientations, pass the image's orientation to the image request handler." But I'm not sure whether the rotation is supposed to happen automatically, or whether they're implying that the model can read the orientation and rotate the image accordingly.
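A sketch of how the orientation is being passed to Vision. "MyClassifier" is a hypothetical generated model class standing in for the real model; the rest uses the standard Vision request flow.

```swift
import Vision
import CoreML
import CoreGraphics

// Classify an image, forwarding its orientation to the request handler.
func classify(_ cgImage: CGImage, orientation: CGImagePropertyOrientation) throws {
    let coreMLModel = try MyClassifier(configuration: MLModelConfiguration()).model
    let visionModel = try VNCoreMLModel(for: coreMLModel)

    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        if let top = (request.results as? [VNClassificationObservation])?.first {
            print(top.identifier, top.confidence)
        }
    }

    // The image's orientation is forwarded here; the open question is whether
    // Vision rotates the pixel data before it reaches the model.
    let handler = VNImageRequestHandler(cgImage: cgImage, orientation: orientation, options: [:])
    try handler.perform([request])
}
```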
1 reply · 0 boosts · 1k views · Jul ’21
Understanding Anchor transform vs trackedRaycast transform
I'm looking through the ARKitInteraction sample app, and there are a few things I don't really understand about how the trackedRaycast transform and the ARAnchor transform are related.
1. In the trackedRaycast handler, the result of the raycast is assigned to the SCNNode's transform. However, that object is already attached to an ARAnchor, and the documentation says: "Adding an anchor to the session helps ARKit to optimize world-tracking accuracy in the area around that anchor, so that virtual objects appear to stay in place relative to the real world. If a virtual object moves, remove the corresponding anchor from the old position and add one at the new position." Doesn't this mean the anchor and the trackedRaycast are giving conflicting instructions?
2. In the ARSCNViewDelegate's func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) implementation, there's the call objectAtAnchor.simdPosition = anchor.transform.translation. Doesn't the fact that didUpdate is being called mean that the anchor transform was already set to the object transform? I'm guessing this is to help the trackedRaycast functionality work with the anchor functionality, but I don't see how.
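For context on question 2, here is the delegate pattern being referred to, written from memory rather than copied from the sample, with a small helper to pull the translation out of the anchor's transform and an illustrative anchor-to-node lookup:

```swift
import ARKit
import SceneKit

// Extract the translation column of a 4x4 transform.
extension simd_float4x4 {
    var translation: SIMD3<Float> {
        let t = columns.3
        return SIMD3<Float>(t.x, t.y, t.z)
    }
}

final class SceneDelegate: NSObject, ARSCNViewDelegate {
    // Hypothetical lookup from each anchor to the virtual object node it hosts.
    var objectsByAnchor: [ARAnchor: SCNNode] = [:]

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard let objectAtAnchor = objectsByAnchor[anchor] else { return }
        // The anchor's refined world-space position is copied back onto the object node.
        objectAtAnchor.simdPosition = anchor.transform.translation
    }
}
```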
1 reply · 0 boosts · 1.1k views · May ’20
Does raycasting have an equivalent of ARHitTestResult.ResultType.existingPlaneUsingExtent?
I'm trying to convert some hit-testing code to use raycasting, but I don't see an equivalent of ARHitTestResult.ResultType.existingPlaneUsingExtent. A query with "allowing: .estimatedPlane, alignment: .vertical" returns nil for some of the anchors where .existingPlaneUsingExtent returns values for all of them, and "allowing: .estimatedPlane, alignment: .any" doesn't seem to return values for any anchors, just the featurePoint normals. Not sure if I'm missing something, or if this is a bug, or what.
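A sketch of the two approaches being compared, assuming a sceneView and a screen point already exist in the surrounding code:

```swift
import ARKit

// Compare the old hit-test API with a raycast query at the same screen point.
func compareHitTestAndRaycast(sceneView: ARSCNView, screenPoint: CGPoint) {
    // Old API: returns results for detected planes, respecting their estimated extent.
    let hitResults = sceneView.hitTest(screenPoint, types: .existingPlaneUsingExtent)

    // Raycasting replacement being tried; whether .estimatedPlane (vs. .existingPlaneGeometry
    // or .existingPlaneInfinite) is the right target is exactly the question here.
    if let query = sceneView.raycastQuery(from: screenPoint,
                                          allowing: .estimatedPlane,
                                          alignment: .vertical) {
        let raycastResults = sceneView.session.raycast(query)
        print("hitTest: \(hitResults.count), raycast: \(raycastResults.count)")
    }
}
```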
1 reply · 0 boosts · 845 views · Apr ’20
Does changing the transform of an ARAnchor's SCNNode break the AR?
When an ARAnchor is created, it is passed a transform with the real-world position and orientation where it should be located. That ARAnchor then gets an SCNNode which uses that transform as its origin, and as the ARAnchor's connection to the real world is adjusted, the SCNNode's location is adjusted as well. What happens if the SCNNode's transform is set to a new value, something like SCNMatrix4(hitTestResult.worldTransform)? Is the connection to the ARAnchor lost, and is the SCNNode then just floating around in SceneKit space with no direct connection to the real world?
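A sketch of the scenario being asked about, with illustrative names: a node that was positioned by its anchor, whose transform is then overwritten from a hit-test result.

```swift
import ARKit
import SceneKit

// Overwrite an anchor-backed node's transform from a hit-test result.
func reposition(objectNode: SCNNode,
                anchor: ARAnchor,
                hitTestResult: ARHitTestResult,
                in session: ARSession) {
    // ARKit originally placed objectNode at anchor.transform.
    // Overwriting the node's transform directly:
    objectNode.transform = SCNMatrix4(hitTestResult.worldTransform)

    // ...the anchor itself still refers to the old real-world position unless it is
    // replaced, e.g. session.remove(anchor:) followed by adding a new ARAnchor
    // at hitTestResult.worldTransform.
}
```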
1 reply · 0 boosts · 881 views · Apr ’20
SCNPlane y position
At https://developer.apple.com/documentation/arkit/arscnview/providing_3d_virtual_content_with_scenekit there is the line of code: planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z) Why is the y coordinate 0? Why not use the y coordinate of the anchor?
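For context, an approximate reconstruction from memory of the surrounding code in the linked article (not an exact copy): the line appears in the ARSCNViewDelegate's didAdd callback, where the plane node is positioned relative to the anchor's own node.

```swift
import ARKit
import SceneKit

// Visualize a detected plane by attaching an SCNPlane to the anchor's node.
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

    let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                         height: CGFloat(planeAnchor.extent.z))
    let planeNode = SCNNode(geometry: plane)

    // The line being asked about: the position is expressed in the coordinate
    // space of the anchor's node, so only center.x and center.z are used and y stays 0.
    planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)
    planeNode.eulerAngles.x = -.pi / 2   // Rotate the vertical SCNPlane to lie flat

    node.addChildNode(planeNode)
}
```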
1 reply · 0 boosts · 895 views · Mar ’20