
Is there an equivalent accessibilityElement API for SwiftUI - specifically when making visualizations using the new Canvas view?
What's the most effective equivalent to the accessibilityElement API when you're creating visualizations in SwiftUI using the new Canvas drawing mechanism? When I'm drawing with Canvas, I know roughly the same positioning information as I would when drawing in UIKit or AppKit to make chart visualizations, but I didn't spot an equivalent API for marking sections of the visualization to be associated with specific values (akin to setting the frame coordinates on a UIView accessibility element).

I've only just started playing with Canvas, so I may be wrong - but it seems like a monolithic view element that I can't easily break down into sub-components the way I can by applying separate CALayers or UIViews on which to hang accessibility indicators. For making the equivalent bar chart in straight-up SwiftUI, is there a similar mechanism, or is this a place where I'd need to pop the escape hatch to UIKit or AppKit, do the relevant work there, and then host it inside a SwiftUI view?
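
One assumed translation of that CALayer/UIView idea into SwiftUI would be to overlay invisible, individually accessible views on top of the Canvas at the same coordinates used for drawing. A minimal sketch - the bar values, layout math, and labels are illustrative, not from the original post:

```swift
import SwiftUI

// A sketch only: the bar values, layout math, and labels below are illustrative.
struct AccessibleBarChart: View {
    let values: [Double] = [3, 7, 5]

    var body: some View {
        GeometryReader { proxy in
            let barWidth = proxy.size.width / CGFloat(values.count)
            let maxValue = values.max() ?? 1

            Canvas { context, size in
                // Draw the bars exactly as the monolithic Canvas normally would.
                for (index, value) in values.enumerated() {
                    let height = CGFloat(value / maxValue) * size.height
                    let rect = CGRect(x: CGFloat(index) * barWidth,
                                      y: size.height - height,
                                      width: barWidth * 0.8,
                                      height: height)
                    context.fill(Path(rect), with: .color(.blue))
                }
            }
            // Invisible overlay carrying one accessibility element per bar,
            // positioned to mirror the frames used in the drawing above.
            .overlay(
                HStack(spacing: 0) {
                    ForEach(values.indices, id: \.self) { index in
                        Color.clear
                            .frame(width: barWidth)
                            .accessibilityElement()
                            .accessibilityLabel("Bar \(index + 1)")
                            .accessibilityValue("\(values[index])")
                    }
                }
            )
        }
    }
}
```

The overlay keeps the drawing itself monolithic while still giving VoiceOver one element per bar, which mirrors the frame-based UIKit accessibility elements mentioned above - whether that scales, or whether Canvas gets a first-class mechanism, is the open question.
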
3 replies · 0 boosts · 1.5k views · Jun ’21
Can I generate a DocC archive directly from Package.swift?
If the library I want to document is primarily tracked with a Package.swift cross-platform definition, what's the optimal way to build and render documentation? Can I still use xcodebuild docbuild with the Package.swift format for the project, or does the documentation system require a fully-defined Xcode project wrapped around it? Or do I need to open the Package.swift file in Xcode and then generate the DocC archive from Xcode's Product menu?
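
For reference, a minimal sketch of the kind of Package.swift in question with the Swift-DocC plugin wired in - an assumption, since nothing in the post confirms that the plugin (https://github.com/apple/swift-docc-plugin) or its generate-documentation command applies here, and "MyLibrary" is a hypothetical name:

```swift
// swift-tools-version: 5.6
// A sketch only: "MyLibrary" and the use of the Swift-DocC plugin are
// assumptions, not details from the original question.
import PackageDescription

let package = Package(
    name: "MyLibrary",
    products: [
        .library(name: "MyLibrary", targets: ["MyLibrary"]),
    ],
    dependencies: [
        // The DocC plugin adds a `generate-documentation` command to SwiftPM,
        // so a .doccarchive can be produced without a wrapping Xcode project:
        //   swift package generate-documentation --target MyLibrary
        .package(url: "https://github.com/apple/swift-docc-plugin", from: "1.0.0"),
    ],
    targets: [
        .target(name: "MyLibrary"),
    ]
)
```

The other route being asked about - pointing xcodebuild docbuild at the package's auto-generated scheme - is likewise only a guess at the invocation, not something confirmed in the post.
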
1 reply · 0 boosts · 1.9k views · Jun ’21
ARView example code failing to compile on an M1 Mac & Xcode 12.4?
Hello, I recently downloaded the sample from Creating a Game with SceneUnderstanding - https://developer.apple.com/documentation/realitykit/creating_a_game_with_sceneunderstanding, and the out-of-the-box compilation failed, with what appear to be some structural misses in RealityKit. This is with Xcode 12.4 (12D4e) - and a number of the compilation complaints appear to be Objective-C elements not being correctly bridged, or otherwise unavailable to the compiler.

Missing:
- ARView.RenderOptions
- .session within an ARView instance
- SceneUnderstanding from within ARView.Environment
- .showSceneUnderstanding from ARView.DebugOptions
and a few others.

I tried changing the deployment target from iOS 14.0 to iOS 14.4, a clean build and rebuild, and wiping out DerivedData just to take a stab at things, but to no avail. Is there something I can/should update to get the scene understanding elements back available from the ARView class?
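
For reference, a minimal sketch of how those symbols are normally used together - the view-controller scaffolding and the particular options chosen here are illustrative, not taken from the sample; only the API names are the ones the compiler reports as missing:

```swift
import ARKit
import RealityKit
import UIKit

// A sketch only: the scaffolding and chosen options are illustrative.
final class SceneUnderstandingViewController: UIViewController {
    let arView = ARView(frame: .zero)

    override func viewDidLoad() {
        super.viewDidLoad()
        arView.frame = view.bounds
        view.addSubview(arView)

        // SceneUnderstanding options hang off ARView.Environment.
        arView.environment.sceneUnderstanding.options.insert([.occlusion, .physics])

        // The debug overlay comes from ARView.DebugOptions.
        arView.debugOptions.insert(.showSceneUnderstanding)

        // ARView.RenderOptions is a separate option set on the view.
        arView.renderOptions.insert(.disableMotionBlur)

        // .session is a property of the ARView instance.
        let configuration = ARWorldTrackingConfiguration()
        if ARWorldTrackingConfiguration.supportsSceneReconstruction(.meshWithClassification) {
            configuration.sceneReconstruction = .meshWithClassification
        }
        arView.session.run(configuration)
    }
}
```
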
2 replies · 0 boosts · 1.3k views · Feb ’21
Can the tag pages for WWDC sessions please link to those sessions?
Each of the tag pages says it's related to a specific thing - that's the whole point of the tag. Quite a number of recent tags are for WWDC sessions, which is awesome. However, those tag pages - while they list the WWDC session - don't provide any linked way of getting back to that session or its details. For example, the tag page https://developer.apple.com/forums/tags/wwdc20-10091 burns up a huge amount of visual space with a description of the session, but nothing in that content provides an actual *link* to it: https://developer.apple.com/videos/play/wwdc2020/10091. The whole WWDC+Forums thing would be hugely improved by not treating these as separate resources but allowing, or encouraging, them to intermix - and just adding a single active link from the tag page header to the session itself would be a good start.
1 reply · 0 boosts · 800 views · Jul ’20
Is there a way to get a callback on custom lifecycle events exposed from the app?
I'm trying to capture snapshots as a visual validation of the state of a view in my UI tests. The work that I want to capture happens after an async view is loaded (currently an embedded MapKit view). I'm wondering if there's a way I can expose a marker to XCUIApplication() through the accessibility system when the Map has completed its loading and rendering, even if it's something I intentionally expose from the app with my own code that isn't done by default.

Right now I'm waiting on a Link to show up with XCUIApplication().links["Legal"].waitForExistence(timeout: 2), since Map currently exposes a legal link from within itself once rendering is set up - but that's a detailed side effect of MapKit specifically. For other views that my code loads asynchronously, can I do something to expose an event or callback hook that I can react to within the XCUIApplication() structure to know that relevant background processing has completed, or is watching for UI elements to exist in some form the only exposed way of getting this kind of information?
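
A minimal sketch of the app-side marker being asked about, assuming a UIKit MKMapView rather than the SwiftUI Map - the "map-finished-rendering" identifier is made up for this illustration:

```swift
import MapKit
import UIKit

// A sketch only, assuming an MKMapView-based screen; the identifier string
// is invented for illustration.
final class MapViewController: UIViewController, MKMapViewDelegate {
    let mapView = MKMapView()

    override func viewDidLoad() {
        super.viewDidLoad()
        mapView.frame = view.bounds
        mapView.delegate = self
        view.addSubview(mapView)
    }

    func mapViewDidFinishRenderingMap(_ mapView: MKMapView, fullyRendered: Bool) {
        guard fullyRendered else { return }
        // Flip a marker that the UI test can query through the accessibility tree.
        view.accessibilityIdentifier = "map-finished-rendering"
    }
}
```

The UI test side would then wait on app.otherElements["map-finished-rendering"].waitForExistence(timeout: 10) from an XCUIApplication instance, instead of MapKit's "Legal" link.
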
0 replies · 0 boosts · 879 views · Jul ’20
Is there a way to trigger MX*Diagnostic generation deterministically?
I love what I'm seeing about MX*Diagnostic (MXCrashDiagnostic - https://developer.apple.com/documentation/metrickit/mxcrashdiagnostic and friends) and the capability to use it to track down crashes and issues. What I'd like to know is whether there's a way I can trigger the diagnostic output deterministically at the end of a CI-driven integration test, so that I can build in a scan for these and harvest them to help identify regressions. Without such a mechanism, am I left having to leave the device (or simulator) running for a huge amount of time and hope for the best? I want to use this specifically with CI, which may have multiple runs per day - and I'd like to identify which run of the code a diagnostic aligns with, without something akin to a "spend a day testing this single iteration of the code" kind of constraint.
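
For reference, a minimal sketch of the subscriber side, assuming iOS 14's MXDiagnosticPayload delivery - it only shows where a CI harvesting hook could live once MetricKit decides to deliver payloads, not a way to force generation on demand, and the file-naming scheme is made up:

```swift
import Foundation
import MetricKit

// A sketch only: MetricKit controls when diagnostic payloads arrive.
final class DiagnosticCollector: NSObject, MXMetricManagerSubscriber {
    func startCollecting() {
        MXMetricManager.shared.add(self)
    }

    func didReceive(_ payloads: [MXDiagnosticPayload]) {
        for payload in payloads {
            // Persist the JSON representation somewhere the CI job can scan.
            let json = payload.jsonRepresentation()
            let url = FileManager.default.temporaryDirectory
                .appendingPathComponent("diagnostic-\(UUID().uuidString).json")
            try? json.write(to: url)
        }
    }
}
```
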
2 replies · 0 boosts · 882 views · Jun ’20
Tagging all the existing articles to make them visible
There are a lot of articles and questions already on the forums from past years - but they didn't include tags, since tags weren't part of the system earlier. However, I can still see my past articles - so it would be nice if I could go back and retroactively tag them. Then, perhaps, there'd be at least ONE article related to Combine in the forums... and one with a great answer from Quinn that really helped me.
1 reply · 0 boosts · 659 views · Jun ’20
IBDesignable view shows "updating" and doesn't display the custom view in Interface Builder
I've been trying to make and view an @IBDesignable view in Interface Builder, but unfortunately with little success. When I select the view in my Storyboard, the Custom Class inspector shows the Designables highlight as "updating", without any changes regardless of my actions. I've done a clean build and a build, and the build compiles with no warnings. When I run the code in the simulator, I'm able to see the visual result of the code with no warnings or errors, but it doesn't appear to be showing up in Interface Builder.

I've tried removing DerivedData and then restarting my laptop and Xcode to clear any ephemeral issues, to no effect. I have "Automatically Refresh Views" disabled at the moment - I've tried it both enabled and disabled, with no effect. When I attempt to select the view and choose Editor -> Debug Selected Views, the status spinner at the top of Xcode shows "Preparing to debug instance of "GridColorSelectionView"" - and an error is reported in Xcode. The specific PID changes with the attempts, but the error code is consistent. I've looked in ~/Library/Logs/DiagnosticReports/, but haven't seen anything that would give me a hint of what might be failing.

Is there something I'm missing - a setting or something - or does anyone have a suggestion on how I might be able to debug this issue? I'm not conversant enough to know if this is an issue with something I've done (or am doing), or if it's a candidate for a bug report on Xcode. This is with the latest Xcode - Version 10.2 (10E125), running on macOS 10.14.4 (18E226).
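
For context, a minimal sketch of the kind of @IBDesignable view being debugged here - the class name matches the post, but the inspectable property and the drawing are illustrative stand-ins for whatever the real view does:

```swift
import UIKit

// A sketch only: the property and drawing below are illustrative.
@IBDesignable
final class GridColorSelectionView: UIView {

    @IBInspectable var gridColor: UIColor = .blue {
        didSet { setNeedsDisplay() }
    }

    override func draw(_ rect: CGRect) {
        gridColor.setFill()
        UIBezierPath(rect: rect.insetBy(dx: 4, dy: 4)).fill()
    }

    override func prepareForInterfaceBuilder() {
        super.prepareForInterfaceBuilder()
        // Runs only when Interface Builder renders the designable.
        setNeedsDisplay()
    }
}
```
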
5 replies · 1 boost · 5.6k views · Mar ’19