Let's say I am a VoiceOver user who wants to use the rotor option "Headings" to move through the supplementary header views of a UICollectionView. This works as expected as long as the next header is close enough to already be loaded (roughly, when the cells between two headers fit within the visible height of the UICollectionView). However, when there are many items between the visible header view and the next, still-unloaded one, the device says "Heading not found", and it takes a three-finger swipe to eventually bring the header into view.
I can understand the technical reason behind this: UICollectionView does not actually have everything loaded into memory; it reuses cells to give the impression that it does, so VoiceOver genuinely cannot find the view. Does that mean VoiceOver users are simply used to hearing "Heading not found" and treating the three-finger swipe as the workaround for this issue? Or is this an actual bug?
I don't see much discussion of this in the forums either (apologies if this has been answered elsewhere), so I thought I'd post the question here. Thanks!
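In case it helps anyone hitting the same wall: a commonly suggested app-side workaround is to supply a custom rotor via `accessibilityCustomRotors`, whose search block scrolls the collection view to the target section first, so the supplementary header view exists by the time it is returned to VoiceOver. The helper below is a hypothetical sketch of just the index math (assuming one header per section); in a real implementation its result would feed `collectionView.scrollToItem(at:at:animated:)` inside the `UIAccessibilityCustomRotor` item search block:

```swift
// Hypothetical helper: which section's header should a "next"/"previous"
// rotor search land on? In the real rotor, this index is used to scroll
// the collection view so that the header's supplementary view gets
// loaded before VoiceOver focuses it.
enum SearchDirection { case next, previous }

func targetHeaderSection(from current: Int,
                         direction: SearchDirection,
                         sectionCount: Int) -> Int? {
    let target = direction == .next ? current + 1 : current - 1
    return (0..<sectionCount).contains(target) ? target : nil
}
```

Returning nil from the search block is what produces the "Heading not found" announcement, so only return nil when there genuinely is no further header.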
Let's say I have a music library view controller built around a UICollectionView, where the cells (which are songs) are grouped alphabetically, and a cell can be either a song or a loop belonging to that song (see the attached image).
My question is whether VoiceOver users can swipe through the headings and song cells while skipping over the loop cells, in the same way the rotor allows jumping from one header to the next. If I'm a VoiceOver user and I know the loop I'm looking for is not in "Sisters", there's no point in having to step through "Lydian lick 2" and "WOW!"; focus should jump straight to "T'Wonderful".
Is this possible to implement with the accessibility APIs currently available? I could certainly make it nested (i.e. pressing "Sisters >" would change it to "Sisters /" and reveal the "Lydian lick 2" and "WOW!" cells), but I like that a user only needs one press to open either a song or a loop (not to mention that with nesting I'd no longer have a way of loading the song itself).
Does anyone have suggestions for improving this design so that it minimizes the number of gestures required to open a song/loop, while making it easy for VoiceOver users to skip over loops they know are not in a given song? It'd be highly appreciated!!!
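One option that seems to fit here (hedged; I haven't shipped this exact design) is a custom rotor, e.g. named "Songs", added through `accessibilityCustomRotors`, whose search block walks forward or backward over the data source and only stops on song rows, skipping loops. The row model and helper below are hypothetical names of my own; the index they produce would then be scrolled to and returned as the rotor target:

```swift
// Hypothetical flat row model for the library list.
enum Row: Equatable {
    case song(String)
    case loop(String)
}

// Find the index of the next (or previous) song row, skipping loops.
// In a custom rotor's item search block, the returned index would be
// scrolled into view and wrapped in a UIAccessibilityCustomRotorItemResult.
func nextSongIndex(in rows: [Row], after current: Int?, forward: Bool) -> Int? {
    let candidates: [Int] = forward
        ? Array(((current ?? -1) + 1)..<rows.count)
        : Array((0..<(current ?? rows.count)).reversed())
    for i in candidates {
        if case .song = rows[i] { return i }
    }
    return nil
}
```

This keeps the flat one-press layout intact: swiping still visits every cell, while the rotor gives VoiceOver users the song-to-song shortcut.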
I'm trying to display a list of songs using UICollectionView's List API. I would also like them grouped by the first letter of the song's name. At first, I thought I'd have something like this:
let sections = [
    LibrarySection(name: "A", songs: [
        Song(name: "A Song"),
        Song(name: "Another Song"),
    ]),
    LibrarySection(name: "B", songs: [
        Song(name: "B Song"),
        Song(name: "B another Song"),
    ]),
]
var snapshot = NSDiffableDataSourceSnapshot<LibrarySection, LibraryRow>()
snapshot.appendSections(sections)
sections.forEach { section in
    snapshot.appendItems(section.songs, toSection: section)
}
dataSource.apply(snapshot, animatingDifferences: true)
But I facepalmed when I realized the smart thing to do is this:
let items = [
    Song(name: "A Song"),
    Song(name: "Another Song"),
    Song(name: "B Song"),
    Song(name: "B another Song"),
]
The question I have is: what would the snapshot logic look like such that I minimize the number of times I have to instantiate items and sections?
var snapshot = NSDiffableDataSourceSnapshot<LibrarySection, LibraryRow>()
???
dataSource.apply(snapshot, animatingDifferences: true)
Thanks for any feedback you may have!!!
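Not a full answer, but here's how I'd sketch the grouping step (assuming LibrarySection is Hashable and can be built from a single letter): bucket the flat array once with `Dictionary(grouping:by:)` and build the snapshot from the buckets. The plain-Swift part, with the snapshot calls indicated in comments:

```swift
struct Song: Hashable {
    let name: String
}

let items = [
    Song(name: "A Song"),
    Song(name: "Another Song"),
    Song(name: "B Song"),
    Song(name: "B another Song"),
]

// Bucket the flat array by first letter in a single pass; relative
// order within each bucket is preserved.
let buckets = Dictionary(grouping: items) { song in
    String(song.name.prefix(1)).uppercased()
}

// Walk the letters in order. In the diffable-data-source version, this
// loop body would become snapshot.appendSections([section]) followed by
// snapshot.appendItems(songs, toSection: section), so each Song and each
// section value is only created once.
for letter in buckets.keys.sorted() {
    let songs = buckets[letter] ?? []
    print(letter, songs.map(\.name))
}
```

The win over the nested-sections version is that Song values are only instantiated once, in the flat array, and each section is derived on the fly from the data rather than hand-built.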
Let's say I have an SPM package I've created called PkgA. This package has an external dependency on another SPM package called PkgB, and there is an app that depends on PkgA.
What I would like is to import PkgA in the app and be able to immediately use content of PkgB in the app, in the same way importing UIKit imports UIViewController, UIView, etc. That is to say, I want PkgA to be analogous to UIKit (so it should really be more like KitA than PkgA).
The issue I'm facing is that it doesn't behave this way. I'd have thought that if the PkgA target compiles PkgA.swift and the file contains an 'import PkgB', the library would automatically expose PkgB. What's weird is that I get an error like "Cannot find 'ClassB' in scope" in the app, but when I Command-click 'ClassB' and choose "Jump to Definition", Xcode goes straight to the declaration. So Xcode does know exactly where it is, despite the error message.
One workaround I have found is adding the following to PkgA.swift:
// PkgA.swift
import PkgB
// typealias is required for app to find ClassB
public typealias ClassB = PkgB.ClassB
This would allow me to get the following to work in my app:
// App
import PkgA
let foo = ClassB()
It is certainly a poor workaround because I can no longer Option-click ClassB to quickly see its documentation. It overall seems pretty hacky...
Now, I can certainly get around everything by just importing PkgB:
// App
import PkgB
let foo = ClassB()
However, I plan on adding lots of dependencies and I don't like the idea of a long list of imports (in the same way UIKit users don't import UIViewController, UISearchBar, UIAccessibility, etc. individually). I essentially want PkgA to act like a kit, and I've unfortunately hit a wall with this. I even attended a lab at WWDC2021 and the person I spoke with was completely stumped.
Anyone have any idea what I could possibly be missing? I've been stumped for weeks :(
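For anyone finding this later: as far as I know this is expected behavior, since Swift imports are not transitive; `import PkgA` never surfaces PkgB's symbols on its own. The only mechanism I'm aware of that produces the UIKit-style umbrella effect is the underscored attribute `@_exported import`, which re-exports a dependency's module from PkgA (with the big caveat that underscored attributes are not an officially supported, source-stable part of the language):

```swift
// PkgA.swift
// Caveat: @_exported is an underscored, officially unsupported attribute.
@_exported import PkgB
```

With that single line, `import PkgA` in the app brings ClassB directly into scope, no typealias needed, and since it's the real type rather than an alias, Option-click documentation should work again.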
Hello there!
Does anyone know how it's possible to play an audio file within an iOS app below 25% speed (or at 0%, or even backwards)? I can't see how to slow audio down below 25% when AVAudioNodes don't allow a rate below 0.25, and yet an app called AudioStretch is not only able to do that, it can also hold a point in time indefinitely and even play a song backwards. Unfortunately, I can't include a YouTube link to a demo in this post, but if you search YouTube for AudioStretch, you'll see a demo by Dream Theater musician Jordan Rudess.
How is this physically possible on iOS (especially the part where a song can play a moment in time forever, not just very, very slowly)? Any insight would be greatly appreciated!!! I really want to implement these features in an iOS app I am working on but have not had any success!
Andres
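Not the AudioStretch developer, so this is informed guessing: extreme slowdown and "freeze a moment forever" are classic phase-vocoder territory (FFT-based time stretching, which on iOS you could build with vDSP/Accelerate), rather than anything AVAudioUnitVarispeed exposes. As an aside, AVAudioUnitTimePitch's rate is documented to go well below varispeed's 0.25 floor (down to 1/32, if I recall the docs correctly), so that may be worth checking before writing custom DSP. Backwards playback, at least, is approachable: read the file into an AVAudioPCMBuffer, reverse each channel's samples, and schedule the reversed buffer on an AVAudioPlayerNode. A sketch of the reversal, with plain arrays standing in for `floatChannelData`:

```swift
// Sketch: reverse each channel's samples. With a real AVAudioPCMBuffer
// you would copy floatChannelData[ch][0..<frameLength] into arrays like
// these, reverse them, and write them back before scheduling the buffer
// on an AVAudioPlayerNode.
func reversedChannels(_ channels: [[Float]]) -> [[Float]] {
    channels.map { Array($0.reversed()) }
}
```

The "play a single moment forever" effect is the hard part; that needs resynthesis (repeating one analysis frame's spectrum with a coherent phase), which is presumably AudioStretch's secret sauce.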