Awesome! Thank you!
I have tried multiple options so far. That entire .onChange approach seems like a bolt-on afterthought, to be honest. I am sure it will work great in the future, but for now it is not only not called when you follow the examples (you must put it in a View), it also only gets called on changes: there is no initial-launch or final-termination call, and it won't help with transitions between scenes either.
So for the time being, .onChange would be fine for a View, but it gets called independently in every single instance of the app, which is not good if you have a central model expecting one single call.
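To be concrete, this is the shape of the scenePhase variant I am describing; a minimal sketch, where ContentView is just a placeholder for whatever View you hang the modifier on:
import SwiftUI

struct ContentView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        Text("Hello")
            .onChange(of: scenePhase) { newPhase in
                // Fires on .active / .inactive / .background transitions,
                // but not at launch or termination, and once per scene.
                print("Scene phase is now \(newPhase)")
            }
    }
}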
And as described by @mtsrodrigues (https://developer.apple.com/forums/profile/mtsrodrigues), you can create an AppDelegate, which works well. But it has no access to the App stack; you have to jump through hoops to reach your model, which might not be what you actually want. To do it non-hackishly, you would put your app model in your AppDelegate and have your view-model caches register with the AppDelegate so they get called on events, which is not what is expected.
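For completeness, the AppDelegate route looks roughly like this; a minimal iOS-only sketch, where MyApp is a placeholder and the delegate does nothing useful yet:
import SwiftUI
import UIKit

final class AppDelegate: NSObject, UIApplicationDelegate {
    func application(_ application: UIApplication,
                     didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]? = nil) -> Bool {
        // App-wide launch work goes here, but the App's own model is not in reach
        // unless you store it in the delegate and wire everything through it.
        return true
    }
}

@main
struct MyApp: App {
    @UIApplicationDelegateAdaptor(AppDelegate.self) private var appDelegate

    var body: some Scene {
        WindowGroup { Text("Hello") }
    }
}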
So far, the best solution I found was to go to the different places in my model where the app-wide delegation happens and add Notification observers at each site. The bonus is that it makes your code truly modular. The malus is that there is no SwiftUI-compatible, platform-agnostic version of these notifications yet, so you must #ifdef your way through.
Here is the code in my AppEnvironment @StateObject, which now lives as a singleton in my App. (Please note my final version has all this code, and much more, directly in store and photoLibrary, as there is no reason to centralize it.)
#if os(macOS)
NotificationCenter.default.addObserver(
    forName: NSApplication.didFinishLaunchingNotification,
    object: nil,
    queue: .main) { [weak self] notification in
        self?.store.enabled = true
}

NotificationCenter.default.addObserver(
    forName: NSApplication.willTerminateNotification,
    object: nil,
    queue: .main) { [weak self] notification in
        self?.store.enabled = false
}
#else
NotificationCenter.default.addObserver(
    forName: UIApplication.didFinishLaunchingNotification,
    object: nil,
    queue: .main) { [weak self] notification in
        self?.store.enabled = true
}

NotificationCenter.default.addObserver(
    forName: UIApplication.willTerminateNotification,
    object: nil,
    queue: .main) { [weak self] notification in
        self?.store.enabled = false
}

NotificationCenter.default.addObserver(
    forName: UIApplication.didReceiveMemoryWarningNotification,
    object: nil,
    queue: .main) { [weak self] notification in
        self?.photoLibrary.clearCache()
}
#endif
Replying to my own question: I ended up requesting a Technical Incident. Although I didn't get the answer I was looking for, it helped me push a bit further, and after some personal trial and error I can at least get a better idea of the current printer's format by doing
static let defaultBestPaper = UIPrintPaper.bestPaper(forPageSize: CGSize(width: 595, height: 842), withPapersFrom: [])
before creating my first document. This returns the current printer's default paper and margins. The CGSize is merely A4 paper; put whatever you want in there (Letter, A4, ...).
The very obvious caveats are that there is no way to know whether a printer is actually selected, whether the value makes sense, or whether the user will choose another paper size. This is merely useful for a general idea and should not be used as a final, or even usable, value. But between that, a 0.5 in margin approximation, and using the country to guess the paper size, I find it not too bad.
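To illustrate, here is a hedged sketch of how the returned paper can be turned into concrete margins, including the 0.5 in fallback mentioned above; margins(for:) is just a helper name I made up, not part of UIKit:
import UIKit

func margins(for paper: UIPrintPaper) -> UIEdgeInsets {
    let size = paper.paperSize
    let printable = paper.printableRect
    guard size.width > 0, size.height > 0 else {
        // Nothing sensible reported: fall back to the 0.5 in (36 pt) approximation.
        return UIEdgeInsets(top: 36, left: 36, bottom: 36, right: 36)
    }
    // printableRect is inset from paperSize by the printer's unprintable borders, in points.
    return UIEdgeInsets(top: printable.minY,
                        left: printable.minX,
                        bottom: size.height - printable.maxY,
                        right: size.width - printable.maxX)
}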
Thank you for your answer, ddijitall (https://developer.apple.com/forums/profile/ddijitall). We are in the same boat. Although the features work and I understand why they went that route, it lacks polish.
Cheers and happy holidays!
I had a lot of "fun" finding a solution for this one. As explained by the Frameworks Engineer, the problem lies in Apple's JPEG transcoding: you are actually asking for an "Image" representation, and therein lies the issue.
The Simulator's example photos helpfully come in multiple image formats. One of them is a ".heic" file, which is not a JPEG, and this is the one that usually borks.
My solution is to go through all the provided representations for every photo, figure out what each one is, and only transcode as a last resort. With this, I haven't had a single "No such file or directory" since.
import UniformTypeIdentifiers

// Ordered roughly from "use as-is" to "transcode as a last resort".
let supportedRepresentations = [UTType.rawImage.identifier,
                                UTType.tiff.identifier,
                                UTType.bmp.identifier,
                                UTType.png.identifier,
                                UTType.heif.identifier,
                                UTType.heic.identifier,
                                UTType.jpeg.identifier,
                                UTType.webP.identifier,
                                UTType.gif.identifier]

// `result` is the PHPickerResult handed to the picker delegate.
for representation in supportedRepresentations {
    if result.itemProvider.hasRepresentationConforming(toTypeIdentifier: representation, fileOptions: .init()) {
        result.itemProvider.loadInPlaceFileRepresentation(forTypeIdentifier: representation) { (originalUrl, inPlace, error) in
            // Handle the file at originalUrl here (copy it out if inPlace is false); my handling code is elided.
        }
        break // stop at the first matching representation
    }
}
Thank you Maven for the reply!
I am unsure whether this is the right answer, as it has nothing to do with TMH, but until someone authoritatively says it's something else, I'll give you the checkmark! I waited a month to give other answers a chance.
Honestly, lastcookie, I gave up. I removed all discrepancies between Portrait and Landscape and stopped trying to be fancy. It's one of my grudges with SwiftUI: if you stray from its intents and purposes, even while trying to behave like a system app and follow good practices, things will mostly work, but not all the time.
@Claude31 - I mean I have a 32-bit image with an alpha channel, so my pixels are (r, g, b, 255) for fully opaque and (r, g, b, 0) for fully transparent. Since the image has transparency in it, a tap registering in a place where you can see through the image is totally counter-intuitive.
@OOPer I only thought through the code, I don't have the actual code itself, but it mostly boils down to building a side channel next to the traditional Tap: a tap delegate sends the global coordinates through the Environment. They are then processed by any object that wants taps; each object is responsible for knowing whether the tap falls on top of itself (through geometry checks), as well as doing the actual check against the image's pixels for its colouring (probably worthwhile to sample around a 5-pixel neighbourhood so it's not too precise). Lastly, the objects need a holder that knows their respective Z ordering (mine would be in the model, as I already know the innate z ordering). A sketch of the pixel check is at the end of this reply.
The problem is that it doesn't actually scale, as every tappable object would need to listen to this. An optimization might be to keep track of the global geometries of all the objects in an ordered list and walk down that list in the manager itself, foregoing the SwiftUI event system entirely, then call back into the object to say "you got tapped there, do you want it?".
So... mostly, recreating an entire artificial, makeshift, slow event-passing system alongside SwiftUI, just because...
There might be other solutions, but if none exists, I would propose that Apple add a Gesture handler that lets a view "accept", "drain", or "ignore" a Gesture, alongside local coordinates, so the object could give its blessing. That would solve my issue and also allow very complex operations.
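For what it's worth, here is a minimal sketch of the per-pixel alpha check I have in mind, assuming the tap point has already been converted into the image's pixel coordinate space (Core Graphics orientation); isOpaque(_:at:) is a hypothetical helper, not code from my app:
import UIKit

func isOpaque(_ image: UIImage, at point: CGPoint, threshold: UInt8 = 10) -> Bool {
    guard let cgImage = image.cgImage,
          point.x >= 0, point.y >= 0,
          Int(point.x) < cgImage.width, Int(point.y) < cgImage.height else { return false }

    // Render just the pixel we care about into a known RGBA8 buffer,
    // so we don't have to reason about the source image's byte layout.
    var pixel = [UInt8](repeating: 0, count: 4)
    return pixel.withUnsafeMutableBytes { buffer -> Bool in
        guard let context = CGContext(data: buffer.baseAddress,
                                      width: 1, height: 1,
                                      bitsPerComponent: 8, bytesPerRow: 4,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return false }
        context.translateBy(x: -point.x, y: -point.y)
        context.draw(cgImage, in: CGRect(x: 0, y: 0,
                                         width: CGFloat(cgImage.width),
                                         height: CGFloat(cgImage.height)))
        return buffer[3] > threshold // alpha component of the (r, g, b, a) pixel
    }
}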
Same here. I tried a dozen times and gave up last night. I now run the iOS and iPadOS tests but leave the macOS tests out. The file does exist in the artifacts and the tests run properly locally, but they fail to run outside my machine.
This is not the first time I have hit such an issue, and it's very annoying. But here is my solution. First, I add the following public disambiguation functions (most are unused; they're provided for completeness, since we're there anyway):
public extension BackingData {
    func setPersistentModelValue<Value>(forKey key: KeyPath<Self.Model, Value>, to newValue: Value) where Value: PersistentModel {
        setValue(forKey: key, to: newValue)
    }

    func setPersistentModelValue<Value>(forKey key: KeyPath<Self.Model, Value?>, to newValue: Value?) where Value: PersistentModel {
        setValue(forKey: key, to: newValue)
    }

    func setRelationshipCollectionValue<Value, OtherModel>(forKey key: KeyPath<Self.Model, Value>, to newValue: Value) where Value: RelationshipCollection, OtherModel == Value.PersistentElement {
        setValue(forKey: key, to: newValue)
    }

    func getPersistentModelValue<Value>(forKey key: KeyPath<Self.Model, Value>) -> Value where Value: PersistentModel {
        getValue(forKey: key)
    }

    func getPersistentModelValue<Value>(forKey key: KeyPath<Self.Model, Value?>) -> Value? where Value: PersistentModel {
        getValue(forKey: key)
    }

    func getRelationshipCollectionValue<Value, OtherModel>(forKey key: KeyPath<Self.Model, Value>) -> Value where Value: RelationshipCollection, OtherModel == Value.PersistentElement {
        getValue(forKey: key)
    }
}
public extension PersistentModel {
    func setPersistentModelValue<Value>(forKey key: KeyPath<Self, Value>, to newValue: Value) where Value: PersistentModel {
        setValue(forKey: key, to: newValue)
    }

    func setPersistentModelValue<Value>(forKey key: KeyPath<Self, Value?>, to newValue: Value?) where Value: PersistentModel {
        setValue(forKey: key, to: newValue)
    }

    func setRelationshipCollectionValue<Value, OtherModel>(forKey key: KeyPath<Self, Value>, to newValue: Value) where Value: RelationshipCollection, OtherModel == Value.PersistentElement {
        setValue(forKey: key, to: newValue)
    }

    func getPersistentModelValue<Value>(forKey key: KeyPath<Self, Value>) -> Value where Value: PersistentModel {
        getValue(forKey: key)
    }

    func getPersistentModelValue<Value>(forKey key: KeyPath<Self, Value?>) -> Value? where Value: PersistentModel {
        getValue(forKey: key)
    }

    func getRelationshipCollectionValue<Value, OtherModel>(forKey key: KeyPath<Self, Value>) -> Value where Value: RelationshipCollection, OtherModel == Value.PersistentElement {
        getValue(forKey: key)
    }
}
Then, for every type, I add a concrete getter/setter (only the setter is needed for the initial value, but hey, I'm there anyway!) for the precise type I want. To use the initial Book example:
public extension BackingData {
    func setValue(forKey key: KeyPath<Self.Model, Book>, to newValue: Book) {
        setPersistentModelValue(forKey: key, to: newValue)
    }

    func setValue(forKey key: KeyPath<Self.Model, Book?>, to newValue: Book?) {
        setPersistentModelValue(forKey: key, to: newValue)
    }

    func getValue(forKey key: KeyPath<Self.Model, Book>) -> Book {
        getPersistentModelValue(forKey: key)
    }

    func getValue(forKey key: KeyPath<Self.Model, Book?>) -> Book? {
        getPersistentModelValue(forKey: key)
    }
}
Finally, for every type that has relationships, I add an extension to the type itself. In the initial example, let's assume Trip includes some Book, either as Book? or [Book]:
extension Trip {
    func setValue(forKey key: KeyPath<Trip, Book>, to newValue: Book) {
        setPersistentModelValue(forKey: key, to: newValue)
    }

    func setValue(forKey key: KeyPath<Trip, Book?>, to newValue: Book?) {
        setPersistentModelValue(forKey: key, to: newValue)
    }

    func getValue(forKey key: KeyPath<Trip, Book>) -> Book {
        getPersistentModelValue(forKey: key)
    }

    func getValue(forKey key: KeyPath<Trip, Book?>) -> Book? {
        getPersistentModelValue(forKey: key)
    }
}
This relies on the fact that the Swift compiler tries non-generic overloads of a function first, and we have now provided some. Only then does it fall back to the generic version, which is ambiguous, so we hand it the exact overload it wants up front. This pattern can be used for other ambiguous situations, such as the RelationshipCollection/OtherModel setters and getters; I have provided those in the generic functions too.
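As a quick sanity check, once the concrete overloads are in place, a call that previously reported an ambiguous use of setValue(forKey:to:) resolves again. This assumes the hypothetical Trip model declares a book: Book? relationship:
// trip and book are assumed to be existing Trip and Book instances.
trip.setValue(forKey: \.book, to: book)             // resolves to the concrete Trip overload
let current: Book? = trip.getValue(forKey: \.book)  // resolves to the concrete getter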