Awesome! Thank you very much.
I had gathered that a new version wasn't available for the DTK, but it seems like that should have been prominently mentioned in the release notes.
Screenshots are on SO - https://stackoverflow.com/questions/67550694/xcode-image-quick-look-is-not-working-on-m1-mac.
If I had to guess, I think the issue is that something that worked for iOS 13 and earlier no longer works properly with the iOS 15 SDK, or possibly the iOS 14 SDK.
Building for iOS 13 and then building for iOS 15 works, I suspect, because something is being cached when I build for iOS 13 that should not survive deleting the derived data. So I fear the “correct” behavior is for the iOS 15 SDK build to complain about something and fail to compile at all. I’m pretty much forced to support only iOS 15 because of a severe CoreML issue that it resolved, or I would gladly leave it at iOS 13 and shrug it off.
The most suspicious difference is that Float16 was not yet supported in iOS 13, so I am using UInt8 with a Float16 alias.
I tried commenting out my Float16 alias (which just pointed to UInt8) and using the SDK's Float16 instead, but it still had the same issue of taking 30 seconds to compile the shader and hanging.
Can anyone tell me what iOS uses under the hood to represent Float16? Does it have the same alignment as UInt8?
For all of my MTLBuffers and parameters I use MemoryLayout's stride. Should I be using size instead of stride for iOS 15?
All the Metal structures are defined in a C header, which is supposed to enforce Metal compatibility as I understand it.
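For reference, this is how I've been checking the layouts on-device (on arm64, where Swift's Float16 is a native 2-byte type, so it does not share UInt8's 1-byte alignment):

```swift
import Foundation

// Swift's Float16 is the IEEE-754 binary16 type and matches Metal's `half`:
// size, stride, and alignment are all 2 bytes, versus 1 byte for UInt8.
// (Stride equals size for these scalar types; the distinction only matters
// for structs with trailing padding.)
print(MemoryLayout<Float16>.size, MemoryLayout<Float16>.stride, MemoryLayout<Float16>.alignment) // 2 2 2
print(MemoryLayout<UInt8>.size, MemoryLayout<UInt8>.stride, MemoryLayout<UInt8>.alignment)       // 1 1 1
```

So a UInt8 alias for Float16 means the layout Swift computes for a struct no longer matches the half-based layout the shader sees, which is exactly what the shared C header is meant to prevent.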
It turns out that the slow compile only occurs (intermittently) when Shader Validation is enabled. If I disable Shader Validation, it compiles as expected.
When compiling for iOS 13, Shader Validation is not supported, so the slow compile does not occur because nothing is being validated.
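For anyone else hitting this, a minimal way I use to confirm which configuration a given run is using (I believe the scheme's Diagnostics checkbox corresponds to the MTL_SHADER_VALIDATION environment variable; logging it via ProcessInfo is just a convenience, not an official API for this):

```swift
import Foundation

// Shader Validation is toggled in the scheme (Diagnostics > Metal > Shader Validation),
// which, as I understand it, launches the app with MTL_SHADER_VALIDATION=1.
// Logging it makes it obvious which configuration a given slow-compile run was using.
let validation = ProcessInfo.processInfo.environment["MTL_SHADER_VALIDATION"] ?? "0"
print("Metal Shader Validation enabled:", validation == "1")
```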
As for the runtime hang, it turns out that I was indeed accessing a buffer out-of-bounds. Why it was never an issue before I can't say, but it was my bad. I was just super suspicious of the build since it was taking abnormally long to compile.
As for the apparent bug with Shader Validation, I have updated my Feedback Assistant report with sysdiagnose reports.
As I mentioned, I need a way for the user to save multi-gigabyte images that are too big to be used with UIKit, for example the ability to copy the full-resolution image with AirDrop.
I wish the user could use the standard "More actions" with the full-resolution image, but it may be too big to use as a UIImage, etc. So when an image is larger than the largest possible UIImage, there will be a way to copy the full-resolution image, plus standard sharing options for the thumbnail image.
As I mentioned, nothing happens without the .zip or .bin extension, and no errors are reported. It just doesn't do anything. Interestingly, if I make the app's Documents available in Files, Files behaves the same way: it silently fails trying to do anything with the image, but it works if the image has a .zip or .bin extension.
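In case it helps, the approach I'm leaning toward is handing the share sheet a file URL rather than a UIImage, so nothing has to be decoded into memory (a minimal sketch; the URL and presenting view controller are placeholders):

```swift
import UIKit

// Share the full-resolution image as a file URL so UIKit never has to decode it.
// `fullResURL` is a placeholder for the multi-gigabyte file in the app's Documents.
func shareFullResolutionImage(at fullResURL: URL, from presenter: UIViewController) {
    let activityVC = UIActivityViewController(activityItems: [fullResURL],
                                              applicationActivities: nil)
    // Required on iPad: anchor the popover somewhere sensible.
    activityVC.popoverPresentationController?.sourceView = presenter.view
    presenter.present(activityVC, animated: true)
}
```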
By the way, I have not been able to find the largest supported pixel dimensions for a UIImage, and it would be helpful to know. I can do trial and error, but I don't know whether the maximum supported size might differ per device.
For example, if I try to load a 50K image (1.8 billion pixels) as a UIImage on an M1 iPad Pro, it crashes. The compressed image is about 1 GB. Larger images are supported.
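For the thumbnail path, what I'd expect to work is downsampling with ImageIO so the full bitmap is never decoded (a sketch; the 4096-pixel cap is an arbitrary choice on my part, not a documented UIImage limit):

```swift
import UIKit
import ImageIO

// Downsample a huge image file to a UIImage-safe size without decoding the full bitmap.
// The 4096-pixel cap is arbitrary; it is NOT a documented maximum for UIImage.
func downsampledImage(at url: URL, maxPixelSize: Int = 4096) -> UIImage? {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions) else { return nil }
    let thumbnailOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: maxPixelSize
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, thumbnailOptions) else { return nil }
    return UIImage(cgImage: cgImage)
}
```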
I figured it out; apparently flexible shapes do not run on the ANE.
I really wish this were documented; the docs just say to use enumerated shapes for best performance.
But in this case, using flexible shapes is nearly 10 times slower, and I don't understand why they are supported at all with that kind of penalty.
It would have saved me a lot of trouble to avoid flexible shapes, since I now need to refactor inference in shipped products. There's a good chance this is why one of the products I spent six months of my life developing has largely been a flop. Very frustrating.
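For anyone trying to confirm whether a model can run on the ANE at all, one blunt check is to restrict the compute units and compare latency against .all (a sketch; the model URL is a placeholder, and .cpuAndNeuralEngine requires iOS 16+):

```swift
import CoreML

// Restrict Core ML to the CPU and Neural Engine only. If a model (or its
// flexible-shape variant) cannot run on the ANE, the latency gap versus
// .all is usually very obvious. `modelURL` is a placeholder for the
// compiled .mlmodelc in the app bundle.
func loadModelPreferringANE(at modelURL: URL) throws -> MLModel {
    let config = MLModelConfiguration()
    if #available(iOS 16.0, *) {
        config.computeUnits = .cpuAndNeuralEngine
    } else {
        config.computeUnits = .all
    }
    return try MLModel(contentsOf: modelURL, configuration: config)
}
```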
When using any vector overlays (MKPolygonRenderer, MKPolylineRenderer, etc.), my hack of using .aboveLabels and .aboveRoads doesn't work. When the raster layer uses .aboveRoads and the vector layer uses .aboveLabels, the vectors are not drawn properly. Since I need at least one vector layer with normal blending, it effectively means I have no blend modes other than normal for raster overlays.
I created a minimal reproducible project. Clicking the button adds an MKPolygonRenderer with normal blending, which breaks the raster overlay that uses .softLight blending.
Note that I have tried overriding the draw functions for the overlays and setting the blend modes there (I even tried transparency layers), but I have not found a workaround.
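For context, this is the basic shape of the hack that works for raster-only overlays (a sketch; the tile URL template is a placeholder, and MKOverlayRenderer's blendMode requires iOS 16+):

```swift
import UIKit
import MapKit

// Raster tiles at .aboveRoads with a non-normal blend mode, vectors at .aboveLabels.
// This works until a vector renderer is added, at which point the raster blending breaks.
func addOverlays(to mapView: MKMapView, polygon: MKPolygon) {
    let tiles = MKTileOverlay(urlTemplate: "https://example.com/tiles/{z}/{x}/{y}.png") // placeholder
    mapView.addOverlay(tiles, level: .aboveRoads)
    mapView.addOverlay(polygon, level: .aboveLabels)
}

func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    if let tiles = overlay as? MKTileOverlay {
        let renderer = MKTileOverlayRenderer(tileOverlay: tiles)
        renderer.blendMode = .softLight   // iOS 16+; this is the blending that breaks
        return renderer
    }
    if let polygon = overlay as? MKPolygon {
        let renderer = MKPolygonRenderer(polygon: polygon)
        renderer.fillColor = UIColor.systemRed.withAlphaComponent(0.3)  // normal blending
        return renderer
    }
    return MKOverlayRenderer(overlay: overlay)
}
```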
I did it like this:
(1) In mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation), add each MKAnnotationView to an array if it isn't already in the array.
(2) In mapViewDidChangeVisibleRegion(_ mapView: MKMapView), loop over the array and adjust each view as needed, based on a scale calculated from the mapView's region (see the sketch below).
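A rough sketch of those two steps (the scale formula is just an example; tune it for your use case):

```swift
import UIKit
import MapKit

final class ScalingAnnotationsDelegate: NSObject, MKMapViewDelegate {
    private var scaledViews: [MKAnnotationView] = []

    // (1) Remember each annotation view so it can be rescaled later.
    func mapView(_ mapView: MKMapView, viewFor annotation: MKAnnotation) -> MKAnnotationView? {
        let identifier = "scaled"
        let view = mapView.dequeueReusableAnnotationView(withIdentifier: identifier)
            ?? MKAnnotationView(annotation: annotation, reuseIdentifier: identifier)
        view.annotation = annotation
        if !scaledViews.contains(view) {
            scaledViews.append(view)
        }
        return view
    }

    // (2) Rescale every remembered view as the visible region changes.
    func mapViewDidChangeVisibleRegion(_ mapView: MKMapView) {
        // Example scale: shrink views as the visible span grows. Tune to taste.
        let span = mapView.region.span.latitudeDelta
        let scale = max(0.3, min(1.0, 0.05 / span))
        for view in scaledViews {
            view.transform = CGAffineTransform(scaleX: scale, y: scale)
        }
    }
}
```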
One more source of confusion for me, which hopefully someone can clarify: the headers state Image Playground is available with iOS 18.1, but articles I've read state it is new in iOS 18.2. Which is the correct requirement?
I don't understand the mystery. As a developer, I don't even know what devices will work with the new features I worked hard to develop.
That seems like an issue. I'm just asking for a little guidance because I cannot find this documented anywhere. Please help.
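For now I'm gating the feature on the stricter of the two stated requirements until this is clarified (a sketch; the bare, unconfigured init is only for illustration, and whether 18.1 or 18.2 is correct is exactly the open question):

```swift
import UIKit
#if canImport(ImagePlayground)
import ImagePlayground
#endif

// Gate on the stricter of the two stated requirements (18.2) until the
// 18.1-in-headers vs 18.2-in-articles discrepancy is resolved.
func presentImagePlayground(from presenter: UIViewController) {
    #if canImport(ImagePlayground)
    if #available(iOS 18.2, *) {
        // Unconfigured controller shown only for illustration; configure it
        // (delegate, source content, etc.) before presenting in a real app.
        let playground = ImagePlaygroundViewController()
        presenter.present(playground, animated: true)
    }
    #endif
}
```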
I tried pausing execution while waiting out the ~38 seconds; it shows an internal Metal command running on the main thread.
Even though the sample code I posted is already a simplified version of what I was using for testing, I created an absolutely bare-bones MKTileOverlay, and the same issue does not occur there.
I was going to delete this post until I can find out exactly what is causing the issue, but I don't see how.
Thanks, Ed. As I mentioned, I don't know how to recreate it using a bare-bones implementation like the sample code. I am using a framework called MapCache, which worked until sometime around iOS 18, but for the video I shot, I forced it to use the simple code I posted. If I can figure out what is causing it, I will post a sample project.
I encountered this issue while trying to work around the reproducible issues FB16009863 and FB14553276. But when I use simple code that doesn't cause the issue reported above, it triggers the reproducible issue FB13989005, which causes the entire screen to flicker every time the zoom level changes with my very simplified code (provided with that FB), which is based on Apple's sample code. Sadly, it makes zooming a rather jarring experience. The only thing I can do to minimize it is to disable the canReplaceMapContent property, but then MapKit downloads and renders Apple's maps, which will be completely covered, so it is a waste of bandwidth and processing/energy consumption.
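For reference, this is roughly the bare-bones setup that does not reproduce the slow internal Metal command but does show the FB13989005 flicker while canReplaceMapContent is enabled (the tile URL template is a placeholder):

```swift
import MapKit

// Bare-bones tile overlay setup, roughly matching Apple's sample code.
// The URL template is a placeholder for an actual tile server.
func installTileOverlay(on mapView: MKMapView) {
    let overlay = MKTileOverlay(urlTemplate: "https://example.com/tiles/{z}/{x}/{y}.png")
    overlay.canReplaceMapContent = true   // disabling this reduces the flicker but wastes bandwidth
    mapView.addOverlay(overlay, level: .aboveLabels)
}

func mapView(_ mapView: MKMapView, rendererFor overlay: MKOverlay) -> MKOverlayRenderer {
    guard let tiles = overlay as? MKTileOverlay else {
        return MKOverlayRenderer(overlay: overlay)
    }
    return MKTileOverlayRenderer(tileOverlay: tiles)
}
```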