
UIViewController not handling Apple Pencil Double Tap events
I am unable to get a UIViewController to receive the UIPencilInteractionDelegate "pencilInteractionDidTap" method in response to double taps from an Apple Pencil. I am testing on iPad Pro 11-inch and 12.9-inch models, both running iOS 13.3.1.

I have implemented the following:

    extension UIViewController: UIPencilInteractionDelegate {
        public func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
            print("Handle pencil double-tap")
        }
    }

In my UIViewController:

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        let pencilInteraction = UIPencilInteraction()
        pencilInteraction.delegate = self
        view.addInteraction(pencilInteraction)
    }

Note: if I create a new Single View iOS app and add this code, everything works as expected. However, the above code does not work with the custom UIViewController in my app. My UIViewController's view hosts a UIScrollView as well as overlaying Container Views, and handles touch events in both the view controller and scroll view classes.

I have also tried adding the UIPencilInteraction object to other UIViewControllers in my app, and the "pencilInteractionDidTap" method is not called in any of them.

Do I need to change something in the responder chain to get the UIViewController to handle "pencilInteractionDidTap" method calls?
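For comparison, here is a minimal, self-contained sketch of the kind of setup that works in a fresh project, with the delegate conformance declared directly on a concrete subclass rather than in an extension on UIViewController itself (the class name is a hypothetical placeholder):

    import UIKit

    // Hypothetical minimal controller: conforms to UIPencilInteractionDelegate
    // directly instead of via an extension on UIViewController.
    class PencilTapViewController: UIViewController, UIPencilInteractionDelegate {

        override func viewDidLoad() {
            super.viewDidLoad()
            // Attach the pencil interaction to this controller's view.
            let pencilInteraction = UIPencilInteraction()
            pencilInteraction.delegate = self
            view.addInteraction(pencilInteraction)
        }

        func pencilInteractionDidTap(_ interaction: UIPencilInteraction) {
            print("Handle pencil double-tap")
        }
    }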
2 replies · 0 boosts · 1k views · Feb ’20
CIRawFilter scaleFactor <= 0.5 creates incorrect-sized image when CIRAWFilterOption.allowDraftMode set to "true"
In iOS, I am creating a CIRAWFilter object by calling init() with the image file's URL and passing "CIRAWFilterOption.allowDraftMode : true" in the options dictionary. If I set scaleFactor = 0.5 and then read outputImage on the filter object, I get back a CIImage with an extent that is half the size of what I expect. If I set scaleFactor = 0.51, I get back a CIImage with the expected extent.

For example, starting with an original RAW image file of size 4,032 x 2,048:

If I set the scaleFactor to 0.75 and read outputImage on the CIRawFilter object, I get back a CIImage object with the extent 3,024 x 1,536, whose width and height are 75% of the original image's width and height.

However, if I set the scaleFactor to 0.50, I get back an image with the extent 1,008 x 512, which is 25% of the size of the original RAW image, not 50%.

If I set "allowDraftMode : false", then I always get back the correctly sized CIImage.

Is this a bug or expected behavior?
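A minimal sketch of the repro described above, assuming the CIFilter-based RAW API available at the time (the file URL is a placeholder):

    import CoreImage

    // Placeholder URL; substitute a real RAW file.
    let rawURL = URL(fileURLWithPath: "/path/to/photo.dng")

    // Draft mode on, scale factor 0.5.
    let options: [CIRAWFilterOption: Any] = [
        .allowDraftMode: true,
        .scaleFactor: 0.5
    ]

    let rawFilter = CIFilter(imageURL: rawURL, options: options)
    if let output = rawFilter.outputImage {
        // Reported behavior: with allowDraftMode = true, a 0.5 scale factor
        // yields an extent at 25% of the original size rather than 50%.
        print("extent: \(output.extent)")
    }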
1 reply · 0 boosts · 882 views · Feb ’21
Unable to use new CIRAWFilter API in iOS 15
I am unable to initialize a CIRAWFilter object and access its instance variables using the new CIRAWFilter class API in iOS 15. If I compile and execute the following code, I get a run-time error stating:

    -[CIRAWFilterImpl isGamutMappingEnabled]: unrecognized selector sent to instance 0x1065ec570

It appears that a "CIRAWFilterImpl" object is getting created, not a "CIRAWFilter" object. I cannot cast to that type, as it is unknown (most likely a private class).

My code:

    if #available(iOS 15.0, *) {
        if let rawFilter = CIRAWFilter(imageURL: self.imageURL) {
            let isGamutMappingEnabled = rawFilter.isGamutMappingEnabled
            print("isGamutMappingEnabled: \(isGamutMappingEnabled)")
        }
    }
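As a quick check on what the initializer actually returns at run time, a small diagnostic along these lines (assuming the same imageURL) would print the underlying class name:

    import CoreImage

    if #available(iOS 15.0, *), let rawFilter = CIRAWFilter(imageURL: imageURL) {
        // Prints the runtime class of the returned object, e.g. the private
        // "CIRAWFilterImpl" described above.
        print(String(describing: type(of: rawFilter)))
    }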
2 replies · 0 boosts · 1.3k views · Jan ’22
CompositionalLayout - UICollectionView cell height not calculated based on Image height
I am using Compositional Layout plus Diffable Data Source to display requested images from the Photos library in a single column inside a UICollectionView, using PhotoKit's Cached Image Manager. I would like the image cell to be the entire width of the collection view, with the height of the cell scaled to the height of the image using Aspect Fit content mode.

When I create the layout, I use estimated heights for both the item and group layout objects. But the initial height of each cell stays at the estimated height. As soon as I start scrolling, some of the cells do actually resize their height correctly, but not always.

Here is the sample code (replace the default ViewController.swift logic in a sample iOS project with the attached sample code).
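A minimal sketch of the estimated-height layout described above, assuming a single full-width column (the 300-point estimate is a placeholder):

    import UIKit

    // One full-width column; item and group both use an estimated height so
    // cells can self-size once Auto Layout resolves each image's height.
    func makeLayout() -> UICollectionViewCompositionalLayout {
        let itemSize = NSCollectionLayoutSize(
            widthDimension: .fractionalWidth(1.0),
            heightDimension: .estimated(300))
        let item = NSCollectionLayoutItem(layoutSize: itemSize)

        let groupSize = NSCollectionLayoutSize(
            widthDimension: .fractionalWidth(1.0),
            heightDimension: .estimated(300))
        let group = NSCollectionLayoutGroup.vertical(layoutSize: groupSize,
                                                     subitems: [item])

        let section = NSCollectionLayoutSection(group: group)
        return UICollectionViewCompositionalLayout(section: section)
    }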
0 replies · 0 boosts · 775 views · Nov ’22
How do I properly set tagged color data in MTKView and CIContext?
I have provided a test UIKit app which displays three different images, side by side, each inside a separate MTKView. Each image is tagged with a different color profile:

- Display P3
- uRGB
- Test RGB (from an image supplied in Apple's ImageApp sample)

I set up default values for all color spaces and formats. I then check if the image is tagged and, if so, I override those values with state from the tagged color space. The variables I am setting, with their defaults:

- "workingColorSpace" in the Metal CIContext, default = sRGB
- "workingFormat" in the Metal CIContext, default = RGBAf
- "outputColorSpace" in the Metal CIContext, default = displayP3
- "colorPixelFormat" in the MTKView, default = bgra8Unorm
- "colorSpace" in a CIRenderDestination that I use in the MTKView delegate draw method, default = CGColorSpaceCreateDeviceRGB()

I also set "pixelFormat" in the CIRenderDestination from MTKView.colorPixelFormat.

If the image is tagged, I override the following values with the tagged colorSpace:

- CIContext.workingColorSpace
- CIContext.outputColorSpace
- CIRenderDestination.colorSpace

If the tagged colorSpace.isWideGamutRGB == true, then I set the CIRenderDestination.colorSpace to extendedSRGB, ignoring the color space in the tagged wide-gamut color space, and set colorPixelFormat = bgr10_xr.

Results: the above scenario properly renders the Display P3 image and the uRGB image, but the "Test RGB" image fails. If I do not override the CIRenderDestination.colorSpace with a value from the tagged image, then the "Test RGB" image succeeds, but the "uRGB" image fails to render properly.

Question: do I have everything hooked up correctly and, if so, why does one image fail and the other succeed?

Link to sample project: https://www.dropbox.com/scl/fi/57u2fcrgdvys7jtzykzxt/ColorSpaceTest.zip?rlkey=unjeeiu7mi0wx9wfpylt78nwd&dl=0
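For reference, a sketch of the draw path described above, assuming a CIContext backed by the same Metal command queue; the delegate class and its stored properties are hypothetical stand-ins for the sample project's real ones:

    import CoreImage
    import MetalKit

    // Hypothetical MTKView delegate showing where CIRenderDestination's
    // colorSpace and pixelFormat are set per frame.
    final class CIDrawDelegate: NSObject, MTKViewDelegate {
        let ciContext: CIContext
        let commandQueue: MTLCommandQueue
        var image: CIImage = CIImage.empty()
        var destinationColorSpace: CGColorSpace = CGColorSpaceCreateDeviceRGB()

        init(device: MTLDevice) {
            commandQueue = device.makeCommandQueue()!
            ciContext = CIContext(mtlCommandQueue: commandQueue)
            super.init()
        }

        func mtkView(_ view: MTKView, drawableSizeWillChange size: CGSize) {}

        func draw(in view: MTKView) {
            guard let drawable = view.currentDrawable,
                  let buffer = commandQueue.makeCommandBuffer() else { return }

            let destination = CIRenderDestination(
                width: Int(view.drawableSize.width),
                height: Int(view.drawableSize.height),
                pixelFormat: view.colorPixelFormat,
                commandBuffer: buffer,
                mtlTextureProvider: { drawable.texture })
            // The override under test: for tagged images this would be the
            // tagged color space (or extendedSRGB for wide-gamut images).
            destination.colorSpace = destinationColorSpace

            _ = try? ciContext.startTask(toRender: image, to: destination)
            buffer.present(drawable)
            buffer.commit()
        }
    }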
2 replies · 0 boosts · 781 views · Mar ’24
Unable to load MKMapView from AppKit bundle in Catalyst app
I am trying to load an auxiliary window from an AppKit bundle that loads an NSViewController from a storyboard. This NSViewController displays an MKMapView. I want to avoid supporting multiple windows in my Catalyst app, so I do not want to load this window via a SceneDelegate; that is why I have chosen to go the AppKit bundle route.

If I load a similar view controller which contains a single label, then all works as expected. However, if I try to load the view controller with the MKMapView, the app crashes with the following error:

    *** Terminating app due to uncaught exception 'NSInvalidArgumentException', reason: '-[MKMapView nsli_piercingToken]: unrecognized selector sent to instance 0x15410f600'

Dropbox link to a sample app with buttons on a view to select either the "Map View" (crashing) or the "Label View" (working): https://www.dropbox.com/scl/fi/q9daurhzqvil9o2ry6k1t/CatalystMap.zip?rlkey=yjbqfn6uxfrfh1jgsdavagta7&dl=0
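For context, the Catalyst-side bundle loading in this approach might look roughly like the following; the bundle name, principal class, and selector are hypothetical placeholders, not the sample project's actual names:

    import Foundation

    // Load the AppKit plug-in bundle from the Catalyst app and ask its
    // (hypothetical) principal class to open the auxiliary window.
    if let pluginsURL = Bundle.main.builtInPlugInsURL {
        let bundleURL = pluginsURL.appendingPathComponent("AppKitGlue.bundle")
        if let bundle = Bundle(url: bundleURL), bundle.load(),
           let glueClass = bundle.principalClass as? NSObject.Type {
            let glue = glueClass.init()
            _ = glue.perform(Selector(("showAuxiliaryWindow")))
        }
    }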
2 replies · 0 boosts · 659 views · May ’24
How do I fetch all PHAssets from Photos Library that contain an edit (filter on hasAdjustments)?
I am trying to retrieve all PHAssets from the Photos library that have been edited. PHAsset contains the "hasAdjustments" boolean variable, but I cannot use that in an NSPredicate to filter for edited photos. What is the proper mechanism to filter out non-edited photos?

Here is the code I have written that crashes:

    let formatString = "mediaType = %d && hasAdjustments == YES"
    let allPhotosOptions = PHFetchOptions()
    allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
    allPhotosOptions.predicate = NSPredicate(format: formatString, PHAssetMediaType.image.rawValue)
    allPhotosOptions.includeAssetSourceTypes = [.typeUserLibrary, .typeiTunesSynced, .typeCloudShared]
    return PHAsset.fetchAssets(with: allPhotosOptions)

Thanks,
Rick
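One possible workaround, sketched under the assumption that in-memory filtering is acceptable (the function name is hypothetical): fetch by media type only, then filter on hasAdjustments after the fetch.

    import Photos

    // Fetch image assets, then keep only those whose current version includes
    // edits; hasAdjustments is readable on PHAsset even though using it in a
    // fetch predicate crashes, as described above.
    func fetchEditedImageAssets() -> [PHAsset] {
        let options = PHFetchOptions()
        options.sortDescriptors = [NSSortDescriptor(key: "creationDate",
                                                    ascending: true)]
        options.predicate = NSPredicate(format: "mediaType = %d",
                                        PHAssetMediaType.image.rawValue)
        options.includeAssetSourceTypes = [.typeUserLibrary,
                                           .typeiTunesSynced,
                                           .typeCloudShared]

        let fetchResult = PHAsset.fetchAssets(with: options)
        var edited: [PHAsset] = []
        fetchResult.enumerateObjects { asset, _, _ in
            if asset.hasAdjustments { edited.append(asset) }
        }
        return edited
    }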
0 replies · 1 boost · 262 views · Aug ’24