
SwiftUI glitch with colorEffect shader & orientation change

Hi, I have the following SwiftUI code:

```swift
Image(uiImage: image)
    .resizable()
    .aspectRatio(contentMode: .fit)
    .colorEffect(ShaderLibrary.AlphaConvert())
```

and the following shader:

```metal
[[ stitchable ]] half4 AlphaConvert(float2 position, half4 currentColor) {
    return half4(currentColor.r > 0.5, currentColor.r <= 0.5, 0, currentColor.r > 0.5);
}
```

I am loading a full-resolution (24 MP) image from my photo library. The image initially displays fine: portions of the image are red and the rest black (due to alpha blending). However, after rotating the device, I get an image that is a combination of red and green. Note that the green pixels from the shader have alpha 0 and hence should never be visible. Is there something special that needs to be done on orientation changes so that the shader keeps working?
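One thing that may be worth ruling out (an assumption on my part, not confirmed anywhere in this post): SwiftUI shaders generally composite with premultiplied alpha, and the green branch above produces a color like (0, 1, 0, 0), i.e. a nonzero green channel under alpha 0, which is not a valid premultiplied color. Such a value can happen to composite correctly on the first render and then misbehave when the view re-renders through a different path after rotation. A minimal sketch of a premultiplied variant:

```metal
// Sketch only: assumes colorEffect expects a premultiplied-alpha return value.
// When alpha is 0, every color channel must also be 0, so the "green" branch
// collapses to transparent black instead of (0, 1, 0, 0).
[[ stitchable ]] half4 AlphaConvert(float2 position, half4 currentColor) {
    half isRed = half(currentColor.r > 0.5h);
    return half4(isRed, 0.0h, 0.0h, isRed); // rgb already scaled by alpha
}
```

If the rotated orientation renders through an intermediate texture or a different compositing path, an invalid premultiplied color is exactly the kind of value that would only show up there.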
OSLog Logger & SwiftUI
Hi, are there any macros that can be used with Logger to automatically capture the SwiftUI context a message originates from? Currently, I am using something similar to:

```swift
Logger.misc.error("In \(#fileID), \(#function): \(error)")
```

The problem is that #function only tells me the message originates in body. Ideally, I'd like to be able to tell whether it came from an onAppear, onChange, Button action, alert, etc. I realize I can enter this manually, but I was wondering if there is a more automated way to capture this info. Thx,
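As far as I know there is no built-in macro that can name the enclosing view modifier, because #function is expanded inside whatever closure ends up containing it (usually body). A common workaround is a wrapper like the hypothetical helper below (not a system API): the source-location macros move into default arguments, which Swift evaluates at the call site, so only the modifier name remains manual:

```swift
import OSLog

extension Logger {
    /// Hypothetical convenience wrapper, not a system API. The default-valued
    /// source-location arguments are evaluated at the call site, so file,
    /// function, and line come for free; `context` is the one piece that
    /// still has to be supplied by hand ("onAppear", "Button action", ...).
    func error(_ error: Error,
               context: String = "",
               fileID: String = #fileID,
               function: String = #function,
               line: Int = #line) {
        let suffix = context.isEmpty ? "" : " [\(context)]"
        self.error("In \(fileID):\(line), \(function)\(suffix): \(error.localizedDescription)")
    }
}
```

With that in place, `Logger.misc.error(error, context: "onAppear")` reduces the manual part to the one string the compiler genuinely cannot infer.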
Running out of memory analyzing images with ImageRequestHandler
Hi, I'm trying to analyze images in my Photos library with the following code:

```swift
func analyzeImages(_ inputIDs: [String]) {
    let manager = PHImageManager.default()
    let option = PHImageRequestOptions()
    option.isSynchronous = true
    option.isNetworkAccessAllowed = true
    option.resizeMode = .none
    option.deliveryMode = .highQualityFormat

    let concurrentTasks = 1
    let clock = ContinuousClock()
    let duration = clock.measure {
        let group = DispatchGroup()
        let sema = DispatchSemaphore(value: concurrentTasks)
        for entry in inputIDs {
            if let asset = PHAsset.fetchAssets(withLocalIdentifiers: [entry], options: nil).firstObject {
                print("analyzing asset: \(entry)")
                group.enter()
                sema.wait()
                manager.requestImage(for: asset,
                                     targetSize: PHImageManagerMaximumSize,
                                     contentMode: .aspectFit,
                                     options: option) { (result, info) in
                    if let result = result {
                        Task {
                            print("retrieved asset: \(entry)")
                            let aestheticsRequest = CalculateImageAestheticsScoresRequest()
                            let fingerprintRequest = GenerateImageFeaturePrintRequest()
                            let inputImage = result.cgImage!
                            let handler = ImageRequestHandler(inputImage)
                            let (aesthetics, fingerprint) = try await handler.perform(aestheticsRequest, fingerprintRequest)
                            // save results
                            print("finished asset: \(entry)")
                            sema.signal()
                            group.leave()
                        }
                    } else {
                        group.leave()
                    }
                }
            }
        }
        group.wait()
    }
    print("analyzeImages: Duration \(duration)")
}
```

When running this code, only two requests are processed simultaneously (due to the semaphore). However, if I call the function with a large list of images (>100), memory usage balloons to over 1.6 GB and the app crashes. If I call it with a smaller number of images, the loop completes and the memory is freed. When I use Instruments to look for memory leaks, it indicates no leaks are found, but there are 150+ VM: IOSurface allocations by CMPhoto, CoreVideo, and CoreGraphics at 35 MB each. Shouldn't each surface be released when its task completes?
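I can't verify the exact cause from here, but one pattern worth trying is to drop the semaphore/DispatchGroup/Task split entirely and bridge requestImage into async/await, processing the images strictly one at a time. That scopes each full-resolution UIImage to a single loop iteration, so it can be released before the next request starts, and it avoids blocking threads between callbacks. A sketch under those assumptions (same Vision requests as in the post; whether this alone accounts for all of the IOSurface growth is a guess):

```swift
import Photos
import UIKit
import Vision

/// Sketch only: a sequential variant of the loop above. Names and requests
/// are taken from the original post; that this resolves the IOSurface
/// accumulation is an assumption, not something I have verified.
func analyzeImagesSequentially(_ inputIDs: [String]) async throws {
    let manager = PHImageManager.default()
    let options = PHImageRequestOptions()
    options.isSynchronous = false
    options.isNetworkAccessAllowed = true
    options.deliveryMode = .highQualityFormat // assumed to deliver a single callback

    for entry in inputIDs {
        guard let asset = PHAsset.fetchAssets(withLocalIdentifiers: [entry],
                                              options: nil).firstObject else { continue }

        // Bridge the callback API into async/await so the image is scoped
        // to this iteration and released before the next request starts.
        let image: UIImage? = await withCheckedContinuation { continuation in
            manager.requestImage(for: asset,
                                 targetSize: PHImageManagerMaximumSize,
                                 contentMode: .aspectFit,
                                 options: options) { result, _ in
                continuation.resume(returning: result)
            }
        }

        guard let cgImage = image?.cgImage else { continue }
        let handler = ImageRequestHandler(cgImage)
        let (aesthetics, fingerprint) = try await handler.perform(
            CalculateImageAestheticsScoresRequest(),
            GenerateImageFeaturePrintRequest())
        // save results...
        _ = (aesthetics, fingerprint)
    }
}
```

Even if the totals don't change, running one image at a time makes the Instruments picture easier to read: if the surfaces still accumulate with a strictly sequential loop, the caching is happening inside PHImageManager rather than in your task structure.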