Post · Replies · Boosts · Views · Activity

Checking authorization status of AVCaptureDevice or CLLocationManager gives runtime warnings in iOS 18
I have the following code in my ObservableObject class, and Xcode recently started flagging purple runtime issues for it (probably since iOS 18):

Issue 1: Performing I/O on the main thread can cause slow launches.
Issue 2: Interprocess communication on the main thread can cause non-deterministic delays.
Issue 3: Interprocess communication on the main thread can cause non-deterministic delays.

Here is the code:

    @Published var cameraAuthorization: AVAuthorizationStatus
    @Published var micAuthorization: AVAuthorizationStatus
    @Published var photoLibAuthorization: PHAuthorizationStatus
    @Published var locationAuthorization: CLAuthorizationStatus
    var locationManager: CLLocationManager

    override init() {
        // Issue 1 (Performing I/O on the main thread can cause slow launches.)
        cameraAuthorization = AVCaptureDevice.authorizationStatus(for: AVMediaType.video)
        micAuthorization = AVCaptureDevice.authorizationStatus(for: AVMediaType.audio)
        photoLibAuthorization = PHPhotoLibrary.authorizationStatus(for: .addOnly)

        // Issue 1: Performing I/O on the main thread can cause slow launches.
        locationManager = CLLocationManager()
        locationAuthorization = locationManager.authorizationStatus

        super.init()

        // Issue 2: Interprocess communication on the main thread can cause non-deterministic delays.
        locationManager.delegate = self
    }

And also in the route change notification handler for AVAudioSession.routeChangeNotification:

    // Issue 3: Hangs - Interprocess communication on the main thread can cause non-deterministic delays.
    let categoryPlayback = (AVAudioSession.sharedInstance().category == .playback)

How can simply checking authorization status trigger these issues, and what is the fix here?
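A minimal sketch of one direction these warnings point toward (untested, and the class name PermissionsModel is made up for illustration): give the published properties placeholder defaults, read the real statuses off the main actor, publish them back on the main actor, and let the location status come from the delegate callback instead of a synchronous read in init.

    import AVFoundation
    import Photos
    import CoreLocation

    final class PermissionsModel: NSObject, ObservableObject, CLLocationManagerDelegate {
        @Published var cameraAuthorization: AVAuthorizationStatus = .notDetermined
        @Published var micAuthorization: AVAuthorizationStatus = .notDetermined
        @Published var photoLibAuthorization: PHAuthorizationStatus = .notDetermined
        @Published var locationAuthorization: CLAuthorizationStatus = .notDetermined
        let locationManager = CLLocationManager()

        override init() {
            super.init()
            locationManager.delegate = self   // the delegate callback below reports the initial status

            // Read the statuses away from the main actor, then publish the results back on it.
            Task.detached(priority: .userInitiated) { [weak self] in
                let camera = AVCaptureDevice.authorizationStatus(for: .video)
                let mic = AVCaptureDevice.authorizationStatus(for: .audio)
                let photos = PHPhotoLibrary.authorizationStatus(for: .addOnly)
                await MainActor.run {
                    self?.cameraAuthorization = camera
                    self?.micAuthorization = mic
                    self?.photoLibAuthorization = photos
                }
            }
        }

        func locationManagerDidChangeAuthorization(_ manager: CLLocationManager) {
            // Called when the manager is created and whenever the status changes.
            locationAuthorization = manager.authorizationStatus
        }
    }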
Replies: 1 · Boosts: 0 · Views: 180 · Activity: 2w
Tap Gesture on Subview disables drag gesture on superview
I have the following two views in SwiftUI. The first view, GestureTestView, has a drag gesture defined on its overlay view (call it the indicator view) and contains a subview, ContentTestView, that has a tap gesture attached to it. The problem is that the tap gesture in ContentTestView blocks the drag gesture on the indicator view. I have tried everything, including simultaneous gestures, but nothing seems to work because the gestures are on different views. It's easy to reproduce by copying the code below and running it in an Xcode preview.

    import SwiftUI

    struct GestureTestView: View {
        @State var indicatorOffset: CGFloat = 10.0

        var body: some View {
            ContentTestView()
                .overlay(alignment: .leading, content: {
                    Capsule()
                        .fill(Color.mint.gradient)
                        .frame(width: 8, height: 60)
                        .offset(x: indicatorOffset)
                        .gesture(
                            DragGesture(minimumDistance: 0)
                                .onChanged({ value in
                                    indicatorOffset = min(max(0, 10 + value.translation.width), 340)
                                })
                                .onEnded { value in
                                }
                        )
                })
        }
    }

    #Preview {
        GestureTestView()
    }

    struct ContentTestView: View {
        @State var isSelected = false

        var body: some View {
            HStack(spacing: 0) {
                ForEach(0..<8) { index in
                    Rectangle()
                        .fill(index % 2 == 0 ? Color.blue : Color.red)
                        .frame(width: 40, height: 40)
                }
                .overlay {
                    if isSelected {
                        RoundedRectangle(cornerRadius: 5)
                            .stroke(.yellow, lineWidth: 3.0)
                    }
                }
            }
            .onTapGesture {
                isSelected.toggle()
            }
        }
    }

    #Preview {
        ContentTestView()
    }
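For completeness, one variation that is sometimes suggested for this kind of conflict (untested here, and not a verified fix): attach the drag to the indicator with highPriorityGesture so SwiftUI is asked to prefer it over other gestures. The view name GestureTestViewVariant is made up; it reuses ContentTestView from above.

    import SwiftUI

    struct GestureTestViewVariant: View {
        @State var indicatorOffset: CGFloat = 10.0

        var body: some View {
            ContentTestView()
                .overlay(alignment: .leading) {
                    Capsule()
                        .fill(Color.mint.gradient)
                        .frame(width: 8, height: 60)
                        .offset(x: indicatorOffset)
                        // Ask SwiftUI to give this drag precedence over competing gestures.
                        .highPriorityGesture(
                            DragGesture(minimumDistance: 0)
                                .onChanged { value in
                                    indicatorOffset = min(max(0, 10 + value.translation.width), 340)
                                }
                        )
                }
        }
    }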
Replies: 1 · Boosts: 0 · Views: 268 · Activity: Oct ’24
Selecting Metal 3.2 as language causes crash on iPhone 11 Pro (iOS 17.1.1)
Xcode 16 seems to have an issue with stitchable kernels in Core Image, which gives build errors as stated in this question. As a workaround, I selected Metal 3.2 as the Metal Language Revision in the Xcode project. It works on newer devices such as iPhone 13 Pro and above, but Metal texture creation fails on older devices such as iPhone 11 Pro. Is this a known issue, and is there a workaround? I tried setting the Metal Language Revision to 2.4, but the same build errors occur as reported in that question. Here is the code where the assertion failure happens on iPhone 11 Pro:

    let vertexShader = library.makeFunction(name: "vertexShaderPassthru")
    let fragmentShaderYUV = library.makeFunction(name: "fragmentShaderYUV")

    let pipelineDescriptorYUV = MTLRenderPipelineDescriptor()
    pipelineDescriptorYUV.rasterSampleCount = 1
    pipelineDescriptorYUV.colorAttachments[0].pixelFormat = .bgra8Unorm
    pipelineDescriptorYUV.depthAttachmentPixelFormat = .invalid
    pipelineDescriptorYUV.vertexFunction = vertexShader
    pipelineDescriptorYUV.fragmentFunction = fragmentShaderYUV

    do {
        try pipelineStateYUV = metalDevice?.makeRenderPipelineState(descriptor: pipelineDescriptorYUV)
    } catch {
        assertionFailure("Failed creating a render state pipeline. Can't render the texture without one.")
        return
    }
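A small diagnostic sketch (assuming the same library, metalDevice, and pipelineStateYUV variables as above) to check whether the shader functions fail to load from the library on the older device, and to surface the underlying pipeline error instead of only asserting:

    // Diagnostic sketch: verify the functions exist in the library on this device
    // and log the GPU family before attempting to build the pipeline state.
    guard let device = metalDevice else { return }

    print("Supports Metal 3 GPU family:", device.supportsFamily(.metal3))

    guard let vertexShader = library.makeFunction(name: "vertexShaderPassthru"),
          let fragmentShaderYUV = library.makeFunction(name: "fragmentShaderYUV") else {
        print("One or both shader functions are missing from the library on this device")
        return
    }

    let descriptor = MTLRenderPipelineDescriptor()
    descriptor.rasterSampleCount = 1
    descriptor.colorAttachments[0].pixelFormat = .bgra8Unorm
    descriptor.depthAttachmentPixelFormat = .invalid
    descriptor.vertexFunction = vertexShader
    descriptor.fragmentFunction = fragmentShaderYUV

    do {
        pipelineStateYUV = try device.makeRenderPipelineState(descriptor: descriptor)
    } catch {
        // The thrown error usually says more than the assertion message alone.
        print("makeRenderPipelineState failed:", error)
    }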
Replies: 3 · Boosts: 0 · Views: 361 · Activity: Sep ’24
Unable to build Core Image kernels in Xcode 16
My app is suddenly broken when I build it with Xcode 16; Core Image kernel compilation appears to be broken in Xcode 16. Answers on StackOverflow suggest using a downgraded version of the Core Image framework as a workaround, but I am not sure if there is a better solution out there. FYI, I am using [[ stitchable ]] kernels, and projects that use stitchable kernels are the ones showing the issue.

    air-lld: error: symbol(s) not found for target 'air64_v26-apple-ios17.0.0'
    metal: error: air-lld command failed with exit code 1 (use -v to see invocation)

    Showing Recent Messages
    /Users/Username/Camera4S-Swift/air-lld:1:1: ignoring file '/Applications/Xcode.app/Contents/Developer/Platforms/iPhoneOS.platform/Developer/SDKs/iPhoneOS.sdk/System/Library/Frameworks/CoreImage.framework/CoreImage.metallib', file AIR version (2.7) is bigger than the one of the target being linked (2.6)
Replies: 1 · Boosts: 2 · Views: 299 · Activity: Sep ’24
SwiftUI: keep background view stationary as main view is resized
I have implemented a sample video editing timeline using SwiftUI and am facing issues, so I am breaking the problem into chunks and posting each issue as a separate question. In the code below, I have a simple timeline built with an HStack comprising a left spacer, a right spacer (each represented as a simple black color), and a trimmer UI in the middle. The trimmer resizes as the left and right handles are dragged, and the left and right spacers adjust their widths accordingly.

Problem: I want the background thumbnails inside the trimmer (currently implemented as simple Rectangles filled with different colors) to stay stationary as the trimmer resizes. Currently they move along with the trimmer as it resizes, as seen in the gif below. How do I fix it?

    import SwiftUI

    struct SampleTimeline: View {
        let viewWidth: CGFloat = 340 // Width of HStack container for the timeline

        @State var frameWidth: CGFloat = 280 // Width of the trimmer

        var minWidth: CGFloat { 2 * chevronWidth + 10 } // Minimum width of the trimmer

        @State private var leftViewWidth: CGFloat = 20
        @State private var rightViewWidth: CGFloat = 20

        var chevronWidth: CGFloat {
            return 24
        }

        var body: some View {
            HStack(spacing: 0) {
                Color.black
                    .frame(width: leftViewWidth)
                    .frame(height: 70)

                HStack(spacing: 0) {
                    Image(systemName: "chevron.compact.left")
                        .frame(width: chevronWidth, height: 70)
                        .background(Color.blue)
                        .gesture(
                            DragGesture(minimumDistance: 0)
                                .onChanged({ value in
                                    leftViewWidth = max(leftViewWidth + value.translation.width, 0)

                                    if leftViewWidth > viewWidth - minWidth - rightViewWidth {
                                        leftViewWidth = viewWidth - minWidth - rightViewWidth
                                    }

                                    frameWidth = max(viewWidth - leftViewWidth - rightViewWidth, minWidth)
                                })
                                .onEnded { value in
                                }
                        )

                    Spacer()

                    Image(systemName: "chevron.compact.right")
                        .frame(width: chevronWidth, height: 70)
                        .background(Color.blue)
                        .gesture(
                            DragGesture(minimumDistance: 0)
                                .onChanged({ value in
                                    rightViewWidth = max(rightViewWidth - value.translation.width, 0)

                                    if rightViewWidth > viewWidth - minWidth - leftViewWidth {
                                        rightViewWidth = viewWidth - minWidth - leftViewWidth
                                    }

                                    frameWidth = max(viewWidth - leftViewWidth - rightViewWidth, minWidth)
                                })
                                .onEnded { value in
                                }
                        )
                }
                .foregroundColor(.black)
                .font(.title3.weight(.semibold))
                .background {
                    HStack(spacing: 0) {
                        Rectangle().fill(Color.red)
                            .frame(width: 70, height: 60)
                        Rectangle().fill(Color.cyan)
                            .frame(width: 70, height: 60)
                        Rectangle().fill(Color.orange)
                            .frame(width: 70, height: 60)
                        Rectangle().fill(Color.brown)
                            .frame(width: 70, height: 60)
                        Rectangle().fill(Color.purple)
                            .frame(width: 70, height: 60)
                    }
                }
                .frame(width: frameWidth)
                .clipped()

                Color.black
                    .frame(width: rightViewWidth)
                    .frame(height: 70)
            }
            .frame(width: viewWidth, alignment: .leading)
        }
    }

    #Preview {
        SampleTimeline()
    }
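An untested sketch of one compensation idea (not a confirmed fix): leading-align the thumbnail strip in the trimmer's background and shift it left by the distance the trimmer's leading edge has moved, so the strip stays pinned to the timeline while the trimmer window slides and clips over it. The 20-point constant is the spacers' initial width from the code above.

    // Sketch: in SampleTimeline, replace the trimmer's .background { ... } with a
    // leading-aligned, offset-compensated version so the thumbnails stay put.
    .background(alignment: .leading) {
        HStack(spacing: 0) {
            Rectangle().fill(Color.red).frame(width: 70, height: 60)
            Rectangle().fill(Color.cyan).frame(width: 70, height: 60)
            Rectangle().fill(Color.orange).frame(width: 70, height: 60)
            Rectangle().fill(Color.brown).frame(width: 70, height: 60)
            Rectangle().fill(Color.purple).frame(width: 70, height: 60)
        }
        // Cancel the movement of the trimmer's leading edge relative to its
        // initial position (leftViewWidth starts at 20).
        .offset(x: -(leftViewWidth - 20))
    }
    .frame(width: frameWidth)
    .clipped()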
Replies: 1 · Boosts: 0 · Views: 219 · Activity: Sep ’24
SwiftUI video editing timeline implementation
I am trying to build a video editing timeline using SwiftUI, consisting of a series of trimmers (sample code for one trimmer below). I plan to embed multiple of these trimmers in an HStack to make an editing timeline (the HStack in turn will be embedded in a ScrollView). What I want to achieve is that if I trim the end of any of these trimmers by dragging its left/right edge, the other trimmers on the timeline should move left/right to fill the gap. I understand that as the view shrinks/expands during trimming, there might be a need for spacers on the edges of the HStack whose widths are adjusted while trimming is ongoing. Right now the code I have for the trimmer uses a fixed frameWidth for the view, so the view occupies the full space even when the trimming ends move. It's not clear how to modify my code below to achieve what I want. This was straightforward to do in UIKit, by the way.

    import SwiftUI

    struct SimpleTrimmer: View {
        @State var frameWidth: CGFloat = 300
        let minWidth: CGFloat = 30

        @State private var leftOffset: CGFloat = 0
        @State private var rightOffset: CGFloat = 0

        @GestureState private var leftDragOffset: CGFloat = 0
        @GestureState private var rightDragOffset: CGFloat = 0

        private var leftAdjustment: CGFloat {
            // NSLog("Left offset \(leftOffset + leftDragOffset)")
            var adjustment = max(0, leftOffset + leftDragOffset)
            if frameWidth - rightOffset - adjustment - 60 < minWidth {
                adjustment = frameWidth - rightOffset - minWidth - 60.0
            }
            return adjustment
        }

        private var rightAdjustment: CGFloat {
            var adjustment = max(0, rightOffset - rightDragOffset)
            if frameWidth - adjustment - leftOffset - 60 < minWidth {
                adjustment = frameWidth - leftOffset - 60 - minWidth
            }
            return adjustment
        }

        var body: some View {
            HStack(spacing: 10) {
                Image(systemName: "chevron.compact.left")
                    .frame(width: 30, height: 70)
                    .background(Color.blue)
                    .offset(x: leftAdjustment)
                    .gesture(
                        DragGesture(minimumDistance: 0)
                            .updating($leftDragOffset) { value, state, trans in
                                state = value.translation.width
                            }
                            .onEnded { value in
                                var maxLeftOffset = max(0, leftOffset + value.translation.width)
                                if frameWidth - rightAdjustment - maxLeftOffset - 60 < minWidth {
                                    maxLeftOffset = frameWidth - rightAdjustment - minWidth - 60
                                }
                                leftOffset = maxLeftOffset
                            }
                    )

                Spacer()

                Image(systemName: "chevron.compact.right")
                    .frame(width: 30, height: 70)
                    .background(Color.blue)
                    .offset(x: -rightAdjustment)
                    .gesture(
                        DragGesture(minimumDistance: 0)
                            .updating($rightDragOffset) { value, state, trans in
                                state = value.translation.width
                            }
                            .onEnded { value in
                                var minRightOffset = max(0, rightOffset - value.translation.width)
                                if minRightOffset < leftAdjustment - 60 - minWidth {
                                    minRightOffset = leftAdjustment - 60 - minWidth
                                }
                                rightOffset = minRightOffset
                            }
                    )
            }
            .foregroundColor(.black)
            .font(.title3.weight(.semibold))
            .padding(.horizontal, 7)
            .padding(.vertical, 3)
            .background {
                RoundedRectangle(cornerRadius: 7)
                    .fill(.yellow)
                    .padding(.leading, leftAdjustment)
                    .padding(.trailing, rightAdjustment)
            }
            .frame(width: frameWidth)
        }
    }

    #Preview {
        SimpleTrimmer()
    }
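To make the layout idea concrete, here is a minimal, hypothetical container (the names TimelineClip and Timeline are made up for illustration): if each trimmer's laid-out width is derived from its clip's trimmed length instead of a fixed frameWidth, the HStack re-lays-out its children automatically and the siblings slide over to fill the gap.

    import SwiftUI

    struct TimelineClip: Identifiable {
        let id = UUID()
        var trimmedWidth: CGFloat // width in points derived from the clip's trimmed duration
    }

    struct Timeline: View {
        @State private var clips: [TimelineClip] = [
            TimelineClip(trimmedWidth: 200),
            TimelineClip(trimmedWidth: 140),
            TimelineClip(trimmedWidth: 260)
        ]

        var body: some View {
            ScrollView(.horizontal, showsIndicators: false) {
                HStack(spacing: 2) {
                    ForEach(clips) { clip in
                        // Stand-in for SimpleTrimmer: a real trimmer would take a
                        // Binding<TimelineClip> and update trimmedWidth from its drags,
                        // which makes the HStack reflow the neighbouring clips.
                        RoundedRectangle(cornerRadius: 7)
                            .fill(.yellow)
                            .frame(width: clip.trimmedWidth, height: 70)
                    }
                }
                .padding(.horizontal, 20)
            }
        }
    }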
Replies: 0 · Boosts: 0 · Views: 319 · Activity: Sep ’24
SwiftUI view body invoked in infinite loop
I am observing an infinite loop of view creation, deletion, and recreation when I move my app to the background and bring it back to the foreground. I am clueless about what is causing this repeated invocation of the view body, so I tried Self._printChanges() inside the view body. But all I get is the following in the console:

    ChildView: @self, _dismiss changed. // <--- This is the problematic view
    ParentView: unchanged.              // <--- This is the parent view

The issue has also been reported on the Apple Developer Forums, but no solution was found. What other options are there to debug this issue?
Replies: 1 · Boosts: 1 · Views: 346 · Activity: Jul ’24
Making a ScrollView-based discrete scrubber in SwiftUI
I am trying to recreate a discrete scrubber in SwiftUI with haptic feedback and snapping to the nearest integer step. I use a ScrollView and LazyHStack as follows:

    struct DiscreteScrubber: View {
        @State var numLines: Int = 100

        var body: some View {
            ScrollView(.horizontal, showsIndicators: false) {
                LazyHStack {
                    ForEach(0..<numLines, id: \.self) { _ in
                        Rectangle().frame(width: 2, height: 10, alignment: .center)
                            .foregroundStyle(Color.red)
                        Spacer().frame(width: 10)
                    }
                }
            }
        }
    }

Problem: I need to add a content inset of half the frame width of the ScrollView so that the first line in the scrubber starts at the center of the view (and so does the last line), and I also need to generate haptic feedback as it scrolls. This was easy in UIKit but is not obvious in SwiftUI.
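A rough sketch of one direction using the iOS 17 scroll APIs (untested; it assumes an iOS 17 deployment target, and the snapping granularity would still need tuning):

    import SwiftUI

    struct DiscreteScrubberSketch: View {
        let numLines: Int = 100
        @State private var centeredLine: Int?   // the tick currently nearest the center

        var body: some View {
            GeometryReader { proxy in
                ScrollView(.horizontal, showsIndicators: false) {
                    LazyHStack(spacing: 10) {
                        ForEach(0..<numLines, id: \.self) { _ in
                            Rectangle()
                                .frame(width: 2, height: 10)
                                .foregroundStyle(Color.red)
                        }
                    }
                    .scrollTargetLayout()
                }
                // Inset the content by half the view's width so the first and
                // last ticks can reach the center of the scrubber.
                .contentMargins(.horizontal, proxy.size.width / 2, for: .scrollContent)
                // Track which tick is nearest the center as the user scrolls.
                .scrollPosition(id: $centeredLine, anchor: .center)
                // Snap to individual ticks rather than stopping mid-way.
                .scrollTargetBehavior(.viewAligned)
                // Play a selection haptic whenever the centered tick changes.
                .sensoryFeedback(.selection, trigger: centeredLine)
            }
            .frame(height: 30)
        }
    }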
Replies: 0 · Boosts: 0 · Views: 519 · Activity: Feb ’24
AppTransaction issues for previous purchases under VPP
Dear StoreKit Engineers, I recently migrated my app from a paid model to freemium and am using AppTransaction to get the original purchase version and original purchase date, to determine whether a user already paid for the app. It seems to be working for normal App Store users, but I am now flooded with complaints from VPP (Volume Purchase Program) users who previously purchased the app. It seems the AppTransaction history is absent for these users, and they are asked to sign in with their Apple ID. What is the solution here?
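For reference, a minimal sketch of the kind of check being described (assumes iOS 16+; the version string "2.0" marking the freemium cutover is a made-up example):

    import Foundation
    import StoreKit

    // Returns true if the user bought the app before the switch to freemium,
    // based on the original purchase version in the signed app transaction.
    func hasLegacyPurchase() async -> Bool {
        do {
            let result = try await AppTransaction.shared   // may prompt for App Store / Apple ID sign-in
            let transaction = try result.payloadValue      // throws if the signature cannot be verified
            // Hypothetical cutover: versions before "2.0" were paid up front.
            return transaction.originalAppVersion.compare("2.0", options: .numeric) == .orderedAscending
        } catch {
            // Covers verification errors as well as missing purchase history.
            print("AppTransaction unavailable:", error)
            return false
        }
    }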
Replies: 1 · Boosts: 0 · Views: 581 · Activity: Jan ’24
SwiftUI Video Player autorotation issue
I am embedding a SwiftUI VideoPlayer in a VStack and see that the screen goes black when the device is rotated (i.e. the content disappears even though the video player gets autorotated). The issue happens even when I use AVPlayerViewController (as a UIViewControllerRepresentable). Is this a bug, or am I doing something wrong?

    var videoURL: URL
    let player = AVPlayer()

    var body: some View {
        VStack {
            VideoPlayer(player: player)
                .frame(maxWidth: .infinity)
                .frame(height: 300)
                .padding()
                .ignoresSafeArea()
                .background {
                    Color.black
                }
                .onTapGesture {
                    player.rate = player.rate == 0.0 ? 1.0 : 0.0
                }

            Spacer()
        }
        .ignoresSafeArea()
        .background(content: {
            Color.black
        })
        .onAppear {
            let audioSession = AVAudioSession.sharedInstance()
            do {
                try audioSession.setCategory(AVAudioSession.Category.playback,
                                             mode: AVAudioSession.Mode.default,
                                             options: AVAudioSession.CategoryOptions.duckOthers)
            } catch {
                NSLog("Unable to set session category to playback")
            }

            let playerItem = AVPlayerItem(url: videoURL)
            player.replaceCurrentItem(with: playerItem)
        }
    }
Replies: 0 · Boosts: 0 · Views: 750 · Activity: Jan ’24
AVSampleBufferDisplayLayer vs CAMetalLayer for displaying HDR
I have been using an MTKView to display CVPixelBuffers from the camera. I use many options to configure the color space of the MTKView/CAMetalLayer that may be needed to tone map content to the display (CAEDRMetadata, for instance). If, however, I use AVSampleBufferDisplayLayer, there are not many configuration options for color matching. I believe AVSampleBufferDisplayLayer uses the pixel buffer attachments to determine the native color space of the input image and does the tone mapping automatically. Does AVSampleBufferDisplayLayer have any limitations compared to MTKView, or can both be used without any compromise in functionality?
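For context, a minimal sketch of the kind of CAMetalLayer/EDR configuration being compared against (assuming HLG camera buffers; the specific property values are illustrative, not a recommendation):

    import MetalKit
    import QuartzCore

    @available(iOS 16.0, *)
    func configureForHDR(_ metalView: MTKView) {
        guard let layer = metalView.layer as? CAMetalLayer else { return }

        // Extended-range pixel format plus a wide-gamut, extended-linear color space.
        metalView.colorPixelFormat = .rgba16Float
        layer.colorspace = CGColorSpace(name: CGColorSpace.extendedLinearDisplayP3)
        layer.wantsExtendedDynamicRangeContent = true

        // Hint for tone mapping HLG content to the current display.
        layer.edrMetadata = CAEDRMetadata.hlg
    }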
Replies: 0 · Boosts: 0 · Views: 745 · Activity: Nov ’23
Xcode 15 freezes so many times on Intel-based MacBook Pro
The title says it all: Xcode 15 freezes in so many cases that it has to be force quit and reopened, whether it's opening Swift packages or recognising connected devices. This happens on an Intel-based 2018 MacBook Pro. Is the problem specific to Intel-based Macs, or is Xcode 15 super buggy on all devices? For instance, I cannot open and build this project downloaded from GitHub; specifically, opening the file named MetalViewUI.swift in the project hangs Xcode forever.
Replies: 0 · Boosts: 0 · Views: 718 · Activity: Nov ’23
Correctly process HDR in Metal Core Image Kernels (& Metal)
I am trying to carefully process HDR pixel buffers (10-bit YCbCr buffers) from the camera. I have watched all the WWDC videos on this topic but still have some doubts, expressed below.

Q. What assumptions are safe to make about sample values in Metal Core Image kernels? Are the sample values received in a Metal Core Image kernel linear or gamma corrected? Or does that depend on the workingColorSpace property, or on the input image that is supplied (through the imageByMatchingToColorSpace() API, etc.)? And what could the max and min values of these samples be in either case?

I see that setting workingColorSpace to NSNull() in the context creation options guarantees receiving the samples as-is, normalised to [0-1]. But then it's possible the values are non-linear (gamma corrected), and extracting linear values would involve writing conversion functions in the shader.

In short, how do you safely process HDR pixel buffers received from the camera (which are YCbCr420_10bit, and which I believe have gamma correction applied, so the Y in YCbCr is actually Y')? Can the AVFoundation team clarify this?
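A small sketch of the two context configurations this question is contrasting (illustrative only; the buffer handling and the kernel itself are omitted):

    import CoreImage
    import CoreGraphics

    // With color management on (the default behaviour), kernel samples arrive
    // converted into the working color space, i.e. as linear values.
    let linearWorkingSpace = CGColorSpace(name: CGColorSpace.extendedLinearITUR_2020)!
    let managedContext = CIContext(options: [
        .workingColorSpace: linearWorkingSpace,
        .workingFormat: CIFormat.RGBAh          // half-float working format, useful for HDR headroom
    ])

    // With color management off, kernel samples arrive as stored in the input texture
    // (possibly gamma encoded), and any linearization has to be done in the kernel itself.
    let unmanagedContext = CIContext(options: [
        .workingColorSpace: NSNull()
    ])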
Replies: 0 · Boosts: 0 · Views: 835 · Activity: Nov ’23
Adding multiple AVCaptureVideoDataOutput stalls captureSession
Adding multiple AVCaptureVideoDataOutputs is officially supported in iOS 16 and works well, except for certain configurations such as ProRes (the YCbCr422 pixel format), where the session fails to start if two video data outputs are added. Is this a known limitation or a bug? Here is the code:

    device.activeFormat = device.findFormat(targetFPS, resolution: targetResolution, pixelFormat: kCVPixelFormatType_422YpCbCr10BiPlanarVideoRange)!

    NSLog("Device supports tone mapping \(device.activeFormat.isGlobalToneMappingSupported)")

    device.activeColorSpace = .HLG_BT2020
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: CMTimeScale(targetFPS))
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: CMTimeScale(targetFPS))
    device.unlockForConfiguration()

    self.session?.addInput(input)

    let output = AVCaptureVideoDataOutput()
    output.alwaysDiscardsLateVideoFrames = true
    output.setSampleBufferDelegate(self, queue: self.samplesQueue)
    if self.session!.canAddOutput(output) {
        self.session?.addOutput(output)
    }

    let previewVideoOut = AVCaptureVideoDataOutput()
    previewVideoOut.alwaysDiscardsLateVideoFrames = true
    previewVideoOut.automaticallyConfiguresOutputBufferDimensions = false
    previewVideoOut.deliversPreviewSizedOutputBuffers = true
    previewVideoOut.setSampleBufferDelegate(self, queue: self.previewQueue)
    if self.session!.canAddOutput(previewVideoOut) {
        self.session?.addOutput(previewVideoOut)
    }

    self.vdo = vdo
    self.previewVDO = previewVideoOut

    self.session?.startRunning()

It works for other formats such as 10-bit YCbCr video-range HDR sample buffers, but there are a lot of frame drops when recording with AVAssetWriter at 4K@60 fps. Are these known limitations, or bad use of the API?
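Not a confirmed fix, but a small sketch of a defensive pattern worth comparing against: batch the input/output changes inside beginConfiguration()/commitConfiguration() and bail out explicitly when an output cannot be added, so a silently skipped output does not look like a stalled session. The helper name and parameters are made up for illustration.

    import AVFoundation

    // Sketch: apply all session topology changes as one atomic configuration.
    func attachOutputs(to session: AVCaptureSession,
                       input: AVCaptureDeviceInput,
                       delegate: AVCaptureVideoDataOutputSampleBufferDelegate,
                       samplesQueue: DispatchQueue,
                       previewQueue: DispatchQueue) -> Bool {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard session.canAddInput(input) else { return false }
        session.addInput(input)

        let recordingOut = AVCaptureVideoDataOutput()
        recordingOut.alwaysDiscardsLateVideoFrames = true
        recordingOut.setSampleBufferDelegate(delegate, queue: samplesQueue)
        guard session.canAddOutput(recordingOut) else { return false }
        session.addOutput(recordingOut)

        let previewOut = AVCaptureVideoDataOutput()
        previewOut.alwaysDiscardsLateVideoFrames = true
        previewOut.automaticallyConfiguresOutputBufferDimensions = false
        previewOut.deliversPreviewSizedOutputBuffers = true
        previewOut.setSampleBufferDelegate(delegate, queue: previewQueue)
        guard session.canAddOutput(previewOut) else { return false }   // a false here would explain a session that never starts
        session.addOutput(previewOut)

        return true
    }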
Replies: 1 · Boosts: 0 · Views: 699 · Activity: Nov ’23
Metal Core Image kernel workingColorSpace
I understand that by default, Core Image uses extended linear sRGB as the working color space for executing kernels. This means that the color values received (or sampled from a sampler) in a Metal Core Image kernel are linear values without gamma correction applied. But if we disable color management by setting

    let options: [CIContextOption: Any] = [CIContextOption.workingColorSpace: NSNull()]

do we receive the color values as they exist in the input texture (which may already have gamma correction applied)? In other words, are the color values received in the kernel gamma corrected, so that we need to manually convert them to linear values in the Metal kernel if required?
Replies: 0 · Boosts: 0 · Views: 683 · Activity: Nov ’23