Post · Replies · Boosts · Views · Activity

Burst Mode Using Bracketed Capture?
Even though some apps out there have figured out how to do proper burst capture on par with the stock iOS Camera app, there is no official API for it. Currently I call capturePhoto over and over, which is not fast enough. I've seen hints floating around that the proper way to do this is bracketed capture. Any input on this? Thanks in advance.
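For concreteness, a bracketed request would look something like the sketch below. This is just my reading of the AVCapturePhotoBracketSettings API, not a confirmed burst-mode path; the frame count, the zero exposure bias, and the helper name are assumptions:

import AVFoundation

// Sketch (assumption): request several frames in one capturePhoto call, all at the
// same exposure bias, and collect them in the photo delegate callbacks.
func captureBracket(with photoOutput: AVCapturePhotoOutput, delegate: AVCapturePhotoCaptureDelegate) {
    let frameCount = min(3, photoOutput.maxBracketedCapturePhotoCount)
    let exposureSettings = (0..<frameCount).map { _ in
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,                                    // no RAW
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
        bracketedSettings: exposureSettings)
    // photoOutput(_:didFinishProcessingPhoto:error:) fires once per frame in the bracket.
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}

Whether repeating such brackets back-to-back actually matches the stock app's burst rate is exactly the open question.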
1
1
3.2k
Oct ’17
Capture Image Orientation
This is regarding image orientation when capturing with the device rotation lock on.

self.photoOutput.connection(with: AVMediaType.video)?.videoOrientation = self.stillImageOrientation

I have self.stillImageOrientation listening to the status bar orientation:

UIApplication.shared.statusBarOrientation

However, when the user activates rotation lock and then rotates the device, captured photos get saved in the wrong orientation. I expected this might solve the issue:

UIDevice.current.orientation

But that also stops changing once rotation lock is on. I noticed the native camera app still records the proper orientation even with rotation lock enabled. Is there a way to achieve this?
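One approach I'm considering (not confirmed to be what the native app does; the class name, thresholds, and landscape mapping below are assumptions) is to read the accelerometer via CoreMotion, which keeps reporting even under rotation lock:

import AVFoundation
import CoreMotion

// Sketch: derive the physical orientation from gravity; the landscape mapping
// may need flipping depending on the capture pipeline.
final class PhysicalOrientationTracker {
    private let motionManager = CMMotionManager()
    private(set) var captureOrientation: AVCaptureVideoOrientation = .portrait

    func start() {
        guard motionManager.isAccelerometerAvailable else { return }
        motionManager.accelerometerUpdateInterval = 0.2
        motionManager.startAccelerometerUpdates(to: .main) { [weak self] data, _ in
            guard let self = self, let a = data?.acceleration else { return }
            if abs(a.y) >= abs(a.x) {
                self.captureOrientation = a.y < 0 ? .portrait : .portraitUpsideDown
            } else {
                self.captureOrientation = a.x < 0 ? .landscapeRight : .landscapeLeft
            }
        }
    }

    func stop() {
        motionManager.stopAccelerometerUpdates()
    }
}

The tracked value could then be assigned to self.stillImageOrientation right before capture, instead of relying on the status bar orientation.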
0
0
874
Apr ’20
WidgetKit Memory Issue - Photo Asset Dimensions
With iOS 14 WidgetKit, I am trying to display a widget with a particular photo asset from the library. With a medium-size widget, I want the photo asset to fill the whole area (Aspect Fill), just like the native Photos widget.

struct My_WidgetEntryView: View {
    var entry: Provider.Entry

    var body: some View {
        PhotoView(user: 1)
    }
}

struct PhotoView: View {
    var user: Int
    @ObservedObject private var assetImageModel = AssetImageModel()

    init(user: Int) {
        self.user = user
        self.assetImageModel.loadAsset()
    }

    var body: some View {
        if self.assetImageModel.assetImage != nil {
            Image(uiImage: self.assetImageModel.assetImage!)
                .resizable()
        } else {
            Image(uiImage: UIImage(systemName: "photo")!)
                .resizable()
        }
    }
}

class AssetImageModel: ObservableObject {
    @Published var assetImage: UIImage?

    init() {
        loadAsset()
    }

    func loadAsset() {
        if PHPhotoLibrary.authorizationStatus() == .authorized {
            let allPhotosOptions = PHFetchOptions()
            allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
            let assetsFetchResults = PHAsset.fetchAssets(with: allPhotosOptions)
            if assetsFetchResults.count > 0 {
                let asset = assetsFetchResults.lastObject
                if asset != nil {
                    let options = PHImageRequestOptions()
                    options.isSynchronous = true
                    options.deliveryMode = .highQualityFormat
                    options.resizeMode = .exact
                    options.version = .current
                    PHCachingImageManager().requestImage(for: asset!,
                                                         targetSize: CGSize(width: 360 * UIScreen.main.scale, height: 170 * UIScreen.main.scale),
                                                         contentMode: .aspectFill,
                                                         options: options,
                                                         resultHandler: { image, _ in
                        //DispatchQueue.main.async {
                        self.assetImage = image
                        //}
                    })
                } else {
                    // Asset was nil
                }
            } else {
                // No assets
            }
        } else {
            // No permission
        }
    }
}

When the above is run, I get a crash with a memory warning stating that the allowed 30 MB memory limit has been exceeded. This is apparently due to the following:

- For the medium-size widget, I had to go with a 360 x 170 point size, since there was no way to get the actual view dimensions before the view gets displayed.
- The actual target size then has to be multiplied by the screen scale; the current test device is 3x.
- Aspect Fill has to be the fill type.

Because of this, PHCachingImageManager returns a 360-point width with a 480-point height for a portrait photo in order to fill the view. The final image ends up at 1080 x 1440 pixels, and I imagine this is the reason for the memory pressure.

Question
Is there a way to ask for a properly cropped 360 x 170 point image? The stock Photos widget seems to manage this.

Things I Tried
I tried normalizedCropRect in PHImageRequestOptions and had no luck. Lowering the image size by a huge margin works, but the image looks visibly blurred.
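For reference, this is the kind of crop request I attempted (a sketch, not a confirmed fix, and the helper name is hypothetical): compute a centered crop in the asset's unit coordinate space that already matches the widget's 360 x 170 aspect ratio, so the returned bitmap stays near 1080 x 510 pixels. normalizedCropRect requires resizeMode == .exact; whether PhotoKit honors it exactly here is the part that didn't work for me.

import Photos
import UIKit

func croppedWidgetImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let targetPointSize = CGSize(width: 360, height: 170)
    let scale = UIScreen.main.scale
    let targetPixelSize = CGSize(width: targetPointSize.width * scale,
                                 height: targetPointSize.height * scale)

    let assetAspect = CGFloat(asset.pixelWidth) / CGFloat(asset.pixelHeight)
    let targetAspect = targetPointSize.width / targetPointSize.height

    // Centered aspect-fill crop, expressed in the unit coordinate space of the original image.
    var crop = CGRect(x: 0, y: 0, width: 1, height: 1)
    if assetAspect > targetAspect {
        crop.size.width = targetAspect / assetAspect
        crop.origin.x = (1 - crop.size.width) / 2
    } else {
        crop.size.height = assetAspect / targetAspect
        crop.origin.y = (1 - crop.size.height) / 2
    }

    let options = PHImageRequestOptions()
    options.isSynchronous = true
    options.deliveryMode = .highQualityFormat
    options.resizeMode = .exact            // required when using normalizedCropRect
    options.normalizedCropRect = crop

    // The crop already matches the target aspect ratio, so the default content mode is enough.
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: targetPixelSize,
                                          contentMode: .default,
                                          options: options) { image, _ in
        completion(image)
    }
}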
1
0
1.8k
Sep ’20
WidgetKit Crash
On the crash reports, I am getting a lot of crashes with this report for two iOS 14 widgets:

Foundation        _NSFileHandleRaiseOperationException
....
SwiftUI           FileArchiveWriter.AppendByes(_size:)
....
My Widget         closure #1 in MyTimelineProver.getTimeline(in:completion:)

When checked within the project, this happens in:

func getTimeline(in context: Self.Context, completion: @escaping (Timeline<Self.Entry>) -> Void)

I have one timeline that asks to update every 5 minutes.
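For reference, the provider in question has roughly this shape. Only the getTimeline signature and the 5-minute refresh come from the post; the type names and entry contents below are placeholder assumptions, not the actual widget code:

import Foundation
import WidgetKit

struct MyEntry: TimelineEntry {
    let date: Date
}

struct MyTimelineProvider: TimelineProvider {
    func placeholder(in context: Context) -> MyEntry {
        MyEntry(date: Date())
    }

    func getSnapshot(in context: Context, completion: @escaping (MyEntry) -> Void) {
        completion(MyEntry(date: Date()))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<MyEntry>) -> Void) {
        // The crash stack points at closure #1 inside getTimeline(in:completion:).
        let entry = MyEntry(date: Date())
        let next = Date().addingTimeInterval(5 * 60)
        completion(Timeline(entries: [entry], policy: .after(next)))
    }
}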
7
0
3.2k
Oct ’20
AVCaptureSession Pausing Session
With the new camera privacy indicator, there is a new need to pause the session at certain times, e.g. when a full-screen view is presented over the camera preview. This was the only solution that worked for me:

func pauseSession() {
    self.sessionQueue.async {
        if self.session.isRunning {
            self.session.stopRunning()
        }
    }
}

func resumeSession() {
    self.sessionQueue.async {
        if !self.session.isRunning {
            self.session.startRunning()
        }
    }
}

This seems to be an expensive operation that takes time. The issue is that if pause and resume are called close together in time, the whole app freezes for about 10 seconds before becoming responsive again, mostly because the previous operation (whether stopping or starting) hasn't finished yet. Is there a solution to this? The native camera app seems to handle it fine: open it, open the last photo, and you can see the green indicator at the top right go off (after briefly turning orange), meaning the session has paused/stopped. If you swipe down on the photo, the session resumes. If you swipe, let the swipe get canceled, and quickly swipe again, the session pauses and resumes quickly over and over without any issues.
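One mitigation I'm experimenting with (an assumption on my part, not something from Apple's documentation; it reuses the session and sessionQueue names from the code above) is to track only the desired running state and let the serial session queue coalesce redundant transitions, so a rapid pause/resume pair collapses into a no-op:

private var wantsSessionRunning = true

func setSessionRunning(_ shouldRun: Bool) {
    wantsSessionRunning = shouldRun
    sessionQueue.async {
        // By the time this block runs, the desired state may have flipped back;
        // only act if the session's actual state still differs from it.
        // (In production this flag would need proper synchronization, e.g. only
        // being touched on the session queue.)
        let shouldRunNow = self.wantsSessionRunning
        if shouldRunNow && !self.session.isRunning {
            self.session.startRunning()
        } else if !shouldRunNow && self.session.isRunning {
            self.session.stopRunning()
        }
    }
}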
0
1
2.1k
Mar ’21
App Privacy Listing and Crash Reports
Currently, I have an app that does not collect any user data, so its App Privacy listing on the App Store says "Does not collect any data." However, I still receive crash reports and other diagnostics via the Xcode Organizer. I believe users enable this via the system-level prompt "Share analytics, diagnostics, and usage information with Apple." Given the above, do I need to change the listing to say the app explicitly collects crash data?
1
0
1.8k
May ’21
API to use New SF Pro Condensed/Compressed/Expanded
New width variants of SF Pro were introduced with iOS 16, but I do not see a corresponding update to the UIFont API to use them. https://developer.apple.com/documentation/uikit/uifont Another thread on here mentioned that it could be done through UIFontDescriptor, but that does not seem to be the official way. I really wish there were more information on this. I want to use Expanded for my iOS update.
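For reference, the UIFontDescriptor route mentioned above looks roughly like this sketch. The width trait ranges from -1.0 to 1.0, and the 0.2 value is an assumption; whether this actually selects the new Expanded face, rather than synthesizing a width, is exactly what is unclear to me:

import UIKit

// Start from the system font and add a width trait to its descriptor.
let baseFont = UIFont.systemFont(ofSize: 17, weight: .semibold)
var traits = (baseFont.fontDescriptor.fontAttributes[.traits] as? [UIFontDescriptor.TraitKey: Any]) ?? [:]
traits[.width] = 0.2   // positive values widen, negative values condense; exact mapping undocumented
let expandedDescriptor = baseFont.fontDescriptor.addingAttributes([.traits: traits])
let expandedFont = UIFont(descriptor: expandedDescriptor, size: 17)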
1
1
1.7k
Aug ’22
Stop viewSafeAreaInsetsDidChange() being called
I have a child view controller added, and its view gets viewSafeAreaInsetsDidChange() called every time a frame change happens. How do I avoid this? So far I am using these:

self.viewRespectsSystemMinimumLayoutMargins = false
self.view.insetsLayoutMarginsFromSafeArea = false
self.view.preservesSuperviewLayoutMargins = false

However, viewSafeAreaInsetsDidChange() is still being called. Is there a way to stop that?
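In case there is no supported way to suppress the callback itself, one workaround I'm considering (an assumption, with hypothetical names) is to ignore calls where the insets did not actually change:

import UIKit

class ChildViewController: UIViewController {
    private var lastSafeAreaInsets: UIEdgeInsets = .zero

    override func viewSafeAreaInsetsDidChange() {
        super.viewSafeAreaInsetsDidChange()
        // Bail out when the system calls this on a frame change but the insets are unchanged.
        guard view.safeAreaInsets != lastSafeAreaInsets else { return }
        lastSafeAreaInsets = view.safeAreaInsets
        // React to the real inset change here.
    }
}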
3
0
681
Jul ’23
UIImageView preferredImageDynamicRange not working
I am trying to display HDR images (ProRAW) in a UIImageView using preferredImageDynamicRange, as shown in a 2023 WWDC video.

let imageView = UIImageView()

if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)

I pull the image from PHImageManager:

let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true

PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else { return }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})

Issue
The image shows successfully, yet not in HDR: there are no bright specular highlights, as there are when the same (ProRAW) image is viewed in the native camera app. What am I missing here?
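One thing I'm considering trying (an assumption on my part, not confirmed to be the missing piece): requestImage may hand back an SDR-rendered UIImage, so decoding the original asset data with iOS 17's UIImageReader and prefersHighDynamicRange might preserve the HDR content before it reaches the image view:

let dataOptions = PHImageRequestOptions()
dataOptions.deliveryMode = .highQualityFormat
dataOptions.isNetworkAccessAllowed = true

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: dataOptions) { data, _, _, _ in
    guard let data = data else { return }
    if #available(iOS 17.0, *) {
        // Decode with HDR preferred instead of using the UIImage from requestImage.
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        let hdrImage = reader.image(data: data)
        DispatchQueue.main.async {
            self.imageView.image = hdrImage
        }
    }
}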
1
0
898
Sep ’23