Even though some third-party apps out there have figured out how to do burst capture on par with the stock iOS Camera app, there is no official API for it. Currently I call capturePhoto over and over, which is not fast enough. I've seen some hints floating around that the proper way to do this is bracketed capture. Any input on this? Thanks in advance.
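For what it's worth, here is a minimal sketch of the bracketed-capture idea, assuming photoOutput is an already-configured AVCapturePhotoOutput and self is the AVCapturePhotoCaptureDelegate. Note that brackets are capped at photoOutput.maxBracketedCapturePhotoCount, so this still isn't an unbounded burst:

func captureBracket() {
    // One auto-exposure setting per frame in the bracket, all at 0 EV bias.
    let exposures = (0..<photoOutput.maxBracketedCapturePhotoCount).map { _ in
        AVCaptureAutoExposureBracketedStillImageSettings.autoExposureSettings(exposureTargetBias: 0)
    }
    // A rawPixelFormatType of 0 means no RAW capture.
    let settings = AVCapturePhotoBracketSettings(rawPixelFormatType: 0,
                                                 processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
                                                 bracketedSettings: exposures)
    photoOutput.capturePhoto(with: settings, delegate: self)
}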
Previously, I exported screenshots with the Display P3 color profile embedded, and they looked fine on device. I recently switched to Photoshop and exported them in Display P3, and now they look washed out on the App Store when viewed on device. Is there something I am missing here? What is the common practice?
I've been entering keywords separated by commas WITHOUT spaces. What is the recommended method, WITH or WITHOUT spaces? Or is there no difference?
This is regarding image orientation when photos are captured with the device rotation lock on.
self.photoOutput.connection(with: AVMediaType.video)?.videoOrientation = self.stillImageOrientation
I have self.stillImageOrientation listening to the status bar orientation:
UIApplication.shared.statusBarOrientation
However, when the user activates the rotation lock and rotates the device, captured photos get saved with the wrong orientation. I expected this might solve the issue:
UIDevice.current.orientation
But that also stops changing once the rotation lock is on. I noticed the native Camera app still registers the proper orientation even with the rotation lock engaged. Is there a way to achieve this?
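One approach I have seen suggested (my assumption about what the native Camera app does, not anything documented) is to ignore UIKit's orientation entirely and derive it from CoreMotion, since the accelerometer keeps reporting gravity even when the rotation lock freezes UIDevice.current.orientation. A rough sketch:

import CoreMotion

let motionManager = CMMotionManager()

func startOrientationUpdates() {
    guard motionManager.isAccelerometerAvailable else { return }
    motionManager.accelerometerUpdateInterval = 0.2
    motionManager.startAccelerometerUpdates(to: .main) { data, _ in
        guard let a = data?.acceleration else { return }
        if abs(a.y) >= abs(a.x) {
            // Gravity mostly along the device's long axis: portrait family.
            self.stillImageOrientation = a.y < 0 ? .portrait : .portraitUpsideDown
        } else {
            // Note the inversion: device landscapeLeft maps to video landscapeRight.
            self.stillImageOrientation = a.x < 0 ? .landscapeRight : .landscapeLeft
        }
    }
}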
Is using Apple Pencil's double-tap gesture within your app, which is not a drawing app, allowed? For example, to trigger an action within the app? After reading Apple's Human Interface Guidelines, it still seems somewhat ambiguous. Thanks in advance.
After the most recent App Store Connect update, for my iOS app, the left side is asking for a macOS version to be submitted in addition to the iOS version.
How do I remove this option?
With iOS 14 WidgetKit, I am trying to display a widget with a particular photo asset from the library. With a medium-size widget, I want the photo asset to fill the whole area (Aspect Fill), just like the native Photos widget.
struct My_WidgetEntryView: View {
    var entry: Provider.Entry

    var body: some View {
        PhotoView(user: 1)
    }
}

struct PhotoView: View {
    var user: Int
    @ObservedObject private var assetImageModel = AssetImageModel()

    init(user: Int) {
        self.user = user
        self.assetImageModel.loadAsset()
    }

    var body: some View {
        if self.assetImageModel.assetImage != nil {
            Image(uiImage: self.assetImageModel.assetImage!)
                .resizable()
        } else {
            Image(uiImage: UIImage(systemName: "photo")!)
                .resizable()
        }
    }
}

class AssetImageModel: ObservableObject {
    @Published var assetImage: UIImage?

    init() {
        loadAsset()
    }

    func loadAsset() {
        if PHPhotoLibrary.authorizationStatus() == .authorized {
            let allPhotosOptions = PHFetchOptions()
            allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
            let assetsFetchResults = PHAsset.fetchAssets(with: allPhotosOptions)
            if assetsFetchResults.count > 0 {
                let asset = assetsFetchResults.lastObject
                if asset != nil {
                    let options = PHImageRequestOptions()
                    options.isSynchronous = true
                    options.deliveryMode = .highQualityFormat
                    options.resizeMode = .exact
                    options.version = .current
                    PHCachingImageManager().requestImage(for: asset!, targetSize: CGSize(width: 360 * UIScreen.main.scale, height: 170 * UIScreen.main.scale), contentMode: .aspectFill, options: options, resultHandler: { image, _ in
                        // DispatchQueue.main.async {
                        self.assetImage = image
                        // }
                    })
                } else {
                    // Asset was nil
                }
            } else {
                // No assets
            }
        } else {
            // No permission
        }
    }
}
When the above is run, I get a crash with a memory warning stating that the allowed 30 MB memory limit has been exceeded. This is obviously due to the following:
For the medium-size widget, I had to assume a 360 x 170 point size, since there is no way to get the actual view dimensions before the view gets displayed.
The target size then has to be multiplied by the screen scale (the test device is currently 3x), and Aspect Fill has to be the fill mode. Because of this, PHCachingImageManager returns a 360-point-wide by 480-point-tall image for a portrait photo in order to fill the view. The final image ends up being 1080 x 1440 pixels, which I imagine is the reason for the memory pressure.
Question
Is there a way to ask for a properly cropped 360 x 170 point size image? The stock Photos widget seems to manage this.
Things I Tried
I tried normalizedCropRect in PHImageRequestOptions and had no luck.
Lowering the image size by a huge margin works, but the image looks visibly blurred.
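One workaround sketch (my assumption, not a documented fix) is to let PhotoKit hand back its aspect-filled result and then redraw it down to the exact 360 x 170 point size with UIGraphicsImageRenderer before publishing it, so the widget never archives the oversized bitmap. The large intermediate image still exists briefly during the draw, so this may only ease, not eliminate, the memory pressure:

func croppedToWidgetSize(_ image: UIImage, size: CGSize = CGSize(width: 360, height: 170)) -> UIImage {
    let format = UIGraphicsImageRendererFormat()
    format.scale = UIScreen.main.scale
    return UIGraphicsImageRenderer(size: size, format: format).image { _ in
        // Aspect-fill: scale so the image covers the target, then center it.
        let scale = max(size.width / image.size.width, size.height / image.size.height)
        let drawSize = CGSize(width: image.size.width * scale, height: image.size.height * scale)
        let origin = CGPoint(x: (size.width - drawSize.width) / 2,
                             y: (size.height - drawSize.height) / 2)
        image.draw(in: CGRect(origin: origin, size: drawSize))
    }
}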
Is it possible for a widget to be reloaded when the photo library changes?
Outside of widgets, there is PHPhotoLibraryChangeObserver. Is it possible to use something similar to update the timeline from TimelineProvider.getTimeline(in:completion:) whenever there is a change in the photo library?
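A sketch of what I mean, assuming the observer lives in the containing app (a widget extension isn't resident to observe anything) and with "MyWidget" as a placeholder for the widget's kind string:

import Photos
import WidgetKit

final class LibraryObserver: NSObject, PHPhotoLibraryChangeObserver {
    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Ask WidgetKit to re-run getTimeline for this widget kind.
        WidgetCenter.shared.reloadTimelines(ofKind: "MyWidget")
    }
}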
Any idea when the AppleRAW APIs will be available?
In the crash reports, I am seeing a lot of crashes with the following stack trace for two iOS 14 widgets:
Foundation      _NSFileHandleRaiseOperationException
....
SwiftUI         FileArchiveWriter.appendBytes(_:size:)
....
My Widget       closure #1 in MyTimelineProvider.getTimeline(in:completion:)
When checked within the project, this happens in:
func getTimeline(in context: Self.Context, completion: @escaping (Timeline<Self.Entry>) -> Void)
I have one timeline that asks to be updated every 5 minutes.
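For reference, the relevant part looks roughly like this (SimpleEntry is a placeholder for my actual entry type):

func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
    let entry = SimpleEntry(date: Date())
    // Ask WidgetKit to request a new timeline about 5 minutes from now.
    let refreshDate = Date().addingTimeInterval(5 * 60)
    completion(Timeline(entries: [entry], policy: .after(refreshDate)))
}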
With the new camera privacy indicator, there is a new need to pause the session at certain times, e.g., when a full-screen view is presented over the camera preview.
This was the only solution for me:
func pauseSession() {
    self.sessionQueue.async {
        if self.session.isRunning {
            self.session.stopRunning()
        }
    }
}

func resumeSession() {
    self.sessionQueue.async {
        if !self.session.isRunning {
            self.session.startRunning()
        }
    }
}
This seems to be an expensive operation that takes time.
The issue is that if pause and resume are called close together in time, the whole app freezes for about 10 seconds before becoming responsive again, mostly because the previous operation (the stop or the start) hasn't finished yet.
Is there a solution to this?
The native Camera app seems to handle this fine. If you open it and then open the last photo, you can see the green indicator at the top right turn off (after briefly becoming orange), meaning the session has paused/stopped. If you swipe down on the photo, the session resumes. If you swipe, let the swipe get canceled, and quickly swipe again, you can see the session pause and resume rapidly, over and over, without any issues.
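The best mitigation I have found so far (a sketch, not an official recipe) is to coalesce the requests: record the desired state immediately, and let the block on the session queue apply whatever the latest desired state is when it finally runs, so a pause immediately followed by a resume becomes a no-op:

private var wantsSessionRunning = true

func pauseSession() {
    wantsSessionRunning = false
    applySessionState()
}

func resumeSession() {
    wantsSessionRunning = true
    applySessionState()
}

private func applySessionState() {
    // For a real implementation, guard wantsSessionRunning with a lock or
    // only mutate it on sessionQueue.
    sessionQueue.async {
        // By the time this runs, wantsSessionRunning holds the most recent
        // request, so stale stop/start pairs cancel out instead of queueing
        // two expensive transitions back to back.
        if self.wantsSessionRunning, !self.session.isRunning {
            self.session.startRunning()
        } else if !self.wantsSessionRunning, self.session.isRunning {
            self.session.stopRunning()
        }
    }
}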
Currently, I have an app that does not collect any user data, so its App Privacy section on the App Store reads "Does not collect any data."
However, I still receive crash reports and other diagnostics via the Xcode Organizer. I believe users enable this through the system-level prompt "Share analytics, diagnostics, and usage information with Apple."
Given the above, do I need to change the listing to say the app explicitly collects crash data?
New variants of SF Pro were introduced for iOS 16, but I do not see an update to the UIFont API to use them.
https://developer.apple.com/documentation/uikit/uifont
Another thread on here said it can be done through UIFontDescriptor, but that doesn't seem to be the official way.
I really wish there were more information on this. I want to use the Expanded variant for my iOS update.
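A sketch of the UIFontDescriptor route the other thread mentioned, with no guarantee this is the sanctioned way; the width trait is a normalized value in -1.0...1.0, so 1.0 asks for the most expanded variant:

func expandedSystemFont(ofSize size: CGFloat, weight: UIFont.Weight) -> UIFont {
    let base = UIFont.systemFont(ofSize: size, weight: weight)
    let descriptor = base.fontDescriptor.addingAttributes([
        .traits: [UIFontDescriptor.TraitKey.width: 1.0]
    ])
    return UIFont(descriptor: descriptor, size: size)
}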
I have a child view controller whose view gets viewSafeAreaInsetsDidChange() called every time a frame change happens. How do I avoid this?
So far I am using these:
self.viewRespectsSystemMinimumLayoutMargins = false
self.view.insetsLayoutMarginsFromSafeArea = false
self.view.preservesSuperviewLayoutMargins = false
However, viewSafeAreaInsetsDidChange() is still being called. Is there a way to stop that?
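As far as I can tell you cannot stop UIKit from invoking the callback, but the child can at least ignore redundant calls by comparing against the last insets it saw; a sketch of that workaround:

private var lastSafeAreaInsets: UIEdgeInsets = .zero

override func viewSafeAreaInsetsDidChange() {
    super.viewSafeAreaInsetsDidChange()
    // Bail out unless the insets genuinely changed since the last call.
    guard view.safeAreaInsets != lastSafeAreaInsets else { return }
    lastSafeAreaInsets = view.safeAreaInsets
    // React to the real inset change here.
}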
I am trying to display HDR images (ProRAW) in a UIImageView using preferredImageDynamicRange, as shown in a WWDC 2023 video:
self.imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode: there are no bright specular highlights, as there are when the same (ProRAW) image is viewed in the native Camera app.
What am I missing here?
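One thing worth trying (an assumption on my part, since requestImage may be handing back an SDR rendition): request the raw data instead and decode it with iOS 17's UIImageReader, which can opt in to high dynamic range:

let dataOptions = PHImageRequestOptions()
dataOptions.isNetworkAccessAllowed = true

PHImageManager.default().requestImageDataAndOrientation(for: asset, options: dataOptions) { data, _, _, _ in
    guard let data = data else { return }
    if #available(iOS 17.0, *) {
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        DispatchQueue.main.async {
            // Decode with HDR enabled and hand the result to the image view.
            self.imageView.image = reader.image(data: data)
        }
    }
}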