In the WWDC24 session video 'Enhance your UI animations and transitions', Apple shows these new animation methods for UIKit:
switch gesture.state {
case .changed:
    UIView.animate(.interactiveSpring) {
        bead.center = gesture.translation
    }
case .ended:
    UIView.animate(spring) {
        bead.center = endOfBracelet
    }
}
As of iOS 18 Beta 2, I get an error for `UIView.animate(.interactiveSpring)`.
Are these new methods not available yet?
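In the meantime, the spring overload introduced with iOS 17 does compile for me. A minimal sketch (reusing `bead` and `endOfBracelet` from the sample above; the duration and bounce values are arbitrary):

import UIKit

// Fallback: the spring-parameter overload from iOS 17, in case the
// SwiftUI-Animation-based overload is missing from the current beta.
UIView.animate(springDuration: 0.5, bounce: 0.2) {
    bead.center = endOfBracelet
}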
I already have an iOS 17 App Intent that works with a URL:
@available(iOS 16, *)
struct MyAppIntent: AppIntent {
    static let title: LocalizedStringResource = "My App Intent"
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult {
        await UIApplication.shared.open(URL(string: "myapp://myappintent")!)
        return .result()
    }
}
Now, with iOS 18 and Control Widgets, I want to create a Control Widget button that simply opens the app with the same URL. However, UIApplication code is not allowed within extensions. For this, Apple says to use OpenIntent, which is shown here:
Link
Apple Sample Code from the link:
import AppIntents

struct LaunchAppIntent: OpenIntent {
    static var title: LocalizedStringResource = "Launch App"

    @Parameter(title: "Target")
    var target: LaunchAppEnum
}

enum LaunchAppEnum: String, AppEnum {
    case timer
    case history

    static var typeDisplayRepresentation = TypeDisplayRepresentation("Productivity Timer's app screens")
    static var caseDisplayRepresentations = [
        LaunchAppEnum.timer: DisplayRepresentation("Timer"),
        LaunchAppEnum.history: DisplayRepresentation("History")
    ]
}
The WWDC session video about this does not cover this particular method in detail, and the sample code is a bit confusing.
So how can I alter this code to just open the app with a URL? (The direction I am currently experimenting with is sketched below.)
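A sketch of that direction, assuming iOS 18's OpenURLIntent works from a Control Widget the way I expect (the intent name and URL are placeholders from my app; I have not verified this):

import AppIntents

@available(iOS 18.0, *)
struct OpenMyAppIntent: AppIntent {
    static let title: LocalizedStringResource = "Open My App"
    static let openAppWhenRun: Bool = true

    @MainActor
    func perform() async throws -> some IntentResult & OpensIntent {
        // OpenURLIntent hands the URL to the system to open, which avoids
        // calling UIApplication.shared.open inside the extension.
        return .result(opensIntent: OpenURLIntent(URL(string: "myapp://myappintent")!))
    }
}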
With iOS 18, when you tint the Home Screen, widgets also need to pick up the tint. How do you detect this within SwiftUI?
I didn't see any WWDC24 sessions addressing this.
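The closest thing I have found is WidgetKit's widgetRenderingMode environment value. A sketch, assuming a tinted Home Screen reports itself as .accented (I have not verified this on an iOS 18 device):

import SwiftUI
import WidgetKit

struct MyWidgetView: View {
    // .fullColor normally, .accented when tinted, .vibrant on e.g. the Lock Screen.
    @Environment(\.widgetRenderingMode) private var renderingMode

    var body: some View {
        if renderingMode == .accented {
            Text("Tinted")      // adapt colors/assets for the tinted look
        } else {
            Text("Full color")
        }
    }
}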
After watching the session video "Build a great Lock Screen camera capture experience", I was unclear about the UI.
So do developers need to provide a whole new UI in the extension? Can the main UI not be repurposed?
The WWDC23 Platform State of the Union mentioned (at 0:30:15) that triggering the camera shutter with the volume buttons is coming later this year.
Would anyone know when this will be available?
With AVFoundation, how do you set up the new 24MP capture on the new iPhone 15 models?
I strongly believed it would appear in the videoDevice.activeFormat.supportedMaxPhotoDimensions array, but it does not.
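For context, this is the kind of setup I am attempting (a sketch; videoDevice and photoOutput stand in for an already-configured capture session, and 24,000,000 pixels is my own threshold for "about 24MP"):

import AVFoundation

func configure24MPCapture(videoDevice: AVCaptureDevice, photoOutput: AVCapturePhotoOutput) {
    // Log every max photo dimension the active format claims to support.
    for dims in videoDevice.activeFormat.supportedMaxPhotoDimensions {
        print("Supported: \(dims.width) x \(dims.height)")
    }

    // Opt into the first dimension of roughly 24MP or more, if any.
    if let dims = videoDevice.activeFormat.supportedMaxPhotoDimensions
        .first(where: { $0.width * $0.height >= 24_000_000 }) {
        photoOutput.maxPhotoDimensions = dims
    }
}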
I am trying to display HDR images (ProRAW) within a UIImageView using preferredImageDynamicRange, as shown in a 2023 WWDC video:
let imageView = UIImageView()
if #available(iOS 17.0, *) {
    self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
}
self.imageView.clipsToBounds = true
self.imageView.isMultipleTouchEnabled = true
self.imageView.contentMode = .scaleAspectFit
self.photoScrollView.addSubview(self.imageView)
I pull the image from PHImageManager:
let options = PHImageRequestOptions()
options.deliveryMode = .highQualityFormat
options.isNetworkAccessAllowed = true
PHImageManager.default().requestImage(for: asset, targetSize: self.targetSize(), contentMode: .aspectFit, options: options, resultHandler: { image, info in
    guard let image = image else {
        return
    }
    DispatchQueue.main.async {
        self.imageView.image = image
        if #available(iOS 17.0, *) {
            self.imageView.preferredImageDynamicRange = UIImage.DynamicRange.high
        }
    }
})
Issue
The image shows successfully, yet not in HDR mode (no bright specular highlights, as seen when the same ProRAW image is viewed in the native Camera app).
What am I missing here?
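One direction I am considering (a sketch, assuming the HDR information survives only in the original image bytes and is flattened by the UIImage that requestImage returns): fetch the data instead and decode it with iOS 17's UIImageReader, which can opt into high dynamic range.

import Photos
import UIKit

@available(iOS 17.0, *)
func loadHDRImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.deliveryMode = .highQualityFormat
    options.isNetworkAccessAllowed = true

    // Request the original data so HDR information is not stripped by an
    // intermediate SDR decode.
    PHImageManager.default().requestImageDataAndOrientation(for: asset, options: options) { data, _, _, _ in
        guard let data else {
            completion(nil)
            return
        }
        var config = UIImageReader.Configuration()
        config.prefersHighDynamicRange = true
        let reader = UIImageReader(configuration: config)
        completion(reader.image(data: data))
    }
}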
I have a child view controller whose view gets viewSafeAreaInsetsDidChange() called every time a frame change happens. How do I avoid this?
So far I am using these:
self.viewRespectsSystemMinimumLayoutMargins = false
self.view.insetsLayoutMarginsFromSafeArea = false
self.view.preservesSuperviewLayoutMargins = false
However, viewSafeAreaInsetsDidChange() is still being called. Is there a way to stop that?
New variants of SF Pro were introduced with iOS 16, but I do not see an update to the UIFont API to use them.
https://developer.apple.com/documentation/uikit/uifont
Another thread on here suggested that it could be done through UIFontDescriptor, but that does not seem to be the official way.
I really wish there were more information on this. I want to use the Expanded width for my iOS update.
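For reference, the UIFontDescriptor route I am trying looks like this (a sketch; the 0.2 width value is an arbitrary guess on the trait's -1.0 to 1.0 scale, and I have not confirmed the system font honors it):

import UIKit

// Take the system font and push its width trait toward Expanded.
let baseFont = UIFont.systemFont(ofSize: 17, weight: .semibold)
let descriptor = baseFont.fontDescriptor.addingAttributes([
    .traits: [UIFontDescriptor.TraitKey.width: 0.2]
])
let expandedFont = UIFont(descriptor: descriptor, size: 17)

(The iOS 16 SDK also appears to include UIFont.systemFont(ofSize:weight:width:) with a UIFont.Width.expanded value, though I have not found it documented.)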
Currently, I have an app that does not collect any user data, so its App Privacy listing on the App Store reads "Does not collect any data."
However, I still receive app crashes and other reports via the Xcode Organizer. I believe the user enables this via the system-level prompt "Share analytics, diagnostics, and usage information with Apple".
Given the above, do I need to change the listing to say that the app explicitly collects crash data?
With the new camera privacy indicator, there is a new need to pause the session at certain times, i.e., when a full-screen view is presented over the camera preview.
This was the only solution for me:
func pauseSession() {
    self.sessionQueue.async {
        if self.session.isRunning {
            self.session.stopRunning()
        }
    }
}

func resumeSession() {
    self.sessionQueue.async {
        if !self.session.isRunning {
            self.session.startRunning()
        }
    }
}
This seems to be an expensive operation that takes time.
The issue is that if pause and resume are called close together in time, the whole app freezes for about 10 seconds before becoming responsive again, presumably because the previous operation (stop or start) has not finished yet.
Is there a solution to this?
The native Camera app seems to handle this fine. If you open it and then open the last photo, you can see the green indicator at the top right go off (after briefly turning orange), meaning the session has paused/stopped. If you swipe down on the photo, the session resumes. If you swipe, let the gesture cancel, and quickly swipe again, the session pauses and resumes rapidly over and over without any issue.
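One mitigation I have been sketching (unverified; it assumes pause/resume are only ever requested from the main thread, and a production version should protect the flag with a lock): record the latest desired state so queued blocks collapse into a single transition.

private var desiredRunning = false  // latest requested state, written on the main thread

func setSessionRunning(_ running: Bool) {
    desiredRunning = running
    sessionQueue.async {
        // Each queued block re-reads the latest desired state, so a rapid
        // pause-then-resume pair becomes a no-op instead of two expensive
        // stopRunning()/startRunning() round trips.
        guard self.session.isRunning != self.desiredRunning else { return }
        if self.desiredRunning {
            self.session.startRunning()
        } else {
            self.session.stopRunning()
        }
    }
}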
On the crash reports, I am getting a lot of crashes with the following crash report for two iOS 14 widgets:
Foundation		_NSFileHandleRaiseOperationException
....
SwiftUI			FileArchiveWriter.appendBytes(_:size:)
....
My Widget		closure #1 in MyTimelineProvider.getTimeline(in:completion:)
When checked within the project, this happens in:
func getTimeline(in context: Self.Context, completion: @escaping (Timeline<Self.Entry>) -> Void)
I have one timeline that asks to update every 5 minutes.
Any idea when the Apple ProRAW APIs will be available?
Is it possible for a widget to be reloaded when the photo library changes?
Outside of Widgets, there is
PHPhotoLibraryChangeObserver
Is it possible to use something similar, so that TimelineProvider.getTimeline(in:completion:) runs whenever there is a change in the photo library?
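The workaround I am considering looks like this (a sketch; it only helps while the containing app is running, and "MyWidget" is a placeholder widget kind): observe the library in the app and ask WidgetKit to reload the timeline.

import Photos
import WidgetKit

final class LibraryWatcher: NSObject, PHPhotoLibraryChangeObserver {
    override init() {
        super.init()
        PHPhotoLibrary.shared().register(self)
    }

    func photoLibraryDidChange(_ changeInstance: PHChange) {
        // Triggers a fresh getTimeline(in:completion:) pass for the widget.
        WidgetCenter.shared.reloadTimelines(ofKind: "MyWidget")
    }
}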
With iOS 14 WidgetKit, I am trying to display a widget with a 'particular' photo asset from the library. With a medium-size widget, I want the photo asset to fill the whole area (Aspect Fill), just like the native Photos widget.
struct My_WidgetEntryView: View {
    var entry: Provider.Entry

    var body: some View {
        PhotoView(user: 1)
    }
}

struct PhotoView: View {
    var user: Int
    @ObservedObject private var assetImageModel = AssetImageModel()

    init(user: Int) {
        self.user = user
        self.assetImageModel.loadAsset()
    }

    var body: some View {
        if self.assetImageModel.assetImage != nil {
            Image(uiImage: self.assetImageModel.assetImage!)
                .resizable()
        } else {
            Image(uiImage: UIImage(systemName: "photo")!)
                .resizable()
        }
    }
}

class AssetImageModel: ObservableObject {
    @Published var assetImage: UIImage?

    init() {
        loadAsset()
    }

    func loadAsset() {
        if PHPhotoLibrary.authorizationStatus() == .authorized {
            let allPhotosOptions = PHFetchOptions()
            allPhotosOptions.sortDescriptors = [NSSortDescriptor(key: "creationDate", ascending: true)]
            let assetsFetchResults = PHAsset.fetchAssets(with: allPhotosOptions)
            if assetsFetchResults.count > 0 {
                let asset = assetsFetchResults.lastObject
                if asset != nil {
                    let options = PHImageRequestOptions()
                    options.isSynchronous = true
                    options.deliveryMode = .highQualityFormat
                    options.resizeMode = .exact
                    options.version = .current
                    PHCachingImageManager().requestImage(for: asset!, targetSize: CGSize(width: 360 * UIScreen.main.scale, height: 170 * UIScreen.main.scale), contentMode: .aspectFill, options: options, resultHandler: { image, _ in
                        //DispatchQueue.main.async {
                        self.assetImage = image
                        //}
                    })
                } else {
                    // Asset was nil
                }
            } else {
                // No assets
            }
        } else {
            // No permission
        }
    }
}
When the above is run, I get a crash with a memory warning stating that the allowed 30 MB memory limit has been exceeded. This is obviously due to the following:
For the medium-size widget, I had to go with a 360 x 170 point size since there was no way to get the actual view dimensions before the view gets displayed.
The actual target size then has to be multiplied by the screen scale; the test device at the moment is 3x. Aspect Fill has to be the fill type. Because of this, PHCachingImageManager is returning a 360-point width with a 480-point height for a portrait photo in order to fill the view. The final dimensions of the image end up being 1080 x 1440 pixels. I imagine this is the reason for the memory pressure.
Question
Is there a way to ask for a properly cropped 360 x 170 point image? The stock Photos widget seems to manage this.
Things I tried
normalizedCropRect in PHImageRequestOptions, with no luck.
Lowering the image size by a huge margin works, but the image looks visually blurry.
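The direction I am exploring now (a sketch; 360 x 170 points is still my guess at the medium widget's size): let PHImageManager return its aspect-filled image, then redraw it into the exact point size so only a widget-sized bitmap stays in memory.

import UIKit

// Redraws `image` aspect-filled into `targetSize` (in points) so the
// resulting bitmap is no larger than the widget actually needs.
func cropped(_ image: UIImage, to targetSize: CGSize) -> UIImage {
    let scale = max(targetSize.width / image.size.width,
                    targetSize.height / image.size.height)
    let scaledSize = CGSize(width: image.size.width * scale,
                            height: image.size.height * scale)
    // Center the overflow so the crop matches Aspect Fill.
    let origin = CGPoint(x: (targetSize.width - scaledSize.width) / 2,
                         y: (targetSize.height - scaledSize.height) / 2)
    return UIGraphicsImageRenderer(size: targetSize).image { _ in
        image.draw(in: CGRect(origin: origin, size: scaledSize))
    }
}

// Usage inside the resultHandler:
// self.assetImage = cropped(image, to: CGSize(width: 360, height: 170))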