While investigating a capture session crash, I haven't been able to pin down what causes the occasional system pressure interruptions, other than that they happen on older iOS devices. Does Low Power Mode have a meaningful impact on whether these interruptions occur?
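For reference, here's roughly how I'm instrumenting the session to correlate the interruptions with pressure and Low Power Mode (a minimal sketch; `session` and `camera` are placeholders for the app's real capture objects):
import AVFoundation

/// A minimal diagnostic sketch; `session` and `camera` stand in for the
/// app's existing AVCaptureSession and AVCaptureDevice.
final class PressureDiagnostics {
    private var pressureObservation: NSKeyValueObservation?

    func start(session: AVCaptureSession, camera: AVCaptureDevice) {
        // Log the reason and the Low Power Mode state whenever the session is interrupted.
        NotificationCenter.default.addObserver(
            forName: AVCaptureSession.wasInterruptedNotification,
            object: session,
            queue: .main
        ) { note in
            if let value = note.userInfo?[AVCaptureSessionInterruptionReasonKey] as? Int,
               let reason = AVCaptureSession.InterruptionReason(rawValue: value) {
                print("Interrupted: \(reason), lowPower: \(ProcessInfo.processInfo.isLowPowerModeEnabled)")
            }
        }
        // Track the device's system pressure level and contributing factors over time.
        pressureObservation = camera.observe(\.systemPressureState, options: [.new]) { device, _ in
            let state = device.systemPressureState
            print("System pressure: \(state.level.rawValue), factors: \(state.factors.rawValue)")
        }
    }
}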
It's unclear how to use this example given by Apple.
class MyViewController: UIViewController {
    override var preferredContainerBackgroundStyle: UIContainerBackgroundStyle {
        return .glass
    }
}
If I try overriding the property, I get an error saying it's only available on visionOS. However, a conditional compilation check for visionOS is only true when the app is built fully native for visionOS (no storyboards). Does this mean these UIKit enhancements are only supported for UIKit embedded in a fully native visionOS target?
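For reference, wrapping the override is the only way I've found to keep the shared UIKit code compiling; the check is only true when the target is built against the visionOS SDK, which would explain why a compatibility ("Designed for iPad") build never hits it:
import UIKit

class MyViewController: UIViewController {
    #if os(visionOS)
    // Only compiled when building against the visionOS SDK; a "Designed for iPad"
    // app runs its iOS binary, so this override doesn't exist there.
    override var preferredContainerBackgroundStyle: UIContainerBackgroundStyle {
        return .glass
    }
    #endif
}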
This has been verified as a framework bug (it occurs on Mac Catalyst but not on iOS or iPadOS), and the culprit seems to be AVVideoCompositionCoreAnimationTool:
/// Exports a video with the target animating.
func exportVideo() {
    let destinationURL = createExportFileURL(from: Date())
    guard let videoURL = Bundle.main.url(forResource: "black_video", withExtension: "mp4") else {
        delegate?.exporterDidFailExporting(exporter: self)
        print("Can't find video")
        return
    }
    // Initialize the video asset
    let asset = AVURLAsset(url: videoURL, options: [AVURLAssetPreferPreciseDurationAndTimingKey: true])
    guard let assetVideoTrack: AVAssetTrack = asset.tracks(withMediaType: AVMediaType.video).first,
          let assetAudioTrack: AVAssetTrack = asset.tracks(withMediaType: AVMediaType.audio).first else { return }
    let composition = AVMutableComposition()
    guard let videoCompTrack = composition.addMutableTrack(withMediaType: AVMediaType.video, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)),
          let audioCompTrack = composition.addMutableTrack(withMediaType: AVMediaType.audio, preferredTrackID: Int32(kCMPersistentTrackID_Invalid)) else { return }
    videoCompTrack.preferredTransform = assetVideoTrack.preferredTransform
    // Get the duration
    let videoDuration = asset.duration.seconds
    // Get the video rect
    let videoSize = assetVideoTrack.naturalSize.applying(assetVideoTrack.preferredTransform)
    let videoRect = CGRect(origin: .zero, size: videoSize)
    // Initialize the target layers and animations
    animationLayers = TargetView.initTargetViewAndAnimations(atPoint: CGPoint(x: videoRect.midX, y: videoRect.midY), atSecondsIntoVideo: 2, videoRect: videoRect)
    // Set the playback speed
    let duration = CMTime(seconds: videoDuration,
                          preferredTimescale: CMTimeScale(600))
    let appliedRange = CMTimeRange(start: .zero, end: duration)
    videoCompTrack.scaleTimeRange(appliedRange, toDuration: duration)
    audioCompTrack.scaleTimeRange(appliedRange, toDuration: duration)
    // Create the video layer.
    let videolayer = CALayer()
    videolayer.frame = CGRect(origin: .zero, size: videoSize)
    // Create the parent layer.
    let parentlayer = CALayer()
    parentlayer.frame = CGRect(origin: .zero, size: videoSize)
    parentlayer.addSublayer(videolayer)
    let times = timesForEvent(startTime: 0.1, endTime: duration.seconds - 0.01)
    let timeRangeForCurrentSlice = times.timeRange
    // Insert the relevant video track segment
    do {
        try videoCompTrack.insertTimeRange(timeRangeForCurrentSlice, of: assetVideoTrack, at: .zero)
        try audioCompTrack.insertTimeRange(timeRangeForCurrentSlice, of: assetAudioTrack, at: .zero)
    }
    catch let compError {
        print("TrimVideo: error during composition: \(compError)")
        delegate?.exporterDidFailExporting(exporter: self)
        return
    }
    // Add all the non-nil animation layers to be exported.
    for layer in animationLayers.compactMap({ $0 }) {
        parentlayer.addSublayer(layer)
    }
    // Configure the layer composition.
    let layerComposition = AVMutableVideoComposition()
    layerComposition.frameDuration = CMTimeMake(value: 1, timescale: 30)
    layerComposition.renderSize = videoSize
    layerComposition.animationTool = AVVideoCompositionCoreAnimationTool(
        postProcessingAsVideoLayer: videolayer,
        in: parentlayer)
    let instructions = initVideoCompositionInstructions(
        videoCompositionTrack: videoCompTrack, assetVideoTrack: assetVideoTrack)
    layerComposition.instructions = instructions
    // Creates the export session and exports the video asynchronously.
    guard let exportSession = initExportSession(
        composition: composition,
        destinationURL: destinationURL,
        layerComposition: layerComposition) else {
        delegate?.exporterDidFailExporting(exporter: self)
        return
    }
    // Execute the exporting
    exportSession.exportAsynchronously(completionHandler: {
        if let error = exportSession.error {
            print("Export error: \(error), \(error.localizedDescription)")
        }
        self.delegate?.exporterDidFinishExporting(exporter: self, with: destinationURL)
    })
}
I'm not sure how to implement a custom compositor that performs the same animations as this reproducible case:
class AnimationCreator: NSObject {

    // MARK: - Target Animations

    /// Creates the target animations.
    static func addAnimationsToTargetView(_ targetView: TargetView, startTime: Double) {
        // Add the appearance animation
        AnimationCreator.addAppearanceAnimation(on: targetView, defaultBeginTime: AVCoreAnimationBeginTimeAtZero, startTime: startTime)
        // Add the pulse animation.
        AnimationCreator.addTargetPulseAnimation(on: targetView, defaultBeginTime: AVCoreAnimationBeginTimeAtZero, startTime: startTime)
    }

    /// Adds the appearance animation to the target
    private static func addAppearanceAnimation(on targetView: TargetView, defaultBeginTime: Double = 0, startTime: Double = 0) {
        // Starts the target transparent and then turns it opaque at the specified time
        targetView.targetImageView.layer.opacity = 0
        let appear = CABasicAnimation(keyPath: "opacity")
        appear.duration = .greatestFiniteMagnitude // stay on screen forever
        appear.fromValue = 1.0 // Opaque
        appear.toValue = 1.0 // Opaque
        appear.beginTime = defaultBeginTime + startTime
        targetView.targetImageView.layer.add(appear, forKey: "appear")
    }

    /// Adds a pulsing animation to the target.
    private static func addTargetPulseAnimation(on targetView: TargetView, defaultBeginTime: Double = 0, startTime: Double = 0) {
        let targetPulse = CABasicAnimation(keyPath: "transform.scale")
        targetPulse.fromValue = 1 // Regular size
        targetPulse.toValue = 1.1 // Slightly larger size
        targetPulse.duration = 0.4
        targetPulse.beginTime = defaultBeginTime + startTime
        targetPulse.autoreverses = true
        targetPulse.repeatCount = .greatestFiniteMagnitude
        targetView.targetImageView.layer.add(targetPulse, forKey: "pulse_animation")
    }
}
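For reference, my understanding of the general shape of a custom compositor is a pass-through AVVideoCompositing implementation along these lines; PulsingTargetCompositor and the red-square overlay are placeholders rather than the real TargetView drawing, and it assumes the video composition sets customVideoCompositorClass instead of an animationTool:
import AVFoundation
import CoreImage

/// Pass-through compositor that redraws a pulsing overlay per frame instead of
/// relying on AVVideoCompositionCoreAnimationTool. A rough sketch only.
final class PulsingTargetCompositor: NSObject, AVVideoCompositing {
    private let ciContext = CIContext()

    let sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    let requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {
        // Nothing cached per render context in this sketch.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        guard let trackID = request.sourceTrackIDs.first?.int32Value,
              let sourceBuffer = request.sourceFrame(byTrackID: trackID),
              let outputBuffer = request.renderContext.newPixelBuffer() else {
            request.finish(with: NSError(domain: "PulsingTargetCompositor", code: -1, userInfo: nil))
            return
        }

        let frame = CIImage(cvPixelBuffer: sourceBuffer)
        let extent = frame.extent

        // Mirror the CABasicAnimation: the scale oscillates 1.0 -> 1.1 -> 1.0 every 0.8 s.
        let t = request.compositionTime.seconds
        let scale = 1.0 + 0.1 * abs(sin(t * .pi / 0.8))

        // Placeholder "target": a red square centered in the frame, scaled about its center.
        let side = min(extent.width, extent.height) * 0.2
        let targetRect = CGRect(x: extent.midX - side / 2, y: extent.midY - side / 2,
                                width: side, height: side)
        let overlay = CIImage(color: .red)
            .cropped(to: targetRect)
            .transformed(by: CGAffineTransform(translationX: extent.midX, y: extent.midY)
                .scaledBy(x: scale, y: scale)
                .translatedBy(x: -extent.midX, y: -extent.midY))

        // Composite the overlay onto the source frame and hand the result back.
        ciContext.render(overlay.composited(over: frame), to: outputBuffer)
        request.finish(withComposedVideoFrame: outputBuffer)
    }
}
The open question is how to reproduce the appearance timing and the actual TargetView content per frame rather than this placeholder.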
Using a modified version of the following example
var body: some View {
    NavigationSplitView {
        BackyardList(isSubscribed: isSubscribed, backyardLimit: passStatus.backyardLimit, onOfferSelection: showSubscriptionStore)
            .navigationTitle("Backyard Birds")
            .navigationDestination(for: Backyard.ID.self) { backyardID in
                if let backyard = backyards.first(where: { $0.id == backyardID }) {
                    BackyardTabView(backyard: backyard)
                }
            }
    } detail: {
        ContentUnavailableView("Select a Backyard", systemImage: "bird", description: Text("Pick something from the list."))
    }
    .sheet(isPresented: $showingSubscriptionStore) {
        SubscriptionStoreView(groupID: groupID)
    }
    .onInAppPurchaseCompletion { _, purchaseResult in
        guard case .success(let verificationResult) = purchaseResult,
              case .success(_) = verificationResult else {
            return
        }
        showingSubscriptionStore = false
    }
}
(from Apple's sample code demonstrating in-app purchases), I'm unable to complete a sandbox purchase on Apple Watch. I get this error in the UI:
Unable to Purchase App
Sign in with your Apple ID from the Apple Watch app on your iPhone
and printing purchaseResult outputs failure(StoreKit.StoreKitError.unknown). An Apple ID is signed into Settings on iOS as well as in the Apple Watch app. This occurs whether or not a separate sandbox Apple ID is signed in under Settings > App Store. The subscription options UI appears as expected before I attempt to purchase one.
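For logging purposes, the completion handler can be expanded into an exhaustive switch so the sandbox output shows exactly which branch fails; a sketch using the same showingSubscriptionStore state as above:
.onInAppPurchaseCompletion { _, purchaseResult in
    switch purchaseResult {
    case .success(.success(let verification)):
        // A transaction came back (verified or unverified); dismiss the sheet as before.
        print("Purchase result: \(verification)")
        showingSubscriptionStore = false
    case .success(let otherResult):
        // .pending, .userCancelled, or a future case.
        print("Purchase didn't complete: \(otherResult)")
    case .failure(let error):
        // On watchOS this is where StoreKit.StoreKitError.unknown shows up.
        print("Purchase failed: \(error)")
    }
}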
I'm following Displaying live data with Live Activities and adding support for the Dynamic Island in an app with an existing widget extension. However, I'm unable to get it to build
import SwiftUI
import WidgetKit

@main
struct PizzaDeliveryActivityWidget: Widget {
    var body: some WidgetConfiguration {
        ActivityConfiguration(for: PizzaDeliveryAttributes.self) { context in
            // Create the presentation that appears on the Lock Screen and as a
            // banner on the Home Screen of devices that don't support the
            // Dynamic Island.
            // ...
        } dynamicIsland: { context in
            // Create the presentations that appear in the Dynamic Island.
            // ...
        }
    }
}
as I get the errors
Generic parameter 'Expanded' could not be inferred
and
Result builder 'DynamicIslandExpandedContentBuilder' does not implement any 'buildBlock' or a combination of 'buildPartialBlock(first:)' and 'buildPartialBlock(accumulated:next:)' with sufficient availability for this call site
This happens even if I supply a dynamicIsland closure of
DynamicIsland {
    EmptyView()
} compactLeading: {
    EmptyView()
} compactTrailing: {
    EmptyView()
} minimal: {
    EmptyView()
}
What am I missing? My project's minimum deployment target is iOS 16.1.
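For completeness, here's the shape I understand the expanded closure is supposed to take, with DynamicIslandExpandedRegion values rather than bare views (placeholder content only):
DynamicIsland {
    // The expanded closure uses DynamicIslandExpandedContentBuilder, which expects
    // DynamicIslandExpandedRegion values rather than plain views.
    DynamicIslandExpandedRegion(.leading) {
        Text("Leading")
    }
    DynamicIslandExpandedRegion(.trailing) {
        Text("Trailing")
    }
    DynamicIslandExpandedRegion(.bottom) {
        Text("Bottom")
    }
} compactLeading: {
    Text("L")
} compactTrailing: {
    Text("T")
} minimal: {
    Text("M")
}
Given that the second error mentions availability, I also wonder whether the widget extension target's own deployment target (not just the project setting) needs to be iOS 16.1.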
I work on an Apple Watch app that can preview what the iPhone camera sees, as well as control it. I had one report of degraded video quality on watchOS, in both resolution and frame rate. The only time I've been able to reproduce this is with the "poor network quality" preset in the Network Link Conditioner. Does this mean that the quality of the user's Wi-Fi connection is affecting their WatchConnectivity transfers?