I used a package that contains an XCFramework in my widget target. It works fine on iOS and in the macOS Widget Simulator.
But when I open the widget gallery on macOS, I can't find my widget.
I tried running the widget binary directly from /Contents/PlugIns/WidgetTestExtension.appex/Contents/MacOS/WidgetTestExtension, and it prints this error: dyld[4767]: Library not loaded: @rpath/myframework.framework/Versions/A/myframework
In Live Activities, we've seen many beautiful animations powered by .numericText(), like Text(time, style: .timer), or even Text with .contentTransition(.numericText()) applied.
But in a normal SwiftUI view, these animations seem to be gone. Instead, we get a blinking or fade-in/out result.
Is that expected?
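For reference, here is roughly what I'm doing in a normal view (a minimal sketch; the countdown state is just a placeholder):
import SwiftUI

struct CountdownView: View {
    @State private var remaining = 60

    var body: some View {
        VStack {
            // In a Live Activity this kind of text rolls its digits smoothly;
            // here I only see a blink / fade when `remaining` changes.
            Text("\(remaining)")
                .font(.system(size: 48, weight: .bold, design: .rounded))
                .contentTransition(.numericText())

            Button("Count down") {
                withAnimation {
                    remaining -= 1
                }
            }
        }
    }
}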
Screenshots
In a Live Activity:
In a normal SwiftUI view:
I'm trying to copy the colorful confetti effect from iMessage using SwiftUI Canvas, and I'm wondering how to apply a 3D transformation to each particle.
I tried adding a projectionTransform to apply a CATransform3D, but it rotates the entire canvas rather than an individual particle, which is not the effect I want.
Currently, I use a basic ForEach(particles.indices, id: \.self) loop to create each particle and apply the transformation with .rotation3DEffect, but that may cause performance issues (so I also tried .drawingGroup()).
Is there any way to apply a 3D transformation to an individual particle inside a Canvas? (One idea I'm considering is sketched after the code below.)
My code (using ForEach loop):
GeometryReader { proxy in
    let size = proxy.size
    TimelineView(.animation) { timeline in
        let _: () = {
            let now = timeline.date.timeIntervalSinceReferenceDate
            model.update(at: now)
        }()
        ZStack {
            ForEach(model.particles.indices, id: \.self) { index in
                let particle = model.particles[index]
                particle.shape
                    .fill(particle.color)
                    .rotation3DEffect(.degrees(particle.degrees), axis: (x: particle.x, y: particle.y, z: particle.z))
                    .frame(width: particle.frame.width, height: particle.frame.height)
                    .position(particle.frame.origin)
                    .tag(index)
            }
        }
        .frame(width: size.width, height: size.height)
        .drawingGroup()
    }
    .contentShape(Rectangle())
    .gesture(
        DragGesture(minimumDistance: 0)
            .onEnded { _ in model.loadEffect(in: size) }
    )
    .task { model.loadEffect(in: size) }
}
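One idea I'm considering, in case it helps the discussion: approximate the 3D tumble per particle inside Canvas itself by flattening the rotation into a 2D affine transform (a true 2D rotation for the z axis, and cosine foreshortening for the x/y axes). This is only a sketch with a hypothetical particle type, not my real model:
import SwiftUI

// Hypothetical particle type for this sketch only.
struct ConfettiParticle {
    var center: CGPoint
    var size: CGSize
    var color: Color
    var xAngle: CGFloat   // rotation about the x axis, in radians
    var yAngle: CGFloat   // rotation about the y axis, in radians
    var zAngle: CGFloat   // rotation about the z axis, in radians
}

struct ConfettiCanvas: View {
    var particles: [ConfettiParticle]

    var body: some View {
        Canvas { context, _ in
            for particle in particles {
                // GraphicsContext is a value type, so this copy carries an
                // independent transform that only affects this one particle.
                var local = context
                local.translateBy(x: particle.center.x, y: particle.center.y)
                // The z rotation stays a real 2D rotation; x/y rotations become
                // foreshortening (a scale by the cosine of the angle).
                local.concatenate(
                    CGAffineTransform.identity
                        .rotated(by: particle.zAngle)
                        .scaledBy(x: cos(particle.yAngle), y: cos(particle.xAngle))
                )
                let rect = CGRect(x: -particle.size.width / 2,
                                  y: -particle.size.height / 2,
                                  width: particle.size.width,
                                  height: particle.size.height)
                local.fill(Path(rect), with: .color(particle.color))
            }
        }
    }
}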
I created a function that adds my course events to the Calendar app using EventKit.
After learning about Swift concurrency, I want to update my code to make the process faster, namely by using a detached task or a TaskGroup to add these events.
Sequential version (the whole loop runs inside a single detached task, no task group):
func export_test() {
    Task.detached {
        for i in 0...15 {
            print("Task \(i): Start")
            let courseEvent = EKEvent(eventStore: eventStore)
            courseEvent.title = "TEST"
            courseEvent.location = "TEST LOC"
            courseEvent.startDate = .now
            courseEvent.endDate = .now.addingTimeInterval(3600)
            courseEvent.calendar = eventStore.defaultCalendarForNewEvents
            courseEvent.addRecurrenceRule(EKRecurrenceRule(recurrenceWith: .daily, interval: 1, end: nil))
            do {
                try eventStore.save(courseEvent, span: .futureEvents)
            } catch { print(error.localizedDescription) }
            print("Task \(i): Finished")
        }
    }
}
Doing the same thing using a TaskGroup:
func export_test() {
    Task.detached {
        await withTaskGroup(of: Void.self) { group in
            for i in 0...15 {
                group.addTask {
                    print("Task \(i): Start")
                    let courseEvent = EKEvent(eventStore: eventStore)
                    courseEvent.title = "TEST"
                    courseEvent.location = "TEST LOC"
                    courseEvent.startDate = .now
                    courseEvent.endDate = .now.addingTimeInterval(3600)
                    courseEvent.calendar = eventStore.defaultCalendarForNewEvents
                    courseEvent.addRecurrenceRule(EKRecurrenceRule(recurrenceWith: .daily, interval: 1, end: nil))
                    do {
                        try eventStore.save(courseEvent, span: .futureEvents)
                    } catch { print(error.localizedDescription) }
                    print("Task \(i): Finished")
                }
            }
        }
    }
}
The output of the TaskGroup version:
Task 0: Start
Task 1: Start
Task 2: Start
Task 4: Start
Task 3: Start
Task 5: Start
Task 6: Start
Task 7: Start
Task 0: Finished
Task 8: Start
Task 1: Finished
Task 9: Start
Sometimes only a few of the tasks finish and the others don't, or never even start (I created 16 tasks but only 9 printed in this example). Other times, all of the events are added.
From my point of view, I have created 16 child tasks in the TaskGroup.
Each child task adds one event to the calendar. I thought this would let me take full advantage of multi-core performance (maybe it actually doesn't 🙃).
If I put the for loop inside the group.addTask closure, it always produces the expected result, but then there is only a single loop, so the TaskGroup is no longer needed (a rough sketch of that fallback is below).
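For reference, the sequential fallback I'd write looks roughly like this; it batches the saves with save(_:span:commit:) and a single commit() at the end, though of course it doesn't answer the TaskGroup question:
func exportSequentially() throws {
    for i in 0...15 {
        let courseEvent = EKEvent(eventStore: eventStore)
        courseEvent.title = "TEST \(i)"
        courseEvent.startDate = .now
        courseEvent.endDate = .now.addingTimeInterval(3600)
        courseEvent.calendar = eventStore.defaultCalendarForNewEvents
        // Queue the save without committing each event individually.
        try eventStore.save(courseEvent, span: .futureEvents, commit: false)
    }
    // Flush all queued saves in one batch.
    try eventStore.commit()
}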
I'm really exhausted🙃🙃.
Background
I use AVAssetWriterInput.append to append sample buffers to the writer. Sometimes I switch off the audio input (if the user wants to temporarily disable audio), so the audio input's append is no longer called, while the video input's append is always called.
Problem
If the user pauses the audio and resumes it later, then in the final video the audio recorded after resuming starts immediately at the point where the user paused, with no gap.
Example
'=' refers to a CMSampleBuffer.
'|' marks where the user paused the audio input.
Video: ---------------=================================
Audio(expected): ----=======|----------------=============
Audio(I got): ---------=======|=============----------------
I printed the presentationTime from the audio sample buffers, and it turns out to be correct.
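This is roughly how I read the timestamps before appending (sampleBuffer and audioInput are placeholder names; nothing is adjusted):
// Log the audio buffer's presentation timestamp before appending it.
let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
print("audio PTS:", CMTimeGetSeconds(pts))
audioInput.append(sampleBuffer)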
Maybe my understanding of AVAssetWriterInput.append is wrong?
My current workaround is to always append a buffer; when the user wants to pause, I simply append an empty sample buffer filled with nothing.
I don't think this is the best way to deal with it.
Is there any way to keep the audio buffer timing in sync with the video?
I created a screen recording app.
Before appending a buffer to the writer, I check that isReadyForMoreMediaData is true.
But sometimes it constantly returns false and doesn't become true until I call markAsFinished.
I have no idea how to fix this.
let videoCompressionProperties = [
    AVVideoAverageBitRateKey: resolution.dataRate
]
let videoSettings: [String: Any] = [
    AVVideoCodecKey: encoding,
    AVVideoWidthKey: resolution.size.width,
    AVVideoHeightKey: resolution.size.height,
    AVVideoCompressionPropertiesKey: videoCompressionProperties,
    AVVideoScalingModeKey: AVVideoScalingModeResizeAspect,
    AVVideoColorPropertiesKey: colorSpace.properties.dictionary,
]
videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoSettings)
videoInput?.expectsMediaDataInRealTime = true
Append buffers:
if videoInput?.isReadyForMoreMediaData == true {
    guard buffer.imageBuffer != nil else { return }
    processQueue.async { [self] in
        videoInput?.append(buffer)
    }
} else {
    logger.warning("Dropped a frame.")
    // I found this on Stack Overflow, but it had no effect.
    // RunLoop.current.run(until: .now.addingTimeInterval(0.1))
    // The code below notifies me when the bug occurs.
    // errorFixedPublisher.send()
    // if !self.errorOccured {
    //     self.errorOccured = true
    //     let center = UNUserNotificationCenter.current()
    //     let content = UNMutableNotificationContent()
    //     content.sound = UNNotificationSound.defaultCritical
    //     content.badge = 1
    //     content.body = "Dropping frames at \(Date.now.formatted(date: .omitted, time: .standard))"
    //     let trigger = UNTimeIntervalNotificationTrigger(timeInterval: 1, repeats: false)
    //     let req = UNNotificationRequest(identifier: "ASWRI_ERR", content: content, trigger: trigger)
    //     center.add(req)
    // }
}
}
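One diagnostic I'm thinking of adding in the else branch (a sketch; assetWriter here is the AVAssetWriter that owns videoInput): as far as I understand, once the writer itself moves to .failed its inputs never report ready again, so logging the writer's status and error seems worth doing.
// Sketch: check whether the owning writer has failed, since a failed
// writer won't make its inputs ready again.
if assetWriter.status == .failed {
    logger.error("Writer failed: \(String(describing: assetWriter.error))")
}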
Any ideas? Please help me.
Have a good day!
If you configure a Window and a MenuBarExtra in your app, try this:
Open your app and press Command+H to hide it.
Place an iPad or another Mac nearby, connect the two devices via Universal Control, and click on the other device so your Mac's menu bar becomes inactive.
Move your cursor back to your Mac (the first device) and click on an empty area of the menu bar.
Your app terminates unexpectedly.
This behavior only appears in apps that use Window, so to work around it we would have to switch from Window to WindowGroup, which is not what we want.
I reproduced this issue with Apple's ScreenCaptureKit sample code. This is a SwiftUI Scene bug; please fix it.
@main
struct CaptureSampleApp: App {
    var body: some Scene {
        Window("ID", id: "ID") {
            ContentView()
                .frame(minWidth: 960, minHeight: 724)
                .background(.black)
        }

        MenuBarExtra(
            "App Menu Bar Extra", systemImage: "star"
        ) {
            Text("Hello")
        }
    }
}
Filed a feedback: FB11447959.
If I set queueDepth to a value smaller than 6 (5, for example), the stream pauses when I start capturing and then minimize or hide it.
Is that correct behavior or a bug?
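For reference, a minimal sketch of the configuration I'm describing (other properties omitted):
import ScreenCaptureKit
import CoreMedia

let configuration = SCStreamConfiguration()
configuration.queueDepth = 5   // values below 6 are where I see the stream stall
configuration.minimumFrameInterval = CMTime(value: 1, timescale: 60)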
Filed a report: FB11441320
Menu {
    ShareLink(...)
    ShareLink(...)
    ShareLink(...)
} label: {
    Label("Share", systemImage: "square.and.arrow.up")
}
Here, I have different share options to choose from, but if I tap one of these share links, the share sheet doesn't pop up.
Is there any workaround to this issue?
Here is the implementation.
import SwiftUI
import PencilKit

struct DrawingCanvas_bug: UIViewRepresentable {
    typealias UIViewType = PKCanvasView

    private let toolPicker = PKToolPicker()

    @Binding var drawing: PKDrawing
    var isOpaque: Bool = true
    var drawingDidChange: ((PKDrawing) -> Void)?

    func makeUIView(context: Context) -> PKCanvasView {
        let canvasView = PKCanvasView(frame: .zero)
        canvasView.drawing = drawing
        canvasView.delegate = context.coordinator
        canvasView.backgroundColor = .clear
        canvasView.isOpaque = isOpaque
        canvasView.alwaysBounceVertical = true
        // The size of the canvas is zero here, so I cannot update the zoom scale yet.
        // context.coordinator.updateZoomScale(for: canvasView)
        toolPicker.setVisible(true, forFirstResponder: canvasView)
        toolPicker.addObserver(canvasView)
        canvasView.becomeFirstResponder()
        return canvasView
    }

    func updateUIView(_ canvasView: PKCanvasView, context: Context) {
        DispatchQueue.main.async {
            // Here we get the correct zoom scale and content size, but part of the drawing is not visible.
            // The selection tool can still select the invisible strokes.
            context.coordinator.updateZoomScale(for: canvasView)
        }
    }

    func makeCoordinator() -> Coordinator {
        Coordinator(self)
    }

    static func dismantleUIView(_ canvasView: PKCanvasView, coordinator: Coordinator) {
        canvasView.resignFirstResponder()
    }
}

extension DrawingCanvas_bug {
    class Coordinator: NSObject, PKCanvasViewDelegate {
        var host: DrawingCanvas_bug

        init(_ host: DrawingCanvas_bug) {
            self.host = host
        }

        func canvasViewDrawingDidChange(_ canvasView: PKCanvasView) {
            host.drawing = canvasView.drawing
            if let action = host.drawingDidChange {
                action(canvasView.drawing)
            }
            updateContentSizeForDrawing(for: canvasView)
        }

        func updateZoomScale(for canvasView: PKCanvasView) {
            let canvasScale = canvasView.bounds.width / 768
            canvasView.minimumZoomScale = canvasScale
            canvasView.maximumZoomScale = canvasScale
            canvasView.zoomScale = canvasScale
            updateContentSizeForDrawing(for: canvasView)
        }

        func updateContentSizeForDrawing(for canvasView: PKCanvasView) {
            let drawing = canvasView.drawing
            let contentHeight: CGFloat
            if !drawing.bounds.isNull {
                contentHeight = max(canvasView.bounds.height, (drawing.bounds.maxY + 500) * canvasView.zoomScale)
            } else {
                contentHeight = canvasView.bounds.height
            }
            canvasView.contentSize = CGSize(width: 768 * canvasView.zoomScale, height: contentHeight)
        }
    }
}
And here is how I use it:
NavigationSplitView(columnVisibility: .constant(.doubleColumn)) {
    List(selection: $selection) {
        ForEach(drawingModel.drawings, id: \.uuidString) {
            DrawingRow(drawingData: $0)
        }
    }
} detail: {
    DrawingView()
}
.navigationSplitViewStyle(.balanced) // <- If I use automatic style, PKCanvasView's size is correct
How do I add a hosting base URL to a package (not a project)?
Is it possible to add an existing Swift package to my project as a target?
App Shortcuts can only be added from the Shortcuts action list; a separate App Shortcuts section no longer appears at the bottom of the Shortcuts app.
Right now only the default Voice Memos app has one.
It appeared successfully in beta 2/3, although I'm not sure whether it still appeared in the beta 3 update.
But in beta 4 it disappeared, and I have no idea how to make it visible again!
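For context, my provider looks roughly like this (MyIntent is a simplified stand-in for my real intent):
import AppIntents

// Hypothetical intent standing in for my real one.
struct MyIntent: AppIntent {
    static var title: LocalizedStringResource = "Check Validity"

    func perform() async throws -> some IntentResult {
        return .result()
    }
}

struct MyAppShortcuts: AppShortcutsProvider {
    static var appShortcuts: [AppShortcut] {
        AppShortcut(
            intent: MyIntent(),
            phrases: ["Check validity in \(.applicationName)"]
        )
    }
}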
In iOS 16, the primary action of a Menu cannot be triggered. The menu itself always pops up instead of performing the primary action.
I previewed the same code in an iOS 15 simulator, and it works well there.
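A minimal example of the kind of Menu I mean (the buttons are placeholders, but this is enough to reproduce it):
Menu {
    Button("Option A") { print("Option A") }
    Button("Option B") { print("Option B") }
} label: {
    Label("Actions", systemImage: "ellipsis.circle")
} primaryAction: {
    // On iOS 15 a plain tap runs this; on iOS 16 the menu pops up instead.
    print("Primary action")
}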
Is there any convenient way to back-deploy NavigationStack or NavigationSplitView?
The real problem is that if I want to support iOS 15 or 14, I have to conditionally switch between NavigationView and NavigationStack / NavigationSplitView.
Here is what I did for NavigationStack, but I have no idea how to deal with NavigationSplitView (my best attempt is sketched after the code below).
import SwiftUI

struct NavigationStack<Content: View>: View {
    var content: () -> Content

    var body: some View {
        if #available(iOS 16.0, macOS 13.0, *) {
            SwiftUI.NavigationStack {
                content()
            }
        } else {
            NavigationView {
                content()
            }
            .navigationViewStyle(.stack)
        }
    }
}
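For NavigationSplitView, the best I've sketched so far is a similar wrapper that only covers the simple two-column case (sidebar plus detail, no programmatic selection); I'm not sure it's the right approach:
struct NavigationSplitView<Sidebar: View, Detail: View>: View {
    var sidebar: () -> Sidebar
    var detail: () -> Detail

    var body: some View {
        if #available(iOS 16.0, macOS 13.0, *) {
            SwiftUI.NavigationSplitView {
                sidebar()
            } detail: {
                detail()
            }
        } else {
            // On iPad and Mac, a two-view NavigationView already behaves
            // like a sidebar/detail split by default.
            NavigationView {
                sidebar()
                detail()
            }
        }
    }
}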
Will the new NavigationStack and NavigationSplitView be back-deployed to older OS versions? I think the behaviors they provide aren't really new features compared with previous OS releases.
Error Code: error build: Command CompileSwift failed with a nonzero exit code
My Code:
.backgroundTask(.appRefresh("checkValidity")) {
    // scheduleAppRefresh()
    // checkRecords()
}
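For context, the commented-out scheduleAppRefresh() is the usual BGTaskScheduler submission, roughly like this (a sketch with hypothetical timing; the identifier matches the one passed to .backgroundTask):
import BackgroundTasks

func scheduleAppRefresh() {
    // "checkValidity" is also listed under
    // "Permitted background task scheduler identifiers" in Info.plist.
    let request = BGAppRefreshTaskRequest(identifier: "checkValidity")
    request.earliestBeginDate = .now.addingTimeInterval(3600)
    do {
        try BGTaskScheduler.shared.submit(request)
    } catch {
        print("Could not schedule app refresh: \(error)")
    }
}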