I have a SwiftUI app that I've been working on in Xcode 16.1. The project builds and runs in the simulators, on my Mac, and on my iPhone/iPad without any issues. I'm also able to build my unit test project and run the tests without any errors. The project has zero warnings in it.
When I go to the Edit Schemes options and change the Run scheme to be a Release build with the Debug Executable unchecked I get a compiler error:
Command SwiftCompile failed with a nonzero exit code
I've attempted this Release run with the following target devices in Xcode:
My iPhone 15 Pro Max (iOS 18.2 Beta 3)
MacBook Air (M1) (15.2 Beta)
iPhone 16 Simulator (iOS 18.1)
Any iOS Simulator Device (arm64, x86_64)
All of these targets have the same issue. Normally I would just debug the error from the logs, but when I look at the build output I can't see any information in the log to tell me what happened. It looks like the source files are sent into the Swift compiler and the compiler fails without bubbling up the issue.
I've provided the full error log export as a Gist HERE due to its size. Is there anything in the log I'm missing? Is there a way for me to turn on more verbose logging during compilation of a Release build?
I created a brand new Multiplatform App in Xcode and added all of my source files to it. No project configuration settings were changed. I could build it successfully with the Debug configuration. I then changed it to the Release configuration and experienced the same error. I can create another fresh project, make the same Release configuration change with none of my source files in it, and get a successful build.
It seems there is something wrong with the combination of my source files and the Release configuration, but the compiler doesn't indicate what. I'm lost at this point, as I can't figure out how to get a release build and can't find any indication as to why.
Hi, I'm trying to modify the ScreenCaptureKit Sample code by implementing an actor for Metal rendering, but I'm experiencing issues with frame rendering sequence.
My app workflow is:
ScreenCapture -> createFrame -> setRenderData
Metal draw callback -> renderAsync (getData from renderData)
I've added timestamps to verify frame ordering, and I'm also using a binary search to insert each frame by timestamp. While the timestamps appear to be in sequence, the actual rendering output seems out of order.
// ScreenCaptureKit sample
func createFrame(for sampleBuffer: CMSampleBuffer) async {
    if let surface: IOSurface = getIOSurface(for: sampleBuffer) {
        // The argument label is required to match setRenderData(surface:timeStamp:) below.
        await renderer.setRenderData(surface: surface, timeStamp: sampleBuffer.presentationTimeStamp.seconds)
    }
}
class Renderer {
    ...
    func setRenderData(surface: IOSurface, timeStamp: Double) async {
        _ = await renderSemaphore.getSetBuffers(
            isGet: false,
            surface: surface,
            timeStamp: timeStamp
        )
    }

    func draw(in view: MTKView) {
        Task {
            await renderAsync(view)
        }
    }
    func renderAsync(_ view: MTKView) async {
        guard await renderSemaphore.beginRender() else { return }
        guard let frame = await renderSemaphore.getSetBuffers(
            isGet: true, surface: nil, timeStamp: nil
        ) else {
            await renderSemaphore.endRender()
            return
        }
        // Note: as posted, this call doesn't match the
        // getRenderData(device:surface:depthData:) declaration further down
        // (the depthData parameter is missing here), so one of the two was
        // probably trimmed while excerpting.
        guard let texture = await renderSemaphore.getRenderData(
            device: self.device,
            surface: frame.surface) else {
            await renderSemaphore.endRender()
            return
        }
        guard let commandBuffer = _commandQueue.makeCommandBuffer(),
              let renderPassDescriptor = await view.currentRenderPassDescriptor,
              let renderEncoder = commandBuffer.makeRenderCommandEncoder(descriptor: renderPassDescriptor) else {
            await renderSemaphore.endRender()
            return
        }
        // Shaders ..
        renderEncoder.endEncoding()
        commandBuffer.addCompletedHandler { @Sendable _ in
            updateFPS()
        }
        // Commit the frame in the actor. Force-unwrapping currentDrawable
        // will crash if no drawable is available for this frame.
        let success = await renderSemaphore.commitFrame(
            timeStamp: frame.timeStamp,
            commandBuffer: commandBuffer,
            drawable: view.currentDrawable!
        )
        if !success {
            print("Frame dropped due to out-of-order timestamp")
        }
        await renderSemaphore.endRender()
    }
}
actor RenderSemaphore {
    private var frameBuffers: [FrameData] = []
    private var lastReadTimeStamp: Double = 0.0
    private var lastCommittedTimeStamp: Double = 0
    private var activeTaskCount = 0
    private var activeRenderCount = 0
    private let maxTasks = 3
    private var textureCache: CVMetalTextureCache?
    // Declared here so setTextureLoaded(_:) below compiles; it was missing from the excerpt.
    private var isTextureLoaded = false

    init() {
    }

    func initTextureCache(device: MTLDevice) {
        CVMetalTextureCacheCreate(kCFAllocatorDefault, nil, device, nil, &self.textureCache)
    }

    func beginRender() -> Bool {
        guard activeRenderCount < maxTasks else { return false }
        activeRenderCount += 1
        return true
    }

    func endRender() {
        if activeRenderCount > 0 {
            activeRenderCount -= 1
        }
    }

    func setTextureLoaded(_ loaded: Bool) {
        isTextureLoaded = loaded
    }
    func getSetBuffers(isGet: Bool, surface: IOSurface?, timeStamp: Double?) -> FrameData? {
        if isGet {
            if !frameBuffers.isEmpty {
                let frame = frameBuffers.removeFirst()
                if frame.timeStamp > lastReadTimeStamp {
                    lastReadTimeStamp = frame.timeStamp
                    print(frame.timeStamp)
                    return frame
                }
            }
            return nil
        } else {
            // Set
            let frameData = FrameData(
                surface: surface!,
                timeStamp: timeStamp!
            )
            // Insert at the right position to keep the buffer sorted by timestamp.
            let insertIndex = binarySearch(for: timeStamp!)
            frameBuffers.insert(frameData, at: insertIndex)
            return frameData
        }
    }
    private func binarySearch(for timeStamp: Double) -> Int {
        var left = 0
        var right = frameBuffers.count
        while left < right {
            let mid = (left + right) / 2
            if frameBuffers[mid].timeStamp > timeStamp {
                right = mid
            } else {
                left = mid + 1
            }
        }
        return left
    }
    // for setRenderDataNormalized
    func tryEnterTask() -> Bool {
        guard activeTaskCount < maxTasks else { return false }
        activeTaskCount += 1
        return true
    }

    func exitTask() {
        activeTaskCount -= 1
    }

    func commitFrame(timeStamp: Double,
                     commandBuffer: MTLCommandBuffer,
                     drawable: MTLDrawable) async -> Bool {
        guard timeStamp > lastCommittedTimeStamp else {
            print("Drop frame at commit: \(timeStamp) <= \(lastCommittedTimeStamp)")
            return false
        }
        commandBuffer.present(drawable)
        commandBuffer.commit()
        lastCommittedTimeStamp = timeStamp
        return true
    }
    func getRenderData(
        device: MTLDevice,
        surface: IOSurface,
        depthData: [Float]
    ) -> (MTLTexture, MTLBuffer)? {
        let _textureName = "RenderData"
        var px: Unmanaged<CVPixelBuffer>?
        let status = CVPixelBufferCreateWithIOSurface(kCFAllocatorDefault, surface, nil, &px)
        guard status == kCVReturnSuccess, let screenImage = px?.takeRetainedValue() else {
            return nil
        }
        // Guard the cache instead of force-unwrapping; it is nil until initTextureCache runs.
        guard let textureCache else { return nil }
        CVMetalTextureCacheFlush(textureCache, 0)
        var texture: CVMetalTexture? = nil
        let width = CVPixelBufferGetWidthOfPlane(screenImage, 0)
        let height = CVPixelBufferGetHeightOfPlane(screenImage, 0)
        let result2 = CVMetalTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            textureCache,
            screenImage,
            nil,
            MTLPixelFormat.bgra8Unorm,
            width,
            height,
            0, &texture)
        guard result2 == kCVReturnSuccess,
              let cvTexture = texture,
              let mtlTexture = CVMetalTextureGetTexture(cvTexture) else {
            return nil
        }
        mtlTexture.label = _textureName
        // Avoid force-unwrapping makeBuffer; it returns nil for a zero-length buffer.
        guard !depthData.isEmpty,
              let depthBuffer = device.makeBuffer(bytes: depthData,
                                                  length: depthData.count * MemoryLayout<Float>.stride) else {
            return nil
        }
        return (mtlTexture, depthBuffer)
    }
}
Above is my code. Could someone point out what might be wrong?
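One aspect of the flow above that may matter: unstructured tasks created with Task { } inside draw(in:) are not guaranteed to start or finish in creation order, so even correctly ordered timestamps can reach commitFrame out of order. A minimal, self-contained sketch of serializing frame work through a single AsyncStream consumer (names here are illustrative, not from the sample project):

import Foundation

// Sketch: one consumer loop pulls frames in strict FIFO order, instead of
// racing one Task per draw callback.
final class FrameSerializer {
    private let continuation: AsyncStream<Double>.Continuation

    init(render: @escaping @Sendable (Double) async -> Void) {
        let (stream, continuation) = AsyncStream<Double>.makeStream()
        self.continuation = continuation
        Task {
            // Frames come out in exactly the order they were yielded.
            for await timeStamp in stream {
                await render(timeStamp)
            }
        }
    }

    // Called from the MTKView draw callback; never blocks.
    func submit(timeStamp: Double) {
        continuation.yield(timeStamp)
    }
}

With a single consumer loop, frame ordering no longer depends on task scheduling.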
Hello,
I am using Xcode 16.1 (16B40) on macOS Sequoia 15.1.0 with a MacBook Pro M1 Max.
I am developing an app for iOS 17 and 18 using SwiftUI
I created UI tests to take the screenshots for the App Store on the simulator.
The tests run well and all of them succeed.
The problem appears when I try to get the screenshot files from the xcresult files after the test: there isn't any screenshot inside them.
I found a data folder and a Info.plist file. In the data folder there are a lot of files with this pattern data.03zD4C6IGFFthK14NwA8mNvcwFHT16g6Tl40Tl1YmBC1bNh6d0YIcnWKyUaQPDXoa8fYo6C3Xcv8xvMtE3_NEXA== and other files with this pattern refs.03zD4C6IGFFthK14NwA8mNvcwFHT16g6Tl40Tl1YmBC1bNh6d0YIcnWKyUaQPDXoa8fYo6C3Xcv8xvMtE3_NEXA==
OK, I tried to use fastlane to automate the screenshots, but the problem is still present. The xcresult files don't contain any PNG files.
I had no problems doing this (getting screenshots from an xcresult file) in previous versions of macOS and Xcode on my current machine.
I just updated my machine to macOS Sequoia 15.1.1 and the problem is still present.
Honestly, I don't know how to fix this situation.
With Xcode 15 I had no problems with this, but I'm not sure whether Xcode 16.0 ran without problems, because I didn't need this functionality in those months.
Here is my code for a UITest:
import XCTest

final class ScreenshotsUITests: XCTestCase {
    let app = XCUIApplication()
    let device = "iPhone16"

    override func setUpWithError() throws {
        continueAfterFailure = true
    }

    override func tearDownWithError() throws {}

    @MainActor
    func testEnglishScreens() throws {
        let lang = "en"
        app.launchArguments.append("UITestMode")
        app.launchArguments += ["-AppleLanguages", "(en)"]
        app.launchArguments += ["-AppleLocale", "en_US"]
        app.launch()
        executeTestsForMenus(lang: lang, backLabel: "Back")
        executeTestForMatch(lang: lang)
    }

    @MainActor
    func testSpanishScreens() throws {
        let lang = "es"
        app.launchArguments.append("UITestMode")
        app.launchArguments += ["-AppleLanguages", "(es)"]
        app.launchArguments += ["-AppleLocale", "es_ES"]
        app.launch()
        executeTestsForMenus(lang: lang, backLabel: "Atrás")
        executeTestForMatch(lang: lang)
    }

    private func executeTestForMatch(lang: String) {
        let startButton = app.buttons["start-button"]
        startButton.tap()
        let key4 = app.buttons["key-4"]
        XCTAssertTrue(key4.waitForExistence(timeout: 30), "Key 4 in match screen is not found")
        key4.tap()
        let key2 = app.buttons["key-2"]
        XCTAssertTrue(key2.exists, "Key 2 in match screen is not found")
        key2.tap()
        makeScreenShot("playing", lang: lang)
        let closeButton = app.buttons["close-button"]
        XCTAssertTrue(closeButton.exists, "Close button in match screen is not found")
        closeButton.tap()
    }

    private func executeTestsForMenus(lang: String, backLabel: String) {
        let mainHeader = app.staticTexts["Math match"]
        XCTAssertTrue(mainHeader.exists, "Header in main screen is not found")
        makeScreenShot("mainMenu", lang: lang)
        let settingsButton = app.buttons["settings-button"]
        XCTAssertTrue(settingsButton.exists, "Settings button in main screen is not found")
        settingsButton.tap()
        makeScreenShot("Settings", lang: lang)
        let backButton = app.buttons[backLabel]
        XCTAssertTrue(backButton.exists, "Back button in match screen is not found")
        backButton.tap()
        let helpButton = app.buttons["help-button"]
        XCTAssertTrue(helpButton.exists, "Help button in main screen is not found")
        helpButton.tap()
        makeScreenShot("Help", lang: lang)
        backButton.tap()
        let scoreButton = app.buttons["score-button"]
        XCTAssertTrue(scoreButton.exists, "Scores button in main screen is not found")
        scoreButton.tap()
        makeScreenShot("Scores", lang: lang)
        backButton.tap()
        let playButton = app.buttons["play-button"]
        XCTAssertTrue(playButton.exists, "Play button in main screen is not found")
        playButton.tap()
        makeScreenShot("matchBuilder", lang: lang)
        let startButton = app.buttons["start-button"]
        XCTAssertTrue(startButton.exists, "Start button in match builder screen is not found")
    }

    private func makeScreenShot(_ name: String, lang: String) {
        takeScreenshot(app, named: "\(lang)-\(name)-\(device)")
    }
}
import XCTest

extension XCTestCase {
    func takeScreenshot(_ app: XCUIApplication, named name: String, fullScreen: Bool = false) {
        let screenshot: XCUIScreenshot
        if fullScreen {
            screenshot = app.windows.firstMatch.screenshot()
        } else {
            screenshot = XCUIScreen.main.screenshot()
        }
        let screenshotAttachment = XCTAttachment(
            uniformTypeIdentifier: "public.png",
            name: "screenshot-\(name).png",
            payload: screenshot.pngRepresentation,
            userInfo: nil)
        screenshotAttachment.lifetime = .keepAlways
        add(screenshotAttachment)
    }
}
and here is the content of my testplan file:
{
"configurations" : [
{
"id" : "35BC7C0B-9A5A-4027-9F30-36958C4C1AAF",
"name" : "Test Scheme Action",
"options" : {
"preferredScreenCaptureFormat" : "screenshot",
"testExecutionOrdering" : "random",
"uiTestingScreenshotsLifetime" : "keepAlways",
"userAttachmentLifetime" : "keepAlways"
}
}
],
"defaultOptions" : {
"targetForVariableExpansion" : {
"containerPath" : "container:myAppProject.xcodeproj",
"identifier" : "B27D1B022CA00314001A259B",
"name" : "MyAppProject"
}
},
"testTargets" : [
{
"parallelizable" : true,
"target" : {
"containerPath" : "container:MyAppProject.xcodeproj",
"identifier" : "B27D1B122CA00315001A259B",
"name" : "MyAppProjectTests"
}
},
{
"parallelizable" : true,
"target" : {
"containerPath" : "container:MyAppProject.xcodeproj",
"identifier" : "B27D1B1C2CA00315001A259B",
"name" : "MyAppProjectUITests"
}
}
],
"version" : 1
}
I ran tests with old projects on my machine and those projects have the same problem with screenshot files in the xcresult bundles.
I don't know if the problem is my machine, my Xcode, macOS, or something else, and I don't know how to fix it.
Please, can anyone help me?
Thanks in advance
Is there a way for an application to know when the user unlocks an iOS device?
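(One possibly relevant signal, offered as a hedged sketch rather than a definitive answer: UIKit posts protectedDataDidBecomeAvailableNotification when file-protected data becomes readable again, which on a passcode-protected device corresponds to an unlock; there is no general-purpose "device unlocked" event.)

import UIKit

// Sketch: observe the moment protected data becomes available, which
// correlates with the user unlocking a passcode-protected device.
let observer = NotificationCenter.default.addObserver(
    forName: UIApplication.protectedDataDidBecomeAvailableNotification,
    object: nil,
    queue: .main
) { _ in
    print("Protected data available; the device was unlocked")
}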
Hi team,
We're using CocoaPods in our project, and we noticed the compiler fails to build certain targets with "Missing required module 'x'" when building them with Swift 6:
We realized the modules the compiler is complaining about are pod dependencies required by the other target dependencies, and that this error only appears when building with Swift 6, unless those dependencies are declared in the Podfile as direct dependencies of the target or we include them in the framework search paths. For example, the error in the image above will show if module 'X' imports 'Y', 'Y' imports 'CocoaLumberJack', and we don't specify a direct dependency between 'X' and 'CocoaLumberJack'.
Regardless of the fact that we can manually add the missing module location to the target search paths, we'd like to understand why we're facing this issue in the first place and what changed between Swift 5 and Swift 6 that now requires us to explicitly declare these dependencies. I'd appreciate it if someone could tell us more about this. We're particularly interested in knowing whether this is an intentional change and how to handle it properly.
Thanks
Hi, the app crashes with the message thunk for @escaping @callee_guaranteed (@guaranteed NSURLSessionTask) -> () + 48 (:0), and I am unable to understand what it means or why it is happening. I want to understand the reason for this message.
Thank you.
Hi,
I'm developing a MusicKit integration in my iOS app, and I want to select songs from recently played (done). The problem is that the queue is not auto-generated, and the user has to select another song if they want to go forward.
Is there any method to ask for similar songs, or recommended songs, based on a song the user has already selected?
It would be really great :)
Also, if you know: is there any publisher for the playback position, or do I need to use a timer? Thanks.
David.
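(A possibly useful pointer, hedged: MusicKit doesn't expose a song-seeded similarity request that I know of, but MusicPersonalRecommendationsRequest returns the listener's personalized recommendations, which may be close enough. A minimal sketch:)

import MusicKit

// Sketch: fetch the user's personal recommendations. Note this is
// user-level, not seeded by a specific song.
func printRecommendations() async {
    do {
        let request = MusicPersonalRecommendationsRequest()
        let response = try await request.response()
        for recommendation in response.recommendations {
            print(recommendation.title ?? "Untitled")
        }
    } catch {
        print("Recommendations request failed: \(error)")
    }
}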
I have a SwiftUI view with Button(intent:), and I'm using UIHostingController to use it in UIKit. The problem is that the button doesn't work in UIKit, but a normal button (without an intent) works.
Hello everyone,
I'm currently working through this example project Exploring object tracking with ARKit to learn how to use Object Tracking with visionOS.
I was able to modify the example project to overlay a different model onto the detected object. However, with the example code, I'm only able to track one instance of the trained object at a time. I'm wondering if there is a way to track multiple instances of the same object?
For example, I have a usdz model of a box, then trained that box for object tracking, I'm able to overlay a model of a chair over that box once it's detected. But now, I have multiple copies of that same box, and I want to arrange them so that when I wear the Vision Pro, I can see the chairs arranged however I want.
I'm still new to visionOS development, so I'm not sure if there's a way to accomplish that by just training 1 object and having copies of it.
If it helps, this is my current modification to overlay a virtual object on top of a detected object.
func loadReferenceObjects() async -> [ReferenceObject] {
    // Get a list of all reference object files in the app's main bundle and attempt to load each.
    var referenceObjectFiles: [String] = []
    if let resourcesPath = Bundle.main.resourcePath {
        print("resource path: \(resourcesPath)")
        try? referenceObjectFiles = FileManager.default.contentsOfDirectory(atPath: resourcesPath).filter { $0.hasSuffix(".referenceobject") }
    }
    await withTaskGroup(of: Void.self) { group in
        for file in referenceObjectFiles {
            let objectURL = Bundle.main.bundleURL.appending(path: file)
            group.addTask {
                // Get the file name without its extension.
                let fileNameWithoutExtension = (file as NSString).deletingPathExtension
                // Load each reference object as a child task.
                // (Note: the loads run concurrently and loadSingleReferenceObject appends to a
                // shared array; returning values from the task group would avoid a data race.)
                await self.loadSingleReferenceObject(url: objectURL, fileName: fileNameWithoutExtension)
            }
        }
    }
    return self.referenceObjects
}
// Private helper method to load a single object and assign an entity to it.
private func loadSingleReferenceObject(url: URL, fileName: String) async {
    var referenceObject: ReferenceObject
    do {
        print("Loading reference object from \(url)")
        // Load the file as a `ReferenceObject` - this can take a while for larger objects.
        referenceObject = try await ReferenceObject(from: url)
    } catch {
        fatalError("Failed to load reference object with error \(error)")
    }
    // Add the reference object to the array.
    self.referenceObjects.append(referenceObject)
    // Entity for each file.
    var model = Entity()
    // Pick the entity according to the file name.
    switch fileName {
    // The Box1 reference object binds to Chair1.
    case "Box1":
        do {
            model = try await Entity(named: "chair1", in: realityKitContentBundle)
        } catch {
            print("Failed to load chair1")
        }
    // The Box2 reference object binds to Chair2.
    case "Box2":
        do {
            model = try await Entity(named: "chair2", in: realityKitContentBundle)
        } catch {
            print("Failed to load chair2")
        }
    default:
        print("no model associated with this file name: \(fileName)")
    }
    // Map the entity to its reference object.
    usdzPerReferenceObject[referenceObject.id] = model
}
Any help or suggestions would be greatly appreciated.
Thank you.
Hi,
I am implementing synchronization of SwiftData with CloudKit as described in the Apple documentation titled "Syncing model data across a person's devices." My app runs fine on iPhone without CloudKit activated under the "Signing and Capabilities" option, but when it is activated, I get a Core Data error with code 134060. My app is in the development stage. The following code snippet, for your reference, is taken from the main structure adopting the App protocol.
init() {
    do {
        // schema and modelConfiguration are declared outside the #if DEBUG block
        // so the ModelContainer line at the bottom also compiles in Release builds.
        let schema = Schema([
            Debit.self,
            Credit.self,
        ])
        let modelConfiguration = ModelConfiguration(schema: schema, isStoredInMemoryOnly: false)
        #if DEBUG
        // Use an autorelease pool to make sure Swift deallocates the persistent
        // container before setting up the SwiftData stack.
        try autoreleasepool {
            let desc = NSPersistentStoreDescription(url: modelConfiguration.url)
            let opts = NSPersistentCloudKitContainerOptions(containerIdentifier: "iCloud.com.sureshco.MyFirstApp")
            desc.cloudKitContainerOptions = opts
            // Load the store synchronously so it completes before initializing the CloudKit schema.
            desc.shouldAddStoreAsynchronously = false
            if let mom = NSManagedObjectModel.makeManagedObjectModel(for: [
                Debit.self,
                Credit.self,
            ]) {
                let container = NSPersistentCloudKitContainer(name: "MyFirstApp", managedObjectModel: mom)
                container.persistentStoreDescriptions = [desc]
                container.loadPersistentStores { _, err in
                    if let err {
                        fatalError(err.localizedDescription)
                    }
                }
                // Initialize the CloudKit schema after the store finishes loading.
                try container.initializeCloudKitSchema()
                // Remove and unload the store from the persistent container.
                if let store = container.persistentStoreCoordinator.persistentStores.first {
                    try container.persistentStoreCoordinator.remove(store)
                }
            }
        }
        #endif
        sharedModelContainer = try ModelContainer(for: schema, configurations: [modelConfiguration])
    } catch {
        fatalError(error.localizedDescription)
    }
}
Any help will be greatly appreciated!
Regards
Suresh.
The capability to read and write HFS disks on the Mac was removed a long time ago.
The capability to simply read them was also removed, since Catalina I think.
That is surprising and sometimes frustrating. I still use a '90s MacBook for a few tasks and need, from time to time, to transfer files to a newer Mac or read old files stored on 3.5" disks.
The solution I use is to read the disk on an old Mac running Mac OS X 10.6 (I'm lucky enough to have kept one) and transfer via USB stick or AirDrop…
As there is of course no USB port on the MacBook (and I no longer have a working 56k modem to transfer by mail), the only option besides the 3.5" disk is using the MacBook's PCMCIA port to write to an SD card that can be read on macOS Sonoma. But reading 3.5" disks directly would be great.
Hence my questions for the forum:
how hard would it be to write such a driver for reading (only) HFS on macOS Sonoma?
there is some software like FuseHFS. Has anyone tried it? Has anyone looked at the source code (said to be open source)?
does anyone know why Apple removed this capability? (I thought it was a tiny piece of code compared to the GBs of present-day macOS.)
Thanks for any insights on the matter.
Hello, I'm kind of new to Xcode and I was trying to get back into programming while following a course.
There seemed to be a problem with the preview, and it looks like there is nothing wrong with the code, at least from what I know. I can say this because the code is the default code Xcode gives you when you start a new project.
Here is the error log.
== PREVIEW UPDATE ERROR:
SchemeBuildError: Failed to build the scheme “App”
linker command failed with exit code 1 (use -v to see invocation)
Link App.debug.dylib (arm64):
Undefined symbols for architecture arm64:
"_main", referenced from:
___debug_main_executable_dylib_entry_point in command-line-aliases-file
ld: symbol(s) not found for architecture arm64
clang: error: linker command failed with exit code 1 (use -v to see invocation)
I am encountering an issue when making an API call using URLSession with DispatchQueue.global(qos: .background).async on a real device running tvOS 18. The code works as expected on tvOS 17 and in the simulator for tvOS 18, but on the real device, with debug mode disabled, the data takes a few minutes (5 to 10) to load after the API call.
Code: Here’s the code I am using for the API call:
appconfig.getFeedURLData(feedUrl: feedUrl, timeOut: kRequestTimeOut, apiMethod: ApiMethod.POST.rawValue) { (result) in
    self.EpisodeItems = Utilities.sharedInstance.getEpisodeArray(data: result)
}
func getFeedURLData(feedUrl: String, timeOut: Int, apiMethod: String, completion: @escaping (_ result: Data?) -> ()) {
    // Call the completion handler on the failure path too, so callers aren't left hanging.
    guard let validUrl = URL(string: feedUrl) else {
        completion(nil)
        return
    }
    var request = URLRequest(url: validUrl, cachePolicy: .useProtocolCachePolicy, timeoutInterval: TimeInterval(timeOut))
    let userPasswordString = "\(KappSecret):\(KappPassword)"
    let userPasswordData = userPasswordString.data(using: .utf8)
    let base64EncodedCredential = userPasswordData!.base64EncodedString(options: .lineLength64Characters)
    let authString = "Basic \(base64EncodedCredential)"
    let headers = [
        "authorization": authString,
        "cache-control": "no-cache",
        "user-agent": "TN-CTV-\(kPlateForm)-\(kAppVersion)"
    ]
    request.addValue("application/json", forHTTPHeaderField: "Content-Type")
    request.httpMethod = apiMethod
    request.allHTTPHeaderFields = headers
    let response = URLSession.requestSynchronousData(request as URLRequest)
    if response.1 != nil {
        do {
            guard let parsedData = try JSONSerialization.jsonObject(with: response.1!, options: .mutableContainers) as? AnyObject else {
                print("Error parsing data")
                completion(nil)
                return
            }
            print(parsedData)
            completion(response.1)
            return
        } catch let error {
            print("Error: \(error.localizedDescription)")
            completion(response.1)
            return
        }
    }
    completion(response.1)
}
import Foundation

public extension URLSession {
    // Blocks the calling thread on a semaphore until the request completes.
    // (The redundant `public` on members of a public extension was removed.)
    static func requestSynchronousData(_ request: URLRequest) -> (URLResponse?, Data?) {
        var data: Data? = nil
        var responseData: URLResponse? = nil
        let semaphore = DispatchSemaphore(value: 0)
        let task = URLSession.shared.dataTask(with: request) { taskData, response, error in
            data = taskData
            responseData = response
            if data == nil, let error = error {
                print(error)
            }
            semaphore.signal()
        }
        task.resume()
        _ = semaphore.wait(timeout: .distantFuture)
        return (responseData, data)
    }

    static func requestSynchronousDataWithURLString(_ requestString: String) -> (URLResponse?, Data?) {
        guard let url = URL(string: requestString.checkValidUrl()) else { return (nil, nil) }
        let request = URLRequest(url: url)
        return URLSession.requestSynchronousData(request)
    }
}
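For comparison, a sketch of the same request using the async URLSession API available on tvOS 15 and later, which suspends instead of blocking a thread on a semaphore (identifiers here are illustrative, not from the project above):

import Foundation

// Sketch: async replacement for the semaphore-based wrapper above. Blocking
// with DispatchSemaphore on a background queue can interact badly with QoS;
// async/await suspends the task instead of blocking a thread.
func fetchFeedData(from url: URL, timeout: TimeInterval) async throws -> Data {
    var request = URLRequest(url: url, timeoutInterval: timeout)
    request.httpMethod = "POST"
    let (data, response) = try await URLSession.shared.data(for: request)
    guard let http = response as? HTTPURLResponse, (200..<300).contains(http.statusCode) else {
        throw URLError(.badServerResponse)
    }
    return data
}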
Issue Description: Working scenario: the API call works fine on tvOS 17 and in the simulator for tvOS 18. Problem: when running on a real device with tvOS 18, the API call takes a long time when debug mode is disabled but works fine when debug mode is enabled; the data loads only after a few minutes.
Error message: Error Domain=WKErrorDomain Code=11 "Timed out while loading attributed string content" UserInfo={NSLocalizedDescription=Timed out while loading attributed string content} NSURLConnection finished with error - code -1001 nw_read_request_report [C4] Receive failed with error "Socket is not connected" Snapshot request 0x30089b3c0 complete with error: <NSError: 0x3009373f0; domain: BSActionErrorDomain; code: 1 ("response-not-possible")> tcp_input [C7.1.1.1:3] flags=[R] seq=817957096, ack=0, win=0 state=CLOSE_WAIT rcv_nxt=817957096, snd_una=275546887
Environment: Xcode version: 16.1 Real device: Model A1625 (32GB) tvOS version: 18.1
Debugging steps I’ve taken: I’ve verified that the issue does not occur in debug mode. I’ve confirmed that the API call works fine on tvOS 17 and in the simulator (tvOS 18). The error suggests a network timeout (-1001) and a socket connection issue ("Socket is not connected").
Questions:
Is this a known issue with tvOS 18 on real devices? Are there any specific settings or configurations in tvOS 18 that could be causing the timeout error in non-debug mode? Could this be related to how URLSession or networking behaves differently in release mode? I would appreciate any help or insights into this issue!
I am trying to use the DuckDB library in a SwiftUI app. It does not seem to be able to allocate as much memory as is available to it, and I was wondering if there is any setting I can tweak. I've logged the issue in their GitHub repo too, with full details.
I can't seem to find much documentation on the observe(_:options:changeHandler:) method itself. There is a page with some example use, but no link to a 'real' documentation page for the method. Additionally, the Quick Help has a little information, but no parameter definitions and no explanation of whether or how the return value or the changeHandler closure is retained.
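For reference, a minimal usage sketch (the Player type is illustrative). The key behavioral point, as far as I can tell, is that observation lasts only as long as the returned NSKeyValueObservation token is retained; it stops when the token deallocates or invalidate() is called:

import Foundation

final class Player: NSObject {
    @objc dynamic var score = 0
}

let player = Player()

// Keep the returned token alive for as long as the observation should run.
let token: NSKeyValueObservation = player.observe(\.score, options: [.old, .new]) { _, change in
    print("score changed: \(change.oldValue ?? 0) -> \(change.newValue ?? 0)")
}

player.score = 10   // prints "score changed: 0 -> 10"
token.invalidate()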
We have a rare crash in one of our frameworks that I am unable to figure out.
Attaching the crash log here. Any idea how to debug this to find the root cause?
Crash.crash
Hello,
I'm a deaf-blind programmer.
I'm experiencing memory issues in my app. Essentially, I'm writing a video.
In this output video, I get content from two sources.
The first source is an already recorded video of 18 seconds (just for testing). It will be shown at the beginning of the output video.
The second source is an array with photos and another array with audio buffers from AVSpeechSynthesizer.write(). The photos will be added along with the audio buffers to the output video, right after adding the 18-second video.
So, in the end, the output video should be:
18-second video + array of photos as video images and, for audio, the buffers from AVSpeechSynthesizer.write().
However, my app crashes as soon as I start the first process.
I'm using AVAssetWriter to write the video and AVAssetReader to read the video.
Below, I'll show the code where I get the CMSampleBuffer.
I'd like an example of how to add the 18-second video to the beginning of the output video.
It doesn't need to be a big piece of code.
Here it is:
// Variables
var audioReaderBuffers = [CMSampleBuffer]()
var videoReaderBuffers = [(frame: CVPixelBuffer, time: CMTime)]()

// Get the CMSampleBuffers of a video asset
if let videoURL = videoURL {
    let videoAsset = AVAsset(url: videoURL)
    Task {
        let videoAssetTrack = try await videoAsset.loadTracks(withMediaType: .video).first!
        let audioTrack = try await videoAsset.loadTracks(withMediaType: .audio).first!
        let reader = try AVAssetReader(asset: videoAsset)
        let videoSettings = [
            kCVPixelBufferPixelFormatTypeKey: kCVPixelFormatType_32BGRA,
            kCVPixelBufferWidthKey: videoAssetTrack.naturalSize.width,
            kCVPixelBufferHeightKey: videoAssetTrack.naturalSize.height
        ] as [String: Any]
        let readerVideoOutput = AVAssetReaderTrackOutput(track: videoAssetTrack, outputSettings: videoSettings)
        let audioSettings = [
            AVFormatIDKey: kAudioFormatLinearPCM,
            AVSampleRateKey: 44100,
            AVNumberOfChannelsKey: 2
        ] as [String: Any]
        let readerAudioOutput = AVAssetReaderTrackOutput(track: audioTrack,
                                                         outputSettings: audioSettings)
        reader.add(readerVideoOutput)
        reader.add(readerAudioOutput)
        reader.startReading()
        // Video CMSampleBuffers
        while let sampleBuffer = readerVideoOutput.copyNextSampleBuffer() {
            autoreleasepool {
                if let imgBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                    let pixBuf = imgBuffer as CVPixelBuffer
                    let pTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
                    videoReaderBuffers.append((frame: pixBuf, time: pTime))
                }
            }
        }
    }
}
My app does not automatically switch languages (voices) in VoiceOver when VoiceOver is on and the screen includes both English and Spanish content. Instead of switching to the correctly accented voice, the content is announced using whatever my manual Voices rotor setting is. I can manually switch the voice in the rotor to make words sound intelligible, but my main concern is that language changes are not auto-detected, even though that feature is on in my Settings.
VoiceOver does detect language changes in other apps, so I think there must be either misplaced or missing accessibilityLanguage strings somewhere in my app. Or is it more than that, for localization considerations?
I reached out to the Apple Accessibilty team and was directed to open a ticket here, as my question is about the underlying code.
I am a novice developer and primarily an accessibility SME; I expect that when "detect languages" is on in the user's VoiceOver settings, the screen reader's voice will automatically switch to the correct language/accent. I recognize there is a problem but am not sure where the breakdown is. I would like guidance on how to fix it to relay to my teams.
https://developer.apple.com/documentation/objectivec/nsobject/1615192-accessibilitylanguage
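For reference, the per-element language tagging that the page above describes looks like this in UIKit (a minimal sketch; the values are illustrative). VoiceOver uses accessibilityLanguage to pick the voice for that element:

import UIKit

// Sketch: tag an element's spoken language so VoiceOver can switch voices.
let spanishLabel = UILabel()
spanishLabel.text = "Bienvenido a la aplicación"
spanishLabel.accessibilityLanguage = "es"   // BCP 47 language code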
I am trying to understand why I am seeing crash reports for my code that creates a view from a NIB, like the code below. The crash occurs when referencing contentView, which should have been bound when the NIB was loaded.
Am I missing something here, other than checking the result of the loadNibNamed call?
class MyView: NSView {
    @IBOutlet var contentView: NSView!

    init() {
        super.init(frame: NSRect(x: 0, y: 0, width: 84.0, height: 49.0))
        commonInit()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    private func commonInit() {
        Bundle.main.loadNibNamed("MyView", owner: self, topLevelObjects: nil)
        translatesAutoresizingMaskIntoConstraints = false
        contentView.translatesAutoresizingMaskIntoConstraints = false
        addSubview(contentView)
        contentView.frame = self.bounds
    }
}
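For comparison, a hedged sketch of a defensive variant (not a confirmed fix): it checks the Bool result that loadNibNamed returns on macOS and uses Bundle(for:), which matters if the view ships in a framework rather than the main app bundle:

private func commonInit() {
    // Bundle(for:) resolves the NIB relative to the class's own bundle.
    let bundle = Bundle(for: MyView.self)
    guard bundle.loadNibNamed("MyView", owner: self, topLevelObjects: nil),
          let contentView else {
        assertionFailure("MyView.nib failed to load or the contentView outlet is not connected")
        return
    }
    translatesAutoresizingMaskIntoConstraints = false
    contentView.translatesAutoresizingMaskIntoConstraints = false
    addSubview(contentView)
    contentView.frame = bounds
}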
Hello everyone,
I've built a @CurrentValue property wrapper that mimics the behavior of @Published, allowing a property to publish values on "did set". I've also created my own assign(to:) implementation that works with @CurrentValue properties, allowing values to be assigned from a publisher to a @CurrentValue property.
However, I'm running into an issue. When I use this property wrapper with two classes and the source class (providing the publisher) is not stored as a property, the subscription is deallocated, and values are no longer forwarded.
Here's the property wrapper code:
import Combine

@propertyWrapper
public struct CurrentValue<Value> {
    /// A publisher for properties marked with the `@CurrentValue` attribute.
    public struct Publisher: Combine.Publisher {
        public typealias Output = Value
        public typealias Failure = Never

        /// A subscription that forwards the values from the CurrentValueSubject to the downstream subscriber
        private class CurrentValueSubscription<S>: Subscription where S: Subscriber, S.Input == Output, S.Failure == Failure {
            private var subscriber: S?
            private var currentValueSubject: CurrentValueSubject<S.Input, S.Failure>?
            private var cancellable: AnyCancellable?

            init(subscriber: S, publisher: CurrentValue<Value>.Publisher) {
                self.subscriber = subscriber
                self.currentValueSubject = publisher.subject
            }

            func request(_ demand: Subscribers.Demand) {
                var demand = demand
                cancellable = currentValueSubject?.sink { [weak self] value in
                    // We'll continue to emit new values as long as there's demand
                    if let subscriber = self?.subscriber, demand > 0 {
                        demand -= 1
                        demand += subscriber.receive(value)
                    } else {
                        // If we have no demand, we'll cancel our subscription:
                        self?.subscriber?.receive(completion: .finished)
                        self?.cancel()
                    }
                }
            }

            func cancel() {
                cancellable = nil
                subscriber = nil
                currentValueSubject = nil
            }
        }

        /// A subscription store that holds a reference to all the assign subscribers so we can cancel them when self is deallocated
        fileprivate final class AssignSubscriptionStore {
            fileprivate var cancellables: Set<AnyCancellable> = []
        }

        fileprivate let subject: CurrentValueSubject<Value, Never>
        fileprivate let assignSubscriptionStore: AssignSubscriptionStore = .init()

        fileprivate var value: Value {
            get {
                subject.value
            }
            nonmutating set {
                subject.value = newValue
            }
        }

        init(_ initialValue: Output) {
            self.subject = .init(initialValue)
        }

        public func receive<S>(subscriber: S) where S: Subscriber, Failure == S.Failure, Output == S.Input {
            let subscription = CurrentValueSubscription(subscriber: subscriber, publisher: self)
            subscriber.receive(subscription: subscription)
        }
    }
    public var wrappedValue: Value {
        get { publisher.value }
        nonmutating set { publisher.value = newValue }
    }

    public var projectedValue: Publisher {
        get {
            publisher
        }
        mutating set {
            publisher = newValue
        }
    }

    private var publisher: Publisher

    public init(wrappedValue: Value) {
        publisher = .init(wrappedValue)
    }
}
/// A subscriber that receives values from an upstream publisher and assigns them to a downstream CurrentValue property.
private final class AssignSubscriber<Input>: Subscriber, Cancellable {
    typealias Failure = Never

    private var receivingSubject: CurrentValueSubject<Input, Never>?
    private weak var assignSubscriberStore: CurrentValue<Input>.Publisher.AssignSubscriptionStore?

    init(currentValue: CurrentValue<Input>.Publisher) {
        self.receivingSubject = currentValue.subject
        self.assignSubscriberStore = currentValue.assignSubscriptionStore
    }

    func receive(subscription: Subscription) {
        // Hold a reference to the subscription in the downstream publisher
        // so when it deallocates, the subscription is automatically cancelled
        assignSubscriberStore?.cancellables.insert(AnyCancellable(subscription))
        subscription.request(.unlimited)
    }

    func receive(_ input: Input) -> Subscribers.Demand {
        receivingSubject?.value = input
        return .none
    }

    func receive(completion: Subscribers.Completion<Never>) {
        // Nothing to do here
    }

    public func cancel() {
        receivingSubject = nil
        assignSubscriberStore = nil
    }
}

public extension Publisher where Self.Failure == Never {
    /// Assigns the output of the upstream publisher to a downstream CurrentValue property
    /// - Parameter currentValue: The CurrentValue property to assign the values to
    func assign(to currentValue: inout CurrentValue<Self.Output>.Publisher) {
        let subscriber = AssignSubscriber(currentValue: currentValue)
        self.subscribe(subscriber)
    }
}
Here’s an example demonstrating the issue, where two classes are used: Source, which owns the @CurrentValue property, and Forwarder1, which subscribes to updates from Source:
final class Source {
    @CurrentValue public private(set) var value: Int = 1

    func update(value: Int) {
        self.value = value
    }
}

final class Forwarder1 {
    @CurrentValue public private(set) var value: Int

    init(source: Source) {
        self.value = source.value
        source.$value.dropFirst().assign(to: &$value)
        // The source is not stored as a property, so the subscription deallocates
    }

    func update(value: Int) {
        self.value = value
    }
}
With this setup, if source isn’t retained as a property in Forwarder1, the subscription is deallocated prematurely, and value in Forwarder1 stops receiving updates from Source.
However, this doesn’t happen with @Published properties in Combine. Even if source isn’t retained, @Published subscriptions seem to stay active, propagating values as expected.
My Questions:
What does Combine do internally with @Published properties that prevents the subscription from being deallocated prematurely, even if the publisher source isn’t retained as a property?
Is there a recommended approach to address this in my custom property wrapper to achieve similar behavior, ensuring the subscription isn’t lost?
Any insights into Combine’s internals or suggestions on how to resolve this would be greatly appreciated. Thank you!