Some Macs recently received a macOS system update which disabled the simulator runtimes used by Xcode 15, including the simulators for iOS, tvOS, watchOS, and visionOS. If your Mac received this update, you will receive the following error message and will be unable to use the simulator:
The com.apple.CoreSimulator.SimRuntime.iOS-17-2 simulator runtime is not available.
Domain: com.apple.CoreSimulator.SimError
Code: 401
Failure Reason: runtime profile not found using "System" match policy
Recovery Suggestion: Download the com.apple.CoreSimulator.SimRuntime.iOS-17-2 simulator runtime from the Xcode
To resume using the simulator, please reboot your Mac. After rebooting, check Xcode Preferences → Platforms to ensure that the simulator runtime you would like to use is still installed. If it is missing, use the Get button to download it again.
The Xcode 15.3 Release Notes are also updated with this information.
Tags: Xcode 16.2, Swift 6, macOS Sequoia 15.1, SwiftUI
I am a beginner.
If I understand correctly, we can add items to the default menu bar but not delete them. Am I right? If you do not need most of the default items, how do you deal with that?
I can add a button:
import SwiftUI

@main
struct NouMenuProvesApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            CommandMenu("Custom") {
                Button("Custom Action") {
                    print("Custom Action performed")
                }
            }
        }
    }
}
Can I delete a Menu Button or create a new simpler Menu Bar?
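For reference (not from the original post), SwiftUI does let you remove or replace the standard command groups with CommandGroup(replacing:). A minimal sketch, with the replaced groups chosen only as examples:
import SwiftUI

@main
struct NouMenuProvesApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
        .commands {
            // Replacing a default group with an empty body hides it.
            CommandGroup(replacing: .newItem) { }   // removes the "New" items in File
            CommandGroup(replacing: .help) { }      // removes the Help menu content
            CommandMenu("Custom") {
                Button("Custom Action") {
                    print("Custom Action performed")
                }
            }
        }
    }
}
Note that this replaces whole groups; as far as I know, the menu bar itself (including the application menu) cannot be removed entirely from a SwiftUI scene.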
Not a question, but a surprise.
Did I miss something? Apparently there has been no new beta release (16.3) since the release of 16.2 on Dec 11. Two months without betas is really unusual (in fact, it has never happened before; usually beta n+1 is released even before the final release of version n).
So does that mean 16.3 will be a major update? Wait and see.
Dear Apple Developer Forum,
I have a question regarding AVCaptureDevice on iOS. We're trying to capture photos in the best quality possible, along with depth data of the highest accuracy possible. We were delighted to see that AVCaptureDevice can be initialized with AVMediaType .depthData, which works as expected (depthData is part of the AVCapturePhoto). However, when setting AVMediaType to .video, we still receive depth data (of the same quality, according to our own internal tests), which confused us.
Mind you, we set the device format and depth format as well:
private func getDeviceFormat() throws -> AVCaptureDevice.Format {
    // Ensure a high-quality photo format that also supports depth data
    // and uses the expected pixel format.
    guard let format = camera?.formats.first(where: {
        $0.isHighPhotoQualitySupported &&
        !$0.supportedDepthDataFormats.isEmpty &&
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_420YpCbCr8BiPlanarFullRange
    }) else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    return format
}

private func getDepthDataFormat(for format: AVCaptureDevice.Format) throws -> AVCaptureDevice.Format {
    // Pick the 32-bit float depth format supported by the chosen video format.
    guard let depthDataFormat = format.supportedDepthDataFormats.first(where: {
        $0.formatDescription.mediaSubType.rawValue == kCVPixelFormatType_DepthFloat32
    }) else {
        throw CaptureDeviceError.necessaryFormatNotAvailable
    }
    return depthDataFormat
}
We're wondering what steps we can take to ensure the best-quality photo along with the most accurate depth data. Which properties matter most, which have an effect, and which don't? Are there ways to optimize our current configuration? We find it difficult because there are very limited guides and explanations on the media subtypes, for example kCVPixelFormatType_420YpCbCr8BiPlanarFullRange. Is it the best choice, particularly for our use case of a high-quality photo plus the most accurate depth data?
Important comment: Our App only runs on iPhone 14 Pro, iPhone 15 Pro, iPhone 16 Pro on the latest iOS versions.
We hope someone with deeper knowledge at Apple can guide us on how to get the best-quality photos and the most accurate depth data.
Thank you very much!
Kind regards.
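Not part of the original post, but for context, here is a rough sketch of the photo-output settings that are commonly combined with a depth-capable format; it assumes a photoOutput that is already attached to a configured capture session:
import AVFoundation

// Sketch: enable depth delivery and prioritize quality on the photo output.
func makePhotoSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()

    // Depth delivery must be enabled on the output before it can be
    // requested per photo.
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
        settings.isDepthDataDeliveryEnabled = true
        settings.embedsDepthDataInPhoto = true
    }

    // Prefer quality over speed for the still image.
    photoOutput.maxPhotoQualityPrioritization = .quality
    settings.photoQualityPrioritization = .quality

    return settings
}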
Latest version of Xcode 16.1.
I have an existing package dependency which is sitting on a git@ssh.dev.azure.com account.
Now, whenever I remove that package dependency, I can no longer add it back within the Xcode UI. There is simply no way to add it or find it in the Search or Enter Package URL field.
How on earth are we meant to add SSH packages now?
Anyone else have this issue? If so, have you found a work-around without having to manually edit the package dependencies in the project?
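One possible workaround (my suggestion, not from the post) is to declare the SSH dependency in a Package.swift manifest and let Xcode resolve it from there; the URL and product name below are placeholders:
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyApp",
    dependencies: [
        // SwiftPM accepts SSH URLs directly; resolution works as long as the
        // SSH key for ssh.dev.azure.com is configured on the machine.
        .package(url: "git@ssh.dev.azure.com:v3/MyOrg/MyProject/MyRepo", from: "1.0.0")
    ],
    targets: [
        .target(
            name: "MyApp",
            dependencies: [
                .product(name: "MyRepo", package: "MyRepo")
            ]
        )
    ]
)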
Hi,
Lately, I've been getting the same window each time I install an app for testing. Anyone else?
Hi all,
I'm facing an issue where I cannot build my app in Xcode 16:
Command PhaseScriptExecution failed with a nonzero exit code
Please help me fix this issue; is there any reference on how to resolve the error?
We have developed a custom iOS framework called PaySDK. Earlier we distributed the framework as PaySDK.xcframework.zip through GitHub (Private repo) with two dependent xcframeworks.
Now, one of our clients is asking us to distribute the framework through Swift Package Manager.
I have created a new private repo on GitHub and created a new release (iOSSDK_SPM_Test) with tag 1.0.0. I uploaded the frameworks below as assets, updated the download paths in Package.swift, and pushed to the GitHub main branch.
PaySDK.xcframework.zip
PaySDKDependentOne.xcframework.zip
PaySDKDependentTwo.xcframework.zip
When I try to integrate (for testing) https://github.com/YuvaRepo/iOSSDK_SPM_Test in Xcode, I am not able to download the frameworks; the download path points to an old path (maybe a cache: https://github.com/YuvaRepo/iOSSDK_SPM/releases/download/1.2.0/PaySDK.xcframework.zip).
Package.swift:
// swift-tools-version:5.3
import PackageDescription

let package = Package(
    name: "iOSSDK_SPM_Test",
    platforms: [
        .iOS(.v13)
    ],
    products: [
        // Products define the executables and libraries a package produces, making them visible to other packages.
        .library(
            name: "iOSSDK_SPM_Test",
            targets: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ],
    targets: [
        // Targets are the basic building blocks of a package, defining a module or a test suite.
        // Targets can depend on other targets in this package and products from dependencies.
        .binaryTarget(
            name: "PaySDK",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDK.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentOne",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentOne.xcframework.zip",
            checksum: " checksum "
        ),
        .binaryTarget(
            name: "PaySDKDependentTwo",
            url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test/releases/download/1.0.0/PaySDKDependentTwo.xcframework.zip",
            checksum: " checksum "
        ),
        .testTarget(
            name: "iOSSDK_SPM_TestTests",
            dependencies: ["PaySDK", "PaySDKDependentOne", "PaySDKDependentTwo"]
        )
    ]
)
Steps I have tried so far:
Removed the local repo and cloned it fresh.
rm -rf ~/Library/Caches/org.swift.swiftpm/
rm -rf ~/Library/Developer/Xcode/DerivedData/*
Can anyone help to identify the issue and resolve? Thanks in advance.
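Not from the post, but one thing worth checking (an assumption on my part): if the package was previously consumed from the old iOSSDK_SPM repo, the consuming project's Package.resolved may still pin that URL. A minimal consuming manifest that forces resolution of the new repo and tag might look like this:
// swift-tools-version:5.7
import PackageDescription

let package = Package(
    name: "ConsumerApp",
    platforms: [.iOS(.v13)],
    dependencies: [
        // Pin the new repository explicitly so the old cached URL is not reused.
        .package(url: "https://github.com/YuvaRepo/iOSSDK_SPM_Test", exact: "1.0.0")
    ],
    targets: [
        .target(
            name: "ConsumerApp",
            dependencies: [
                .product(name: "iOSSDK_SPM_Test", package: "iOSSDK_SPM_Test")
            ]
        )
    ]
)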
Hello everyone,
I'm experiencing an issue with my app, and I would greatly appreciate any guidance. Here's the situation:
My app works flawlessly on the simulator for all iPhone models (16, 16 Pro, 16 Pro Max, SE, etc.).
It runs without issues on physical iPhone 11 devices (both mine and a friend's).
However, when my friend installs the app from the App Store on their iPhone 15 Pro or iPhone 16, it crashes immediately upon opening.
Steps I’ve taken so far:
Verified that the app works on the simulator for the same models where it crashes in real life.
Updated the app's minimum deployment target to iOS 18, but this did not resolve the issue. The app still crashes on the physical iPhone 15 Pro and 16 models while continuing to run fine on previous physical devices and all simulated ones.
Additionally, I had another friend beta test the app through TestFlight on their iPhone 15 Pro, and they received an alert message indicating that the app had crashed when they tried to open it. This confirms the issue still exists after my attempted fixes.
I'm unsure what could be causing this discrepancy between the simulated and physical devices, or what the alert message in TestFlight might signify. Has anyone encountered a similar issue, or does anyone have suggestions on what I should do to fix this?
Thanks in advance for your help!
I could connect before updating macOS and Xcode.
Hi,
My env. is ..
Xcode: Version 16.2 (16C5032a)
macOS Sequoia: Version 15.1
I have two problems. Please give me some advice.
Problem 1: error messages.
When I run the automatically generated app as is, the following error (warning?) messages appear in the terminal:
Can't find or decode reasons
Failed to get or decode unavailable reasons
NSBundle file:///System/Library/PrivateFrameworks/MetalTools.framework/ principal class is nil because all fallbacks have failed
Problem 2: not running in the simulator.
The app does not run in the simulator; instead it appears as a standalone window. (The simulator works fine when launched separately, but the app from the current project doesn't show up in it.)
Hello,
I am getting an EXC_BREAKPOINT crash and I'm trying to figure out how to debug it.
Some background on func getDecodeResults(noOfDecodes:), which the crash log points to:
I believe this crash might be related to the unsafe pointers I'm using for communication between our C code and Swift. Below are the relevant implementation details:
I create an UnsafeMutablePointer for a struct type DECODE_RESULTS, defined internally.
The Swift code allocates fixed memory for this pointer, which is then passed to a C library for initialization and modification.
In the getDecodeResults function (where the crash log points), the first step is converting a C array starting at the UnsafeMutablePointer into a Swift array.
Here’s a snippet for context:
// In class initialization:
private var decodeResults: UnsafeMutablePointer<DECODE_RESULTS>
decodeResults = UnsafeMutablePointer<DECODE_RESULTS>.allocate(capacity: maxNumberOfBarcodes)
// Initialize decodeResults in the C library
// In getDecodeResults function:
let results = Array(UnsafeBufferPointer(start: decodeResults, count: Int(noOfDecodes)))
Crash Log:
Thread 6 name: Dispatch queue: com.padlocdocks.padlocscan.PLSBarcodeScanner._scanQueue
Thread 6 Crashed:
0 AilaDecoder 0x101d995a8 DecoderManager.getDecodeResults(noOfDecodes:) (in AilaDecoder) (DecoderManager.swift:0) + 38312
I'm unsure if (DecoderManager.swift:0) indicates the root cause or if it’s an erroneous reference.
01 AilaDemo-2025-01-17-041022_symbolicated.txt
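A small sketch of a defensive check (my assumption, not from the post; the signature here is a guess): if noOfDecodes ever exceeds the allocated capacity, building the buffer reads past the allocation, which could surface as EXC_BREAKPOINT.
// In getDecodeResults(noOfDecodes:), clamp the count to what was actually
// allocated before materializing the Swift array.
func getDecodeResults(noOfDecodes: Int32) -> [DECODE_RESULTS] {
    let count = min(Int(noOfDecodes), maxNumberOfBarcodes)
    guard count > 0 else { return [] }
    return Array(UnsafeBufferPointer(start: decodeResults, count: count))
}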
I discovered a disk space issue with Xcode on my 1 TB hard disk. Xcode doesn't delete old or unused simulators, so they pile up and consume valuable space. Deleting them from the Window > Devices and Simulators window didn't work, and neither did unmounting them or finding and deleting them with Disk Utility; they persisted.
To regain the disk space, I completely removed Xcode and the Developer folder. After reinstalling Xcode, I got back about 80% of my disk space.
For those unable or unwilling to do this, the Mac App Store offers DevCleaner, a utility that removes old Xcode Cache and Simulators.
Thanks. I hope I've helped someone else.
Dan Uff
With Xcode 16.2 I am unable to develop a widget with dynamic options, no matter whether I use SiriIntent or AppIntent.
I have tried starting a completely new project and adding a widget target with the App Intent checkbox checked. With zero code changed, I press Command-R and the WidgetKit Simulator always presents a CHSErrorDomain error 1103. If I try to add the widget directly from the desktop, the dynamic options are available to select, but the widget doesn't seem to load successfully; it is stuck the same way in the WidgetKit Simulator.
I have also tried starting a new project on my other MacBook, but no luck; the error shows up every time. I'm totally stuck here. Is anybody else having this issue?
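For reference, a minimal sketch of the kind of AppIntent-configurable widget the template produces (all names here are illustrative, not from the post):
import WidgetKit
import SwiftUI
import AppIntents

// An intent that supplies the widget's dynamic option.
struct SelectLabelIntent: WidgetConfigurationIntent {
    static var title: LocalizedStringResource = "Select Label"

    @Parameter(title: "Label", default: "Hello")
    var label: String
}

struct LabelEntry: TimelineEntry {
    let date: Date
    let label: String
}

struct LabelProvider: AppIntentTimelineProvider {
    func placeholder(in context: Context) -> LabelEntry {
        LabelEntry(date: .now, label: "Hello")
    }

    func snapshot(for configuration: SelectLabelIntent, in context: Context) async -> LabelEntry {
        LabelEntry(date: .now, label: configuration.label)
    }

    func timeline(for configuration: SelectLabelIntent, in context: Context) async -> Timeline<LabelEntry> {
        Timeline(entries: [LabelEntry(date: .now, label: configuration.label)], policy: .never)
    }
}

struct LabelWidget: Widget {
    var body: some WidgetConfiguration {
        AppIntentConfiguration(kind: "LabelWidget",
                               intent: SelectLabelIntent.self,
                               provider: LabelProvider()) { entry in
            Text(entry.label)
        }
    }
}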
PLATFORM AND VERSION
Vision OS
Development environment: Xcode 16.2, macOS 15.2
Run-time configuration: visionOS 2.3 (On Real Device, Not simulator)
Please someone confirm I'm not crazy and this issue is actually out of my control.
I spent hours trying to fix my app and running profiles because I thought it was an issue related to my app's performance. I finally considered the chance that it was an issue with the API itself and made a sample app to isolate the problem, and the issue still existed there.
The issue: when a model entity moves around in a Full Space that was launched while the system environment immersion was turned up, the entity looks very choppy as it moves. If you take off the headset while still in the space and put it back on, this fixes it and the entity then moves smoothly as it should. Alternatively, you can leave the space and turn the system environment immersion all the way down before launching the Full Space again; this also makes the entity move smoothly. If you launch with a mixed immersion style instead of a full immersion style, the issue never arises. The issue only arises if you launch the space with either a full or progressive style while the system immersion level is turned up.
STEPS TO REPRODUCE
https://github.com/nathan-707/ChoppyEntitySample
Open my test project; it's a small, modified visionOS project template that shows the issue clearly.
Otherwise:
Create an immersive space with either the full or progressive immersion style.
Set up an entity in kinematic mode and apply a velocity to it so it passes over your head when the space appears (see the sketch after this list).
If you opened the space while the Apple Vision Pro's system environment immersion was turned up, the entity will look choppy.
If you take the headset off while in the same space and put it back on, the issue is fixed and the motion looks smooth.
Alternatively, if you open the space with the system environment immersion all the way down, you will not run into the issue. Again, the issue also does not happen if the space is launched in the mixed style.
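A rough sketch of the kinematic setup described in the steps above (the mesh, starting position, and velocity values are my own placeholders):
import RealityKit

// Build a simple kinematic entity and launch it over the viewer's head
// when the immersive space content is set up.
func makeFlyingEntity() -> ModelEntity {
    let entity = ModelEntity(mesh: .generateSphere(radius: 0.2),
                             materials: [SimpleMaterial(color: .white, isMetallic: false)])
    entity.position = [0, 1, -2]  // start in front of and above the viewer

    // Kinematic bodies are moved by velocities rather than forces.
    entity.components.set(PhysicsBodyComponent(massProperties: .default,
                                               material: .default,
                                               mode: .kinematic))
    // Arc up and back over the viewer's head.
    entity.components.set(PhysicsMotionComponent(linearVelocity: [0, 0.5, 2]))
    return entity
}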
I am trying to set up an extension using NEDNSProxyProvider that intercepts DNS traffic over UDP, inserts our custom device identifier, and sends the query to our custom DNS server, whose response I then forward to the requesting client.
I have been able to append the identifier to the domain name when sending the request to our custom DNS server, and I get the response back just fine, but when I try to write the response to the UDP flow I get this error in the Console logs:
Error Domain=NEAppProxyFlowErrorDomain Code=9 "The datagram was too large" UserInfo={NSLocalizedDescription=The datagram was too large}
Here is what I have tried so far.
Truncating the datagram size to less than 10 bytes.
Sending in dummy Data object while trying to write to the flow.
Double checking the Signing and Capabilities, for Targets, the App and Network Extension.
Attached below is code from my NEDNSProxyProvider. The DNS request is processed in the handleNewFlow function, which calls processUDPFlow.
override func handleNewFlow(_ flow: NEAppProxyFlow) -> Bool {
    if flow is NEAppProxyTCPFlow {
        NSLog("BDDNSProxyProvider : Is TCP Flow...")
    } else if let udpFlow = flow as? NEAppProxyUDPFlow {
        NSLog("BDDNSProxyProvider: handleNewFlow : \(udpFlow)")
        processUDPFlow(udpFlow) // < --
    }
    return true
}
In the code below I concatenate the domain name in the request with the device ID and send it to our server. I've left the logging lines in; please ignore them.
private func processUDPFlow(_ udpFlow: NEAppProxyUDPFlow) {
    // Read incoming DNS packets from the client
    self.udpAppProxyFlow = udpFlow
    udpFlow.readDatagrams { datagrams, error in
        if let error = error {
            NSLog("Error reading datagrams: \(error.localizedDescription)")
            return
        }
        guard let datagrams = datagrams else {
            NSLog("No datagrams received.")
            return
        }
        // Forward each DNS packet to the custom DNS server
        for (index, packet) in datagrams.enumerated() {
            let dnsMessage = self.parseDNSMessage(from: packet.0)
            NSLog("tDatagram Header: \(dnsMessage.header)")
            for question in dnsMessage.questions {
                NSLog("tDatagram Question: \(question.name), Type: \(question.type), Class: \(question.klass)")
            }
            for answer in dnsMessage.answers {
                NSLog("tDatagram Answer: \(answer.name), Type: \(answer.type), Data: \(answer.data)")
            }
            let oldDomain = self.extractDomainName(from: packet.0)!
            let packetWithNewDomain = self.replaceDomainName(in: packet.0, with: "827-\(oldDomain)") // func to append device ID
            NSLog("Packet's new domain \(self.extractDomainName(from: packetWithNewDomain ?? packet.0) ?? "Found nil")")
            self.sendToCustomDNSServer(packetWithNewDomain!) { responseDatagram in
                guard let responseDatagram = responseDatagram else {
                    NSLog("Failed to get a response from the custom DNS server")
                    return
                }
                let tDatagram = (responseDatagram, packet.1)
                udpFlow.writeDatagrams([tDatagram]) { error in
                    if let error = error {
                        NSLog("Failed to write DNS response back to client: \(error)")
                    } else {
                        NSLog("Successfully wrote DNS response back to client.")
                    }
                }
            }
        }
        // Continue Reading Datagrams - DO NOT REMOVE!
        self.processUDPFlow(udpFlow)
    }
}
Following is an excerpt of the function I use to replace the domain name:
// Ensure the datagram is at least the size of the DNS header
guard datagram.count > 12 else {
    NSLog("Error : Invalid datagram: Too small to contain a DNS header")
    return nil
}
NSLog("BDLine 193")

// Start reading after the header (12 bytes)
var offset = 12

// Parse the original domain name
while offset < datagram.count {
    let length = Int(datagram[offset]) // Get the length of the next label
    offset += 1

    // Check for the null terminator (end of domain name)
    if length == 0 {
        // Domain name ends here
        break
    }

    // Validate that the length is within bounds
    guard offset + length <= datagram.count else {
        NSLog("Error : Invalid datagram: Domain name length exceeds packet size")
        return nil
    }

    // Skip over this label
    offset += length
}
Everything is falling into place other than this last error I get when I try to write back to the flow. What am I missing here, and how can I resolve this issue?
Any help would be appreciated.
Thanks
I want to set the minimum deployment target to 16.0; however, Xcode (16.2) won't let me select that.
In the drop-down box it shows 18, 17, 16, and 15; however, if any of these is selected, it is set as 18.6, 17.6, 16.6, or 15.6 (see image).
If I try to edit the value manually to 16.0, then after changing it, Xcode just deletes the value and sets it to nothing.
What's going on here? Why is Xcode only allowing versions ending in .6, and why won't it let you edit the value manually?
Xcode 16.2 suddenly injects the UIRequiredDeviceCapabilities key into our app's Info.plist. This results in a rejection from App Store Connect because of this key:
ITMS-90109: This bundle is invalid - The key UIRequiredDeviceCapabilities in the Info.plist may not contain values that would prevent this application from running on devices that were supported by previous versions. Refer to QA1623 for additional information: https://developer.apple.com/library/ios/#qa/qa1623/_index.html
The setting INFOPLIST_KEY_UIRequiredDeviceCapabilities is empty in our project, yet Xcode still injects:
<key>UIRequiredDeviceCapabilities</key>
<array>
<string>arm64</string>
</array>
How can we prevent that? Or is there a way to strip it out during the build process? The Info.plist seems to get generated during an invisible build step; is there a way to make this explicit?
I recently upgraded to macOS 14.7.2, triggering an upgrade to Xcode 16.2 and its command line tools. My open source project builds and runs fine in Xcode. But when I tested the CMake build, it failed looking for standard c++ headers.
I asked on Stack Overflow, but no solutions so far.
Any thoughts here?
We are building an app which can read text. It can read English and normal (horizontal) Japanese text successfully. But in some cases we need to read Japanese tategaki (vertically aligned text), and in those cases the same code gives no output. Do we need to change any configuration to read Japanese tategaki? Is it actually possible to read Japanese tategaki using the Vision framework?
lazy var detectTextRequest = VNRecognizeTextRequest { request, error in
    self.resStr = "\n"
    self.words = [:]

    // Get OCR result
    guard let res = request.results as? [VNRecognizedTextObservation] else { return }

    // Separate the words by spaces
    let text = res.compactMap({ $0.topCandidates(1).first?.string }).joined(separator: " ")

    var n = 0
    self.wordArr = [[]]
    self.xs = 1
    self.ys = 1
    var hs = 0.0 // To compare the heights of the words

    // To get the original axis (top-most word's axis), only once
    for r in res {
        let word = r.topCandidates(1).first?.string
        self.words[word ?? ""] = [r.topLeft.x, r.topLeft.y]
        if self.cartLabelType == 1 {
            if (word?.components(separatedBy: CharacterSet(charactersIn: "//")).count ?? 0) > 2 {
                self.xs = r.topLeft.x
                self.ys = r.topLeft.y
            }
        }
    }
}
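Not from the original post: a minimal configuration sketch that explicitly requests accurate Japanese recognition. Whether Vision handles tategaki at all is exactly the open question above; this only shows the relevant knobs.
import Vision

// Sketch: ask Vision explicitly for accurate Japanese recognition.
let request = VNRecognizeTextRequest { request, error in
    guard let observations = request.results as? [VNRecognizedTextObservation] else { return }
    for observation in observations {
        print(observation.topCandidates(1).first?.string ?? "")
    }
}
request.recognitionLevel = .accurate
request.recognitionLanguages = ["ja-JP"]
request.usesLanguageCorrection = true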
I’m working on a project in Xcode 16.2 and encountered an issue where getAPI() with a default implementation in a protocol extension doesn’t show up in autocomplete. Here’s a simplified version of the code:
import Foundation

public protocol Repository {
    func getAPI(from url: String?)
}

extension Repository {
    public func getAPI(from url: String? = "https://...") {
        getAPI(from: url)
    }
}

final class _Repository: Repository {
    func getAPI(from url: String?) {
        // Task...
    }
}

let repo: Repository = _Repository()
repo.getAPI( // Autocomplete doesn't suggest getAPI()
I’ve tried the following without success:
• Clean build folder
• Restart Xcode
• Reindexing
Is there something wrong with the code, or is this a known issue with Xcode 16.2? I’d appreciate any insights or suggestions.
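For what it's worth, a quick check (my own sketch, not from the post): even when autocomplete doesn't offer the defaulted overload, the call should still compile, since the default argument lives in the protocol extension.
// Both of these compile against the protocol existential:
repo.getAPI()             // uses the default URL from the extension
repo.getAPI(from: nil)    // calls the protocol requirement directly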