Explore the core architecture of the operating system, including the kernel, memory management, and process scheduling.


Issue with IOServiceOpen
I have an app that controls features of a device with a DriverKit driver, and I am having trouble creating a connection to a particular device. The sample code from "Communicating between a DriverKit extension and a client app" shows:

ret = IOServiceGetMatchingServices(kIOMainPortDefault, IOServiceNameMatching(kDextIdentifier), &iterator);

I cannot use kDextIdentifier; I need to find a service with a certain BSD name. So instead I try:

ret = IOServiceGetMatchingServices(kIOMainPortDefault, IOBSDNameMatching(kIOMasterPortDefault, NULL, interface), &iterator);

In each case the call completes successfully and returns an iterator. I can also use IOServiceGetMatchingService with IOBSDNameMatching, and that completes successfully as well. When I call IOServiceOpen on the service from the first approach, the connection is created correctly. However, I have four of these devices in the machine, and I need to select the service, and subsequently the connection, for a specific BSD name. When I call IOServiceOpen on the service from the second or third approach, it fails with error 0x2c7, which is "unsupported". Is there an entitlement I need to make this work?
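For reference, here is a minimal sketch (not from the sample) of one way to combine the two approaches: match the dext by name as the sample does, then filter the matched services by a BSD-name property before calling IOServiceOpen. The service name, the property key, and the recursive registry search are assumptions for illustration, not taken from the sample project.

import IOKit

// Hypothetical helper: find the dext service whose registry subtree publishes
// the given BSD name, then open a user client on that specific service.
func openConnection(forBSDName bsdName: String) -> io_connect_t? {
    let dextServiceName = "MyDriver"              // placeholder; the sample uses its kDextIdentifier constant
    let bsdNameKey = "BSD Name" as CFString       // kIOBSDNameKey; assumed to be published below the dext service

    var iterator: io_iterator_t = 0
    let matching = IOServiceNameMatching(dextServiceName)
    guard IOServiceGetMatchingServices(kIOMainPortDefault, matching, &iterator) == kIOReturnSuccess else {
        return nil
    }
    defer { IOObjectRelease(iterator) }

    var service = IOIteratorNext(iterator)
    while service != 0 {
        defer { IOObjectRelease(service); service = IOIteratorNext(iterator) }
        // Search this service's subtree for the BSD name it publishes.
        let name = IORegistryEntrySearchCFProperty(service, kIOServicePlane, bsdNameKey,
                                                   kCFAllocatorDefault,
                                                   IOOptionBits(kIORegistryIterateRecursively)) as? String
        if name == bsdName {
            var connect: io_connect_t = 0
            if IOServiceOpen(service, mach_task_self_, 0, &connect) == kIOReturnSuccess {
                return connect
            }
        }
    }
    return nil
}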
Replies: 1 · Boosts: 0 · Views: 220 · Sep ’24
iOS 18 crash
Bootstrapping failed for <FBApplicationProcess: 0x52ed3f180; app:> with error: <NSError: 0x3020c42a0; domain: RBSRequestErrorDomain; code: 5; "Launch failed."> { NSUnderlyingError = <NSError: 0x3020c4630; domain: NSPOSIXErrorDomain; code: 85> { NSLocalizedDescription = Launchd job spawn failed; }; }
Replies: 1 · Boosts: 0 · Views: 231 · Sep ’24
Message filter extension doesn't run on iOS 18
If I run an app with a message filter extension on iOS versions earlier than 18, everything works as expected; if I run the same app, without any changes, on iOS 18, it doesn't work. I've discovered that problems occur if the extension contains the following code:

extension MessageFilterExtension: ILMessageFilterQueryHandling, ILMessageFilterCapabilitiesQueryHandling {
    func handle(_ capabilitiesQueryRequest: ILMessageFilterCapabilitiesQueryRequest, context: ILMessageFilterExtensionContext, completion: @escaping (ILMessageFilterCapabilitiesQueryResponse) -> Void) {
        let response = ILMessageFilterCapabilitiesQueryResponse()
        response.transactionalSubActions = [.transactionalCarrier, .transactionalHealth, .transactionalPublicServices, .transactionalFinance, .transactionalWeather, .transactionalRewards, .transactionalOrders, .transactionalOthers, .transactionalReminders]
        // Note: this second assignment overrides the first.
        response.transactionalSubActions = [.transactionalFinance, .transactionalOrders, .transactionalHealth]
        completion(response)
    }
}

This code doesn't run on iOS 18; however, the following code does:

        let response = ILMessageFilterCapabilitiesQueryResponse()
        completion(response)

I downloaded several apps from the App Store that provide message filtering. Within the Messages app they all had one thing in common: on iOS versions earlier than 18 they all show 12 filtering categories, but on iOS 18 they only show 2. So it seems the issue is endemic and affects other apps, not just mine.
Replies: 3 · Boosts: 1 · Views: 333 · Sep ’24
CoreHID HIDVirtualDevice Returning Nil
I'm trying to follow the guide for creating a virtual device here. My goal was to see the set-report handler print the data (and to eventually create a virtual gamepad and dispatch input reports of button presses). I am on macOS 15.0 and I have CoreHID.framework linked (not embedded). However, when using the sample code below, HIDVirtualDevice returns nil.

import Foundation
import CoreHID

// This describes a keyboard device according to the Human Interface Devices standard.
let keyboardDescriptor: Data = Data([0x05, 0x01, 0x09, 0x06, 0xA1, 0x01, 0x05, 0x07, 0x19, 0xE0, 0x29, 0xE7, 0x15, 0x00, 0x25, 0x01, 0x75, 0x01, 0x95, 0x08, 0x81, 0x02, 0x95, 0x01, 0x75, 0x08, 0x81, 0x01, 0x05, 0x08, 0x19, 0x01, 0x29, 0x05, 0x95, 0x05, 0x75, 0x01, 0x91, 0x02, 0x95, 0x01, 0x75, 0x03, 0x91, 0x01, 0x05, 0x07, 0x19, 0x00, 0x2A, 0xFF, 0x00, 0x95, 0x05, 0x75, 0x08, 0x15, 0x00, 0x26, 0xFF, 0x00, 0x81, 0x00, 0x05, 0xFF, 0x09, 0x03, 0x75, 0x08, 0x95, 0x01, 0x81, 0x02, 0xC0])

let properties = HIDVirtualDevice.Properties(descriptor: keyboardDescriptor, vendorID: 1)
let device = HIDVirtualDevice(properties: properties)

final class Delegate: HIDVirtualDeviceDelegate {
    // A handler for system requests to send data to the device.
    func hidVirtualDevice(_ device: HIDVirtualDevice, receivedSetReportRequestOfType type: HIDReportType, id: HIDReportID?, data: Data) throws {
        print("Device received a set report request for report type:\(type) id:\(String(describing: id)) with data:[\(data.map { String(format: "%02x", $0) }.joined(separator: " "))]")
    }

    // A handler for system requests to query data from the device.
    func hidVirtualDevice(_ device: HIDVirtualDevice, receivedGetReportRequestOfType type: HIDReportType, id: HIDReportID?, maxSize: size_t) throws -> Data {
        print("Device received a get report request for report type:\(type) id:\(String(describing: id))")
        assert(maxSize >= 4)
        return Data([1, 2, 3, 4])
    }
}

if device != nil {
    print("Device is ready")
    await device?.activate(delegate: Delegate())
    try await device?.dispatchInputReport(data: Data([5, 6, 7, 8]), timestamp: SuspendingClock.now)
} else {
    print("Device not created")
}
Replies: 4 · Boosts: 0 · Views: 213 · Sep ’24
Sequoia - incorrect alert about "app wants to access data from other apps"
We have a sync solution with two apps that belong to the same application group. I use an NSUserDefaults object created with initWithSuiteName: to share configuration data between the two. That has not been a problem in the past. However, in the Sequoia beta 7 (24A5327a), when I read or write to the NSUserDefaults from either app, the "… would like to access data from other apps" dialog appears. This occurs on every restart on our test systems, and even more often for some of the customers testing it (perhaps with older betas). The warning is not indicative of what I'm doing, and the fact that it comes up on every single launch will freak out customers. There was a similar issue opened in beta 4 that was marked fixed in beta 5: https://developer.apple.com/forums/thread/760072?answerId=799001022#799001022 I posted this in the beta feedback forums before GA but got no response.
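For reference, a minimal sketch of the shared-defaults setup described above; the group identifier is a placeholder and must appear in both apps' com.apple.security.application-groups entitlement.

import Foundation

// Hypothetical app-group identifier shared by both apps.
let suiteName = "group.com.example.shared"

if let sharedDefaults = UserDefaults(suiteName: suiteName) {
    // Writing from one app...
    sharedDefaults.set(true, forKey: "syncEnabled")
    // ...and reading from the other.
    let enabled = sharedDefaults.bool(forKey: "syncEnabled")
    print("syncEnabled =", enabled)
}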
Replies: 5 · Boosts: 1 · Views: 292 · Sep ’24
iOS 18 Different Behavior in CallKit with Dynamic Island Phones
We’ve encountered significant differences in our VoIP app’s workflow between iOS 17 and iOS 18, particularly on devices with Dynamic Island. Our app functioned as expected in iOS 17, and even in iOS 18 on devices without Dynamic Island, but with the public release of iOS 18 on Dynamic Island-supported iPhones, we’re receiving a lot of feedback from users reporting confusing behavior during both incoming and outgoing calls. Here’s a breakdown of the issue:

In iOS 17 and iOS 18 (on non-Dynamic Island models), when our app initiates a second call, the app remains in the foreground, allowing the user to see both calls within our UI. Users can manage their calls, swap between them, and even merge them, all without leaving the app. If the user switches to the CallKit controller via the app switcher, it reflects the state of the calls in our app, which has worked well for our users.

However, on iOS 18 devices with Dynamic Island, adding a second call now backgrounds our app and brings the CallKit controller UI to the foreground. This causes confusion, as users lose the context of managing multiple calls within our app. Though users can switch back to our app manually, this extra step disrupts their workflow, especially when handling multiple active calls. This change doesn't occur on iOS 18 devices without Dynamic Island.

Additionally, during an active call, if an incoming call arrives, the Dynamic Island notification takes over, and the app UI is again backgrounded in favor of the CallKit controller screen. Even when declining the incoming call from the notification, the user remains in the CallKit controller UI, requiring another step to return to our app using the app switcher or the "app" button in CallKit.

This inconsistent behavior makes call management unnecessarily complex for our users, who need a streamlined experience, especially since our app is used in healthcare communications that are often critical in nature. Is there any documentation on changes to CallKit behavior specific to Dynamic Island devices in iOS 18? We’d appreciate any guidance on designing a VoIP experience that doesn’t involve these disruptive view switches, particularly for Dynamic Island-supported iPhones. Thank you for your help!
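For context, a minimal sketch (not taken from the post) of how the second outgoing call in the scenario above is typically requested through CallKit; the handle value and UUIDs are placeholders.

import CallKit

let callController = CXCallController()

// Hold the active call and start a second outgoing call in one transaction.
func startSecondCall(holding activeCallUUID: UUID, dialing number: String) {
    let hold = CXSetHeldCallAction(call: activeCallUUID, onHold: true)
    let start = CXStartCallAction(call: UUID(),
                                  handle: CXHandle(type: .phoneNumber, value: number))
    let transaction = CXTransaction(actions: [hold, start])
    callController.request(transaction) { error in
        if let error = error {
            print("Transaction failed: \(error)")
        }
    }
}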
Replies: 2 · Boosts: 0 · Views: 434 · Sep ’24
API to fetch DHCPv6 information
I am trying to fetch the DHCP server identifier of the current network. For IPv4 I can get this information by calling SCDynamicStoreCopyDHCPInfo and then fetching option 54 with DHCPInfoGetOptionData. I am trying to do the same thing for IPv6; in scutil I do see DHCPv6 information present. Is there any API that fetches this information for DHCPv6 servers, or do I have to get it directly from scutil?
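For reference, a minimal sketch of the IPv4 path described in the post, using the SystemConfiguration DHCP helpers to read option 54 for the primary service.

import SystemConfiguration

// Copies the current DHCP (IPv4) information for the primary service and
// extracts option 54, the DHCP server identifier, as dotted-decimal text.
if let info = SCDynamicStoreCopyDHCPInfo(nil, nil),
   let optionData = DHCPInfoGetOptionData(info, 54) {
    let serverID = (optionData as Data).map { String($0) }.joined(separator: ".")
    print("DHCPv4 server identifier: \(serverID)")
}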
Replies: 5 · Boosts: 0 · Views: 292 · Sep ’24
Need Help! ANCS Client
Hi, I am looking for examples where a BLE device (ESP32 or Arduino) acts as a BLE client and subscribes to the iPhone's ANCS service. Unfortunately my code is unable to find my iPhone even after scanning for nearby BLE devices. ANCS documentation (not very helpful): https://developer.apple.com/library/archive/documentation/CoreBluetooth/Reference/AppleNotificationCenterServiceSpecification/Specification/Specification.html I am able to create a BLE device acting as a server, but what I'm looking for is to subscribe to incoming call notifications. Please help if you have done this before. Thanks
Replies: 2 · Boosts: 0 · Views: 215 · Sep ’24
Creating file bookmarks doesn't work anymore on macOS 15 Sequoia
Before updating to macOS 15 Sequoia, I used to be able to create file bookmarks with this code:

let openPanel = NSOpenPanel()
openPanel.runModal()
let url = openPanel.urls[0]
do {
    let _ = try url.bookmarkData(options: [.withSecurityScope])
} catch {
    print(error)
}

Now I get an error:

Error Domain=NSCocoaErrorDomain Code=256 "Failed to retrieve app-scope key"

These are the entitlements:

com.apple.security.app-sandbox
com.apple.security.files.user-selected.read-write
com.apple.security.files.bookmarks.app-scope

Strangely, my own apps continued working after updating to macOS 15 some days ago, until a few moments ago. Then, seemingly all of a sudden, my existing bookmarks couldn't be resolved anymore, and no new bookmarks could be created. What could be the problem?
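For context, a minimal sketch of how such a security-scoped bookmark is later resolved and used, assuming the bookmark data created above was persisted somewhere; error handling is reduced to a thrown CocoaError.

import Foundation

// Resolve a previously stored security-scoped bookmark and start accessing it.
func resolve(bookmarkData: Data) throws -> URL {
    var isStale = false
    let url = try URL(resolvingBookmarkData: bookmarkData,
                      options: [.withSecurityScope],
                      relativeTo: nil,
                      bookmarkDataIsStale: &isStale)
    guard url.startAccessingSecurityScopedResource() else {
        throw CocoaError(.fileReadNoPermission)
    }
    // Call url.stopAccessingSecurityScopedResource() when finished with the file.
    return url
}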
Replies: 16 · Boosts: 3 · Views: 1.4k · Sep ’24
scenePhase change on display cadence change
My Watch app is a workout app that has to be careful to minimize CPU load when backgrounded, to get below the 15% avg/60 s limit and avoid being terminated by the system. Accordingly I have logic to detect a transition to the background using a .onChange(of: scenePhase) { phase in… } handler, which works well for the cases where the app is backgrounded - it is able to reduce its CPU load to just 6-9% while backgrounded.

But a special case of the scenePhase change presents a challenge: when an overlay such as Control Center or a long-look notification presents, there is no transition to .background, only to .inactive. This is a problem because the system nevertheless imposes the background CPU limit on the app while it’s covered by the overlay, even though the app didn’t get the .background transition in the scenePhase handler.

To further complicate matters, it now seems that with watchOS 11, whenever the display cadence changes from .live to .seconds, the scenePhase handler is called with a transition to .inactive. That is a problem because this transition has to be distinguished from the overlay case - the cadence change has no CPU-usage implications as long as the app remains foregrounded, but the overlay presentation does, and the system will terminate the app if it exceeds 15% avg/60 s.

My question: how can I distinguish between the two causes of the transition to .inactive: 1) when an overlay presents, and 2) when the display shifts to the slow cadence?
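For illustration, a minimal sketch of the scenePhase handler described above, with a hypothetical heuristic for the .inactive case; whether WKApplication's applicationState can reliably distinguish an overlay from a cadence change is an assumption to verify, not documented behavior, and WorkoutView, reduceCPULoad, and resumeNormalOperation are placeholders.

import SwiftUI
import WatchKit

struct WorkoutRootView: View {
    @Environment(\.scenePhase) private var scenePhase

    var body: some View {
        WorkoutView() // placeholder content view
            .onChange(of: scenePhase) { phase in
                switch phase {
                case .background:
                    reduceCPULoad()           // definitely backgrounded
                case .inactive:
                    // Hypothetical heuristic: consult the application state to
                    // guess whether an overlay is covering the app.
                    if WKApplication.shared().applicationState != .active {
                        reduceCPULoad()
                    }
                case .active:
                    resumeNormalOperation()
                @unknown default:
                    break
                }
            }
    }

    private func reduceCPULoad() { /* throttle timers, sensors, UI updates */ }
    private func resumeNormalOperation() { /* restore the full update rate */ }
}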
Replies: 0 · Boosts: 0 · Views: 194 · Sep ’24
Problems with SensorKit data calls
The delegate's didFetchResult method is not called when fetching data older than 24 hours from SensorKit. I have confirmed that full access to SensorKit has been granted and that ambient light values are being recorded under the device's privacy settings (Research Sensor & Usage Data); it is also possible to export them to an lz4 file. I want the data that is older than 24 hours delivered to the app, but while other delegate methods are called, the one that delivers the ambient light values never is. Is it correct that only data older than 24 hours can be fetched after startRecording() is called? If so, in order to receive that data, do I have to keep recording the ambient light values in the background for more than 24 hours before the samples become available?

import Foundation
import SensorKit
import UIKit

final class SensorKitManager: NSObject, ObservableObject, SRSensorReaderDelegate {
    static let shared = SensorKitManager()

    private let ambientReader = SRSensorReader(sensor: .ambientLightSensor)
    var availableDevices: [SRDevice] = []
    @Published var ambientLightData: [AmbientLightDataPoint] = []
    var isFetching = false
    var isRecordingAmbientLight = false

    private override init() {
        super.init()
        setupReaders()
        checkAndRequestAuthorization()
    }

    private func setupReaders() {
        ambientReader.delegate = self
    }

    // MARK: - Permission Request
    func requestAuthorization() {
        SRSensorReader.requestAuthorization(sensors: [.ambientLightSensor]) { [weak self] error in
            DispatchQueue.main.async {
                guard let self = self else {
                    print("Permission request aborted")
                    return
                }
                if let error = error {
                    print("Permission request failed: \(error.localizedDescription)")
                } else {
                    print("Permission request succeeded")
                    self.startRecordingAmbientLightData()
                }
            }
        }
    }

    func checkAndRequestAuthorization() {
        let status = ambientReader.authorizationStatus
        switch status {
        case .authorized:
            print("Ambient light sensor access granted")
            startRecordingAmbientLightData()
        case .notDetermined:
            print("Ambient light sensor access undetermined, requesting permission")
            requestAuthorization()
        case .denied:
            print("Ambient light sensor access denied or restricted")
        @unknown default:
            print("Unknown authorization status")
        }
    }

    // MARK: - Ambient Light Data Logic
    func startRecordingAmbientLightData() {
        guard !isRecordingAmbientLight else {
            print("Already recording ambient light data.")
            return
        }
        print("Starting ambient light data recording")
        isRecordingAmbientLight = true
        ambientReader.startRecording()
        fetchAmbientLightData()
        fetchAmbientDeviceData()
    }

    func fetchAmbientLightData() {
        print("Fetching ambient light data")
        let request = SRFetchRequest()
        let now = Date()
        let fromTime = now.addingTimeInterval(-72 * 60 * 60)
        let toTime = now.addingTimeInterval(-25 * 60 * 60)
        request.from = SRAbsoluteTime(fromTime.timeIntervalSinceReferenceDate)
        request.to = SRAbsoluteTime(toTime.timeIntervalSinceReferenceDate)
        print("Fetch request: \(fromTime) ~ \(toTime)")
        ambientReader.fetch(request)
    }

    private func displayAmbientLightData(sample: SRAmbientLightSample) {
        print("Ambient light: \(sample.lux.value) lux")
        print("Current ambientLightData content:")
        for data in ambientLightData {
            print("Timestamp: \(data.timestamp), Lux: \(data.lux)")
        }
    }

    // MARK: - Device Data Logic
    private func fetchAmbientDeviceData() {
        print("Fetching device information")
        let request = SRFetchRequest()
        let now = Date()
        let fromDate = now.addingTimeInterval(-72 * 60 * 60)
        let toDate = now.addingTimeInterval(-24 * 60 * 60)
        request.from = SRAbsoluteTime(fromDate.timeIntervalSinceReferenceDate)
        request.to = SRAbsoluteTime(toDate.timeIntervalSinceReferenceDate)
        if availableDevices.isEmpty {
            print("No devices available")
            ambientReader.fetchDevices()
        } else {
            for device in availableDevices {
                print("Starting data fetch (Device: \(device))")
                request.device = device
                ambientReader.fetch(request)
                print("Fetch request sent (Device: \(device))")
            }
        }
    }

    // MARK: - SRSensorReaderDelegate Methods
    func sensorReader(_ reader: SRSensorReader, didFetch devices: [SRDevice]) {
        availableDevices = devices
        for device in devices {
            print("Fetched device: \(device)")
        }
        if !devices.isEmpty {
            fetchAmbientDeviceData()
        }
    }

    func sensorReader(_ reader: SRSensorReader, fetching fetchRequest: SRFetchRequest, didFetchResult result: SRFetchResult<AnyObject>) -> Bool {
        print("sensorReader(_:fetching:didFetchResult:) method called")
        if let ambientSample = result.sample as? SRAmbientLightSample {
            let luxValue = ambientSample.lux.value
            let timestamp = Date(timeIntervalSinceReferenceDate: result.timestamp.rawValue)
            // Check for duplicate data and add it
            if !ambientLightData.contains(where: { $0.timestamp == timestamp }) {
                let dataPoint = AmbientLightDataPoint(timestamp: timestamp, lux: Float(luxValue))
                ambientLightData.append(dataPoint)
                print("Added ambient light data: \(luxValue) lux, Timestamp: \(timestamp)")
            } else {
                print("Duplicate data, not adding: Timestamp: \(timestamp)")
            }
            // Output data
            self.displayAmbientLightData(sample: ambientSample)
        }
        return true
    }

    func sensorReader(_ reader: SRSensorReader, didCompleteFetch fetchRequest: SRFetchRequest) {
        print("Data fetch complete")
        if ambientLightData.isEmpty {
            print("No ambient light data within 24 hours.")
        } else {
            print("ambientLightData updated")
            for dataPoint in ambientLightData {
                print("Added ambient light data: \(dataPoint.lux) lux, Timestamp: \(dataPoint.timestamp)")
            }
        }
    }
}
Replies: 0 · Boosts: 0 · Views: 212 · Sep ’24
FileProvider Extension's Version is not matching on Mac
Hello, I would like to know why my FileProvider .appex cannot run on a macOS 12.0 machine after I compiled it on macOS 14.5. I have changed the macOS Deployment Target and Minimum Deployment settings in the Xcode configuration to 11.0, but it still cannot run. The error is as follows:

2024-09-23 10:10:24.264067+0800 0xab312 Error 0x0 83438 0 O+Connect: (libFileProvidersManager.dylib) [com.oplus.DeviceSpace:FileProvidersManager] Unable to add domain, error: Error Domain=NSFileProviderErrorDomain Code=-2001 "The application is currently unavailable." UserInfo={NSLocalizedDescription=The application is currently unavailable.}

I want to know what this error means and how to solve it.
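For reference, a minimal sketch of the domain registration that the log above reports as failing; the identifier and display name are placeholders.

import FileProvider

// Register a File Provider domain with the system.
let domain = NSFileProviderDomain(
    identifier: NSFileProviderDomainIdentifier(rawValue: "com.example.domain"),
    displayName: "Example Drive")

NSFileProviderManager.add(domain) { error in
    if let error = error {
        print("Unable to add domain: \(error)")
    } else {
        print("Domain added")
    }
}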
Replies: 2 · Boosts: 0 · Views: 238 · Sep ’24
How to allow my app to receive BLE data from sensor while running in background?
I have a sensor that communicates via BLE periodically. The intention is for it to be a set it and forget it kind of thing. The user will check the app periodically but won't be doing so frequently. I need the app to be able to still receive the BLE data from the sensor even if it's running in the background so that when the user does check the app, they can see everything that's been happening. I've read a lot from 2020 - 2021 where it seems many developers are struggling with the same issue. Is there a supported way to do this now? How to have my app run in the background and still be able to receive BLE data?
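For reference, a minimal sketch of a central configured for background operation, assuming the "bluetooth-central" UIBackgroundModes entry is present and the sensor advertises a known service UUID; the UUID and restore identifier below are placeholders.

import CoreBluetooth

final class SensorCentral: NSObject, CBCentralManagerDelegate {
    static let sensorService = CBUUID(string: "180D") // placeholder service UUID

    private var central: CBCentralManager!
    private var sensor: CBPeripheral?

    func start() {
        central = CBCentralManager(
            delegate: self,
            queue: nil,
            options: [CBCentralManagerOptionRestoreIdentifierKey: "com.example.sensor-central"])
    }

    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        guard central.state == .poweredOn else { return }
        // Background scanning requires explicit service UUIDs; wildcard scans run only in the foreground.
        central.scanForPeripherals(withServices: [Self.sensorService])
    }

    func centralManager(_ central: CBCentralManager,
                        didDiscover peripheral: CBPeripheral,
                        advertisementData: [String: Any],
                        rssi RSSI: NSNumber) {
        sensor = peripheral          // keep a strong reference
        central.connect(peripheral)  // pending connects persist while the app is backgrounded
    }

    func centralManager(_ central: CBCentralManager, willRestoreState dict: [String: Any]) {
        // If the system relaunches the app, reclaim the peripherals it was tracking.
        if let restored = dict[CBCentralManagerRestoredStatePeripheralsKey] as? [CBPeripheral] {
            sensor = restored.first
        }
    }
}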
Replies: 1 · Boosts: 0 · Views: 220 · Sep ’24
Detecting a backgrounded iOS-Peripheral
We are developing an app where we want iPhone-to-iPhone communication via Bluetooth, using the Core Bluetooth API. One iPhone acts as the peripheral and the other as the central. When both apps are on screen they successfully find each other and can connect. However, when the peripheral is in the background the central can't find it (the central is still in the foreground). I have consulted this page: https://developer.apple.com/library/archive/documentation/NetworkingInternetWeb/Conceptual/CoreBluetooth_concepts/CoreBluetoothBackgroundProcessingForIOSApps/PerformingTasksWhileYourAppIsInTheBackground.html and other resources. From my understanding, the service UUID should be moved to the manufacturer data. This makes it difficult for non-iOS devices to detect the service, but the documentation says that my central should still be able to find the peripheral. I have added the required keys to the Info.plist and am explicitly scanning for the mentioned service UUID. When using a third-party BLE tool I'm able to find and communicate with the peripheral even when it is in background mode. I know that some option keys will be ignored, but from my understanding the two devices should still find each other at least once. This was tested with two iOS 18 iPhones. Here is the code I'm using:

import UIKit
import CoreBluetooth

class ViewController: UIViewController {
    private var centralManager: CBCentralManager!
    private var peripheralManager: CBPeripheralManager!

    override func viewDidLoad() {
        super.viewDidLoad()
    }

    @IBAction func btnPeripheral(_ sender: Any) {
        peripheralManager = CBPeripheralManager(delegate: self, queue: nil, options: nil)
    }

    @IBAction func btnCentral(_ sender: Any) {
        print("central go")
        centralManager = CBCentralManager(delegate: self, queue: nil)
    }
}

extension ViewController: CBCentralManagerDelegate {
    func centralManagerDidUpdateState(_ central: CBCentralManager) {
        print("central state: \(central.state)")
        switch central.state {
        case .poweredOn:
            print("central powered on")
            centralManager.scanForPeripherals(withServices: [CBUUID(string: "db9acb1e-1ac4-4f70-b58c-3b3dcea84703")],
                                              options: [CBCentralManagerScanOptionAllowDuplicatesKey: true])
        default:
            print("central not powered on")
        }
    }

    func centralManager(_ central: CBCentralManager, didDiscover peripheral: CBPeripheral, advertisementData: [String: Any], rssi RSSI: NSNumber) {
        print("Discovered \(peripheral)")
    }
}

extension ViewController: CBPeripheralManagerDelegate {
    func peripheralManagerDidUpdateState(_ peripheral: CBPeripheralManager) {
        print("peripheral state: \(peripheral.state)")
        if peripheral.state == .poweredOn {
            print("powered on")
            let uuid = CBUUID(string: "db9acb1e-1ac4-4f70-b58c-3b3dcea84703")
            let service = CBMutableService(type: uuid, primary: true)
            peripheralManager.add(service)
            peripheralManager.startAdvertising([CBAdvertisementDataServiceUUIDsKey: [uuid],
                                                CBAdvertisementDataLocalNameKey: "MyService"])
        }
    }
}
Replies: 2 · Boosts: 2 · Views: 386 · Sep ’24
Opening external storage in Finder is not working
I am trying to open external/mounted storage in Finder; I simply want to show its root directory. I have tried multiple things, and none worked. NSWorkspace.shared.open() does work when opening the user's disk (e.g. Macintosh HD), but for any of the external storage devices a warning appears: "The application "ABC" does not have permission to open "Folder Name"." The app has been granted all permissions in System Settings (Full Disk Access + Files and Folders). Furthermore, I don't even need or care about accessing this directory from within my app; I want Finder to open it. I am using com.apple.security.files.user-selected.read-only and testing on macOS 14.6.1.
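A sketch of NSWorkspace's file-viewer calls as an alternative to open(_:); the volume path is a placeholder, and whether these avoid the consent prompt for external volumes is an assumption to verify.

import AppKit

let volumeURL = URL(fileURLWithPath: "/Volumes/ExternalDrive") // placeholder path

// Reveal the volume itself, selected in its parent Finder window...
NSWorkspace.shared.activateFileViewerSelecting([volumeURL])

// ...or open a Finder window rooted at the volume's contents.
NSWorkspace.shared.selectFile(nil, inFileViewerRootedAtPath: volumeURL.path)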
Replies: 5 · Boosts: 0 · Views: 441 · Sep ’24
Bluetooth State Restoration: Relaunch app after update?
Hi. Background: we have an app acting as a central, keeping a persistent connection to a peripheral body-worn sensor in order to ensure that the sensor behaves as expected. We've managed to build a robust system that stays connected while the peripheral is in range. When the peripheral goes out of range we're fine with the app being suspended, as long as it's revived when the sensor comes back in range. This works fine using the restoration identifier, and the revival also works after a phone restart. Now for the question: it doesn't work when upgrading the app (tested in TestFlight). As the app-upgrade case isn't covered in https://developer.apple.com/library/archive/qa/qa1962/_index.html we'd like to know whether this case is supported. If it is supported, we'd be happy if you could provide some additional insights. Best regards, Rasmus Tønnesen, UNEEG medical
Replies: 2 · Boosts: 0 · Views: 191 · Sep ’24
App using audio background mode gets suspended
Hi. I am developing an alarm app. My app plays white noise to help users fall asleep. To keep the white noise playing in the background, my app uses the audio background mode. The problem is that my app sometimes gets suspended a few hours after going into the background, and I cannot find the exact reason for it. Doesn't the audio background mode guarantee that the app is not suspended by the system? If not, how can I handle this issue so that my app can keep playing white noise?
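For reference, a minimal sketch of the playback-session setup that background audio depends on, assuming the audio background mode is enabled in the target's capabilities; note that this keeps the app running only while audio is actually playing.

import AVFoundation

// Configure and activate a playback session before starting the white-noise player.
func configurePlaybackSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .default)
    try session.setActive(true)
}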
Replies: 3 · Boosts: 0 · Views: 219 · Sep ’24
Help with eSIM Entitlement Request for Core Telephony API
Hi everyone, I'm currently developing an app that installs eSIM profiles directly within the app and checks if the device supports eSIM. For this functionality, I understand that I need the eSIM entitlement for the Core Telephony API. I submitted a request for the eSIM entitlement to Apple about three weeks ago, but I haven't received a response yet. Has anyone experienced a similar situation? What would be the best course of action to follow up on this? Any guidance or suggestions would be greatly appreciated. Thank you!
Replies: 3 · Boosts: 0 · Views: 313 · Sep ’24
BLE Scanning stops when phone is in locked mode
I have a requirement where my application should scan for Bluetooth LE beacon devices in the foreground, in the background, and while the iPhone is locked. We have built the project and it works fine in the foreground. We have explored multiple options to enable the feature in the background and in locked mode. For now, we have come up with a solution that puts the application into Picture in Picture mode once it is minimized; the PiP window keeps it running, which solves the background case. But the problem remains in locked mode: scanning does not work while the phone is locked. Can you explain how Bluetooth scanning is supposed to work in the background and in locked mode? Please also mention alternative solutions if background and locked-mode scanning is not possible. I have attached the project source code for reference. This project is being built for Google, so it is a bit urgent. Can I expect a quick response to this query?
Replies: 4 · Boosts: 0 · Views: 423 · Sep ’24