
Looking for USBSerialDriver sample code
I would like to write a driver that supports our custom USB-C connected device, which provides a serial port interface. USBSerialDriverKit looks like the solution I need. Unfortunately, without a decent sample, I'm not sure how to accomplish this. The DriverKit documentation does a good job of telling me what APIs exist, but it is very light on semantics and on details about how to use all of these API elements. A function call with five unexplained parameters just isn't that useful to me. Does anyone have, or know of, a resource that can help me figure out how to get started?
Replies: 0 · Boosts: 0 · Views: 166 · Activity: 3w
AVAudioPlayer delegate causing retain cycle.
It appears that AVAudioPlayer is maintaining a strong reference to my containing class. Here is the essential code; pay attention to the comments.

```swift
class StethRecording: NSObject, ObservableObject, Identifiable {
    let player: AVAudioPlayer?
    let id = UUID()

    @Published var isPlaying = false
    @Published var progress = 0.0

    init(file: AVAudioFile) throws {
        player = try AVAudioPlayer(contentsOf: file.url)
        super.init()
        // I used to assign the player delegate here.
        // If I do that, when I delete this object, it
        // doesn't go away.
        player!.prepareToPlay()
    }

    deinit {
        // If this object doesn't go away, I leave data
        // behind, something I don't want to do.
        try? deleteAssociatedAudioFile()  // Helper defined elsewhere.
    }

    func play() {
        guard let player else { return }
        // So now I have to assign the delegate whenever
        // I start playing.
        player.delegate = self
        isPlaying = true
        player.play()
        startUpdateTimer()
    }

    func stop() {
        guard let player else { return }
        player.stop()
        playbackConcluded()
    }

    // MARK: - Private Methods

    private func playbackConcluded() {
        isPlaying = false
        stopUpdateTimer()
        updateProgress()
        player!.currentTime = 0  // Rewind to the beginning.
        // I also have to remove the delegate when I
        // stop, for any reason.
        player!.delegate = nil
        player!.prepareToPlay()
    }
}

extension StethRecording: AVAudioPlayerDelegate {
    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        playbackConcluded()
    }
}
```

This works, but is this approach really necessary? I would expect AVAudioPlayer to use a weak reference for its delegate. Or am I doing something else wrong here?
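One workaround I'm considering is a weak proxy between the player and my object, so the player's strong delegate reference never points back at self. This is just a sketch; WeakPlayerDelegate is my own hypothetical type, not an AVFoundation API:

```swift
import AVFoundation

// Hypothetical helper, not part of AVFoundation: forwards the delegate
// callback while holding the real target weakly, so the player's strong
// reference to its delegate no longer keeps StethRecording alive.
final class WeakPlayerDelegate: NSObject, AVAudioPlayerDelegate {
    private weak var target: AVAudioPlayerDelegate?

    init(target: AVAudioPlayerDelegate) {
        self.target = target
    }

    func audioPlayerDidFinishPlaying(_ player: AVAudioPlayer, successfully flag: Bool) {
        target?.audioPlayerDidFinishPlaying?(player, successfully: flag)
    }
}
```

With this in place, init could do `player!.delegate = WeakPlayerDelegate(target: self)` once, and the assign/remove dance in play() and playbackConcluded() would go away, assuming the player retains the proxy rather than me.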
Replies: 1 · Boosts: 0 · Views: 158 · Activity: 4w
AVAudioFile.processingFormat: is only Float32 allowed?
Here is some code I have to create an AVAudioFile instance based on Int16 samples.

```swift
let format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                           sampleRate: 44100.0,
                           channels: 2,
                           interleaved: false)!
let audioFile = try AVAudioFile(forWriting: outputURL, settings: format.settings)
```

When writing to the file, I get the following runtime error, presumably from Core Audio:

```
CABufferList.h:184 ASSERTION FAILURE [(nBytes <= buf->mDataByteSize) != 0 is false]
```

I read this as a size mismatch between the format used to create the file and the file's own internal processingFormat property, which is read-only. Here is my debugger console output showing the input format I created, along with the resulting AVAudioFile fileFormat and processingFormat properties:

```
(lldb) po format
<AVAudioFormat 0x300e553b0: 2 ch, 44100 Hz, Int16, deinterleaved>

(lldb) po format.settings
▿ 7 elements
  ▿ 0 : 2 elements
    - key : "AVNumberOfChannelsKey"
    - value : 2
  ▿ 1 : 2 elements
    - key : "AVLinearPCMBitDepthKey"
    - value : 16
  ▿ 2 : 2 elements
    - key : "AVFormatIDKey"
    - value : 1819304813
  ▿ 3 : 2 elements
    - key : "AVLinearPCMIsNonInterleaved"
    - value : 1
  ▿ 4 : 2 elements
    - key : "AVLinearPCMIsBigEndianKey"
    - value : 0
  ▿ 5 : 2 elements
    - key : "AVLinearPCMIsFloatKey"
    - value : 0
  ▿ 6 : 2 elements
    - key : "AVSampleRateKey"
    - value : 44100

(lldb) po audioFile.fileFormat
<AVAudioFormat 0x300ea5400: 2 ch, 44100 Hz, Int16, interleaved>

(lldb) po audioFile.processingFormat
<AVAudioFormat 0x300ea5450: 2 ch, 44100 Hz, Float32, deinterleaved>
```

Please note that the input format I'm using matches neither the file's fileFormat nor its processingFormat. The fileFormat is interleaved even though I specified de-interleaved; this makes sense to me, since working with growing audio files is easier and more efficient with interleaved data. The head-scratcher is the processingFormat: I specified Int16 samples, yet it expects Float32? According to the format settings dictionary, I am specifying the correct key/value pairs. Is this expected behavior? Does Apple always insist on Float32 internally, or is this a bug?
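In case it helps anyone hitting the same assertion, my current workaround sketch converts each Int16 buffer to the file's processingFormat before writing, on the assumption that write(from:) only accepts buffers in that format. Here, int16Buffer stands in for whatever AVAudioPCMBuffer (in the Int16 format above) my capture path produces:

```swift
import AVFoundation

// Sketch: convert an Int16 buffer to the file's Float32 processingFormat
// before writing. Assumes `format`, `audioFile`, and `int16Buffer` from above.
let converter = AVAudioConverter(from: format, to: audioFile.processingFormat)!
let outBuffer = AVAudioPCMBuffer(pcmFormat: audioFile.processingFormat,
                                 frameCapacity: int16Buffer.frameLength)!

var delivered = false
let status = converter.convert(to: outBuffer, error: nil) { _, outStatus in
    // Hand the converter our single input buffer exactly once.
    if delivered {
        outStatus.pointee = .noDataNow
        return nil
    }
    delivered = true
    outStatus.pointee = .haveData
    return int16Buffer
}

if status == .haveData {
    try audioFile.write(from: outBuffer)
}
```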
Replies: 0 · Boosts: 0 · Views: 163 · Activity: 4w
USBDriverKit driver not showing up in settings on iPadOS 18
In iPadOS 17.7, my driver shows up in Settings just fine. After recompiling with Xcode 16 and installing my app (which contains the driver) on iPadOS 18, the app appears in Settings, but the switch to enable the driver is missing. When I plug in my custom USB device, the app cannot detect it, and I am left with no way to manually enable the driver, as I could in the previous version of iPadOS.
Replies: 1 · Boosts: 0 · Views: 281 · Activity: Sep ’24
AVFoundation: Strange error when switching camera formats with a single button tap
I'm getting the following output from my iOS app's debug console; note the error on the last line:

```
Capture format keys: ["600x600@25", "1200x1200@5", "1200x1200@30", "1600x1200@2", "1600x1200@30", "3200x2400@15", "3200x2400@2", "600x600@30"]
Start capture session for 1600x1200@30: <AVCaptureSession: 0x303c70190 [AVCaptureSessionPresetPhoto]>
Stop capture session: <AVCaptureSession: 0x303c70190 [AVCaptureSessionPresetInputPriority]>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCaptureVideoDataOutput: 0x303edf1e0>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCapturePhotoOutput: 0x303ee3e20>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCaptureVideoPreviewLayer: 0x3030b33c0>
Start capture session for 600x600@30: <AVCaptureSession: 0x303c70190 [AVCaptureSessionPresetInputPriority]>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCaptureVideoDataOutput: 0x303edf1e0>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCapturePhotoOutput: 0x303ee3e20>
    <AVCaptureDeviceInput: 0x303ebb720 [Medwand S3 Camera]>[vide] -> <AVCaptureVideoPreviewLayer: 0x3030b33c0>
<<<< FigSharedMemPool >>>> Fig assert: "blkHdr->useCount > 0" at (FigSharedMemPool.c:591) - (err=0)
```

This happens when I try to switch capture formats between the two modes my application must regularly use. Below are the functions I use to start and stop capturing frames to my preview layer. My UI has three buttons: Off, Mode 1, and Mode 2. If I tap Off between taps of Mode 1 or Mode 2, all is well; I can do this all day. However, if I jump directly between Mode 1 and Mode 2, I run into issues. I added a layer of software between the UI and the underlying functions to make sure the camera is turned off before being turned back on in the other mode, and was surprised to get this output. Can someone at Apple please tell me what is going on here? For the rest of you: if anyone knows the magic incantation for safely switching camera formats, please paste that code here. Thanks. I've included my code below.

```swift
func start(for deviceFormat: String) {
    sessionQueue.async { [unowned self] in
        logger.debug("Start capture session for \(deviceFormat): \(self.captureSession)")
        do {
            guard let format = formatDict[deviceFormat] else {
                throw Error.captureFormatNotFound
            }
            captureSession.stopRunning()
            captureSession.beginConfiguration()       // May not be necessary.
            try captureDevice.lockForConfiguration()  // Without this we get an error.
            captureDevice.activeFormat = format
            captureDevice.unlockForConfiguration()    // Matching call: necessary.
            captureSession.commitConfiguration()      // Matching call: may not be necessary.
            captureSession.startRunning()
        } catch {
            logger.fault("Failed to start camera: \(error.localizedDescription)")
            errorPublisher.send(error)
        }
    }
}

func stop() {
    sessionQueue.async { [unowned self] in
        logger.debug("Stop capture session: \(self.captureSession)")
        captureSession.stopRunning()
    }
}
```
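For completeness, here is the alternative I'm experimenting with: changing the active format inside a single beginConfiguration/commitConfiguration pair without stopping the session at all, so the Mode 1 to Mode 2 switch never passes through a torn-down state. This is a sketch built on the same properties as above, not something I can confirm is the blessed approach:

```swift
func switchFormat(to deviceFormat: String) {
    sessionQueue.async { [unowned self] in
        guard let format = formatDict[deviceFormat] else { return }
        captureSession.beginConfiguration()
        do {
            // Change the format in place; the session keeps running.
            try captureDevice.lockForConfiguration()
            captureDevice.activeFormat = format
            captureDevice.unlockForConfiguration()
        } catch {
            logger.fault("lockForConfiguration failed: \(error.localizedDescription)")
        }
        captureSession.commitConfiguration()
        if !captureSession.isRunning {
            captureSession.startRunning()
        }
    }
}
```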
Replies: 0 · Boosts: 0 · Views: 464 · Activity: May ’24
Internal Apple Developer team member does not appear in TestFlight. (User is in limbo)
I have a team member who was deleted. I tried to re-add that member immediately after deletion, but the attempt failed; the portal still recognizes the user as a member of the team. This results in the following behavior: when I try to re-add the user, I get a message saying they are already on the team, so no new invite can be sent because I can't complete the add-user operation. Meanwhile, in TestFlight, the user does not appear on the roster of possible internal testers. Has anyone seen this behavior before, and if so, how did you fix it?
Replies: 1 · Boosts: 0 · Views: 547 · Activity: May ’24
How does one create a provisioning profile for embedded DEXT for iPhoneOS that is signed with a distribution cert?
I've been developing a solution with an embedded USB driver. I can build and run the solution just fine, but I cannot pass verification when uploading to App Store Connect and TestFlight. The problem is that the provisioning profile I am using (for development) does not have an explicit Vendor ID (idVendor); it uses the development wildcard value, an asterisk ("*"). I've created a release version of my entitlements file with the proper Vendor ID, and I have a distribution certificate for iOS. Further, I've created a provisioning profile for App Store distribution (not development) and imported it via Xcode. When I select this provisioning profile, I get the following errors from Xcode:

Xcode 14 and later requires a DriverKit development profile enabled for iOS and macOS. Visit the developer website to create or download a DriverKit profile.

Provisioning profile "MyProvisioningProfile - App Store" doesn't match the entitlements file's value for the com.apple.developer.driverkit.transport.usb entitlement.

If I create and use a DriverKit profile, the Xcode errors on the Signing & Capabilities page go away. However, these profiles seem to be for development only, and I then get an error during compilation telling me that the app and the extension have two different signers: one for development (the DEXT) and one for distribution (the app). To sum up: using a DriverKit profile fails during the build process, and using a distribution profile is a non-starter; I can't even build. What do I need to do to get this to work?
Replies: 2 · Boosts: 0 · Views: 707 · Activity: May ’24
DriverKit: embedded.mobileprovision has the wildcard USB Vendor ID instead of my assigned Vendor ID
I've added my Vendor ID to the appropriate entitlements files, but my binary fails validation when I try to upload it to the store for distribution. The embedded.mobileprovision file in the generated archive shows an asterisk instead of my approved Vendor ID. How can I make sure the embedded provisioning profile has my Vendor ID?
Replies: 4 · Boosts: 0 · Views: 907 · Activity: May ’24
AVCaptureVideoPreviewLayer transforms are not working. Need pan/zoom solution.
I am implementing pan and zoom features for an iPadOS app that uses a custom USB camera device. I use the update function shown below to apply scale and translation transforms, but they are not working. By re-enabling the default animation, I can see that the scale transform initially takes effect, but then the image animates back to its original scale; this all happens in a fraction of a second, but I can see it. The translation transform seems to have no effect at all. Printing the value of AVCaptureVideoPreviewLayer.transform before and after does show that my values have been applied.

```swift
private func updateTransform() {
#if false
    // Disable the default animation.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }
#endif

    // Apply the transform.
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
    let transform = CATransform3DIdentity
    let translate = CATransform3DTranslate(transform, translationX, translationY, 0)
    let scale = CATransform3DScale(transform, scale, scale, 1)
    videoPreviewLayer.transform = CATransform3DConcat(translate, scale)
    logger.debug("\(String(describing: self.videoPreviewLayer.transform))")
}
```

My question is this: how can I properly implement pan and zoom for an AVCaptureVideoPreviewLayer? Better yet, if you see a problem with my current approach, or understand why the transforms I am applying don't stick, please share that information.
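For reference, here is the variant I'm testing now: a single 2D affine transform applied inside a transaction with implicit actions disabled, on the theory that the snap-back I'm seeing is an implicit animation restoring the old value. A sketch, using the same properties as above:

```swift
import QuartzCore

private func updateTransform() {
    // Suppress the implicit animation that may be restoring the old value.
    CATransaction.begin()
    CATransaction.setDisableActions(true)
    defer { CATransaction.commit() }

    // Combine pan and zoom into one 2D transform and apply it.
    let transform = CGAffineTransform(translationX: translationX, y: translationY)
        .scaledBy(x: scale, y: scale)
    videoPreviewLayer.setAffineTransform(transform)
}
```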
Replies: 0 · Boosts: 0 · Views: 506 · Activity: May ’24
Core Animation Layer in SwiftUI app does not update unless I rotate the device.
I am trying to create a near-real-time drawing of waveform data within a SwiftUI app. The data streams in from the hardware, and I've verified that the draw(in ctx: CGContext) override in my custom CALayer is being called. I have added this custom CALayer as a sublayer of a UIView instance that I expose via the UIViewRepresentable protocol. The only time I see updated output from the CALayer is when I rotate the device and layout happens (I assume). How can I force SwiftUI to update every time I render new data in my CALayer?

More info: I'm porting an app from the Windows desktop. Previously, I tried to make this work by generating a new UIImage from a CGContext every time I wanted to update the display. I quickly exhausted memory with that technique, because a new context is created on every call to UIGraphicsImageRenderer(size:).image { context in }. What I really wanted was something equivalent to a GDI WritableBitmap, which lets a programmer continuously update and reuse the bitmap's contents. I could not figure out how to do that in Swift without dropping down to the old C-based CGBitmapContext APIs, and even then I wasn't sure that would give me a reusable context I could present in SwiftUI each time I refreshed it. CALayer seemed like the answer. I welcome any feedback on a better way to accomplish what I'm trying to do.
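For anyone with the same symptom: the piece I was missing appears to be invalidation. A CALayer only calls draw(in:) after it has been marked dirty, which would also explain why rotation "fixes" it (layout marks the layer dirty as a side effect). A sketch of the hosting UIView, where WaveformLayer and its samples property stand in for my custom CALayer subclass:

```swift
import UIKit

// Sketch: host view for the custom layer. WaveformLayer is a stand-in
// name for the CALayer subclass whose draw(in:) renders the samples.
final class WaveformHostView: UIView {
    let waveformLayer = WaveformLayer()

    override init(frame: CGRect) {
        super.init(frame: frame)
        layer.addSublayer(waveformLayer)
    }

    required init?(coder: NSCoder) { fatalError("init(coder:) is not supported") }

    override func layoutSubviews() {
        super.layoutSubviews()
        waveformLayer.frame = bounds
    }

    // Call this (on the main thread) each time new samples arrive.
    func render(_ samples: [Float]) {
        waveformLayer.samples = samples
        waveformLayer.setNeedsDisplay()  // Schedules draw(in:) for the next cycle.
    }
}
```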
Replies: 2 · Boosts: 0 · Views: 786 · Activity: Apr ’24
Settings bundle for DEXT-loading app does not display content.
I have an app that loads a DEXT (driver). The app includes a settings bundle that lets me activate and deactivate the driver. When I issue the API call to activate the driver, iOS switches to the Settings app and displays the page for my DEXT-loading application, but the switch to enable the driver, which is part of my settings bundle, does not appear. I'm using this API call:

```swift
OSSystemExtensionRequest.activationRequest(forExtensionWithIdentifier: "driver-id", queue: .main)
```

Here are the contents of my settings bundle Root.plist:

Here are the contents of my Root.strings file:

```
"Group" = "Group";
"Name" = "Name";
"none given" = "none given";
"Enabled" = "Enabled";
```
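For context, here is a fuller sketch of how I submit the request, with the delegate callbacks wired up the way the SystemExtensions framework defines them on macOS; I'm assuming the same flow applies on iPadOS, and "com.example.driver-id" is a placeholder for my real extension identifier:

```swift
import SystemExtensions

final class DriverActivator: NSObject, OSSystemExtensionRequestDelegate {
    func activate() {
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: "com.example.driver-id",  // Placeholder ID.
            queue: .main)
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
    }

    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties)
        -> OSSystemExtensionRequest.ReplacementAction {
        .replace
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {
        // This is the point where the hand-off to Settings happens.
    }

    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) { }

    func request(_ request: OSSystemExtensionRequest,
                 didFailWithError error: Error) { }
}
```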
Replies: 2 · Boosts: 0 · Views: 652 · Activity: Mar ’24
How to use Swift and AVFoundation to stream/record USB microphone input?
I have a custom USB device that includes a microphone. I can see the microphone on macOS when I plug the device in, so I know it is working with the kernel and AV subsystems. I can enumerate and reference the microphone using AVCaptureDevice, but I have not been able to figure out how to use that device reference with AVAudioEngine. I'm trying to accomplish two things with this microphone:

1. Stream audio from the microphone and render it to the speakers on my MacBook Pro.
2. Capture sound data from the microphone and forward it to a live-streaming API.

From what I've read, I need AVAudioEngine to do this, but I'm having trouble determining from the documentation how to go about it on macOS. There seems to be a lot more information for iOS and iPadOS, but since USB-C support is sparsely documented on those operating systems, I'm focusing on the desktop (macOS) for now. Can I convert an AVCaptureDevice into an audio input for AVAudioEngine? If not, how can I accomplish what I'm trying to do with whatever is available in AVFoundation? A sketch of the closest idea I've found so far follows.
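The closest thing I've found drops below AVFoundation: AVCaptureDevice.uniqueID appears to match the Core Audio device UID for audio devices, which can be translated to an AudioDeviceID and set on the engine's input unit. A sketch of that idea, with no guarantee it's the intended path (captureDevice is the AVCaptureDevice reference I already have):

```swift
import AVFoundation
import AudioToolbox
import CoreAudio

// Translate a Core Audio device UID (which appears to match
// AVCaptureDevice.uniqueID for audio devices) into an AudioDeviceID.
func audioDeviceID(forUID uid: String) -> AudioDeviceID {
    var deviceID = AudioDeviceID(kAudioObjectUnknown)
    var cfUID = uid as CFString
    var address = AudioObjectPropertyAddress(
        mSelector: kAudioHardwarePropertyTranslateUIDToDevice,
        mScope: kAudioObjectPropertyScopeGlobal,
        mElement: kAudioObjectPropertyElementMain)
    var size = UInt32(MemoryLayout<AudioDeviceID>.size)
    withUnsafeMutablePointer(to: &cfUID) { qualifier in
        _ = AudioObjectGetPropertyData(
            AudioObjectID(kAudioObjectSystemObject), &address,
            UInt32(MemoryLayout<CFString>.size), qualifier,
            &size, &deviceID)
    }
    return deviceID
}

// Point the engine's input node at the USB microphone, then monitor it
// through the output by connecting input to the main mixer.
let engine = AVAudioEngine()
if let unit = engine.inputNode.audioUnit {
    var deviceID = audioDeviceID(forUID: captureDevice.uniqueID)
    AudioUnitSetProperty(unit, kAudioOutputUnitProperty_CurrentDevice,
                         kAudioUnitScope_Global, 0,
                         &deviceID, UInt32(MemoryLayout<AudioDeviceID>.size))
}
engine.connect(engine.inputNode, to: engine.mainMixerNode,
               format: engine.inputNode.inputFormat(forBus: 0))
try engine.start()
```

For the second goal, a tap installed on the input node (installTap(onBus:bufferSize:format:block:)) should hand me the buffers to forward to the streaming API.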
Replies: 1 · Boosts: 0 · Views: 1.2k · Activity: Jan ’24
Unable to link to or even find USBDriverKit for iPadOS
According to Apple's documentation, USBDriverKit for iPadOS has been available since DriverKit 19 and the release of the M1 iPads. However, using Xcode 15.0.1, I am unable to link to, or even find, USBDriverKit from within Xcode. Is the documentation incorrect? If not, what do I need to do to access and leverage USBDriverKit? From what I've read, USBSerialDriverKit is not available, and since I have a USB-based serial device that I want to use from the iPad, the next obvious step is to write my own USB device extension using USBDriverKit as a provider. If you have experience with this and can definitively say whether what I'm attempting is even within the realm of possibility, please chime in. Thanks. -Michael
Replies: 2 · Boosts: 0 · Views: 989 · Activity: Dec ’23