Posts

5 Replies
4.7k Views
That is, will my render callback ever be called after AudioOutputUnitStop() returns? In other words, is it safe to free resources used by the render callback, or do I need to add realtime-safe communication between the stopping thread and the callback thread? This question is intended for both macOS HAL Output and iOS RemoteIO output units.
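For concreteness, the teardown in question looks roughly like the sketch below. This is a minimal illustration rather than an answer; the conservative ordering (uninitialize and dispose before freeing) is an assumption, and `stopAndTearDown` / `renderBuffers` are hypothetical names.

```swift
import AudioToolbox

// Minimal teardown sketch for a HAL Output / RemoteIO unit.
// `renderBuffers` stands in for whatever state the render callback reads.
func stopAndTearDown(_ outputUnit: AudioUnit, renderBuffers: inout [Float]) {
    _ = AudioOutputUnitStop(outputUnit)           // is a render callback still possible after this returns?
    _ = AudioUnitUninitialize(outputUnit)         // conservative: tear the unit down fully first
    _ = AudioComponentInstanceDispose(outputUnit)
    renderBuffers.removeAll()                     // only now release memory the callback touched
}
```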
1 Replies
1.1k Views
I have an IOUSBHostInterface system extension that works great; however, if I leave the device plugged in during a reboot, the usual driver is loaded instead and an unplug and replug is required to load my driver. How can I fix this? It limits the usefulness of my driver.
12 Replies
3.0k Views
We're invited to write PCI drivers for iPad, but there's no PCI DriverKit capability in Xcode, and a com.apple.developer.driverkit.transport.pci key in your entitlements does not find its way into your embedded.provisionprofile with Automatically Manage Signing, nor can you add it manually in the App ID portal, because it's not there. Huh?
2 Replies
1.4k Views
There is no documentation for running, debugging and testing system extensions on iPadOS 16. The WWDC 2022 session "Bring your driver to iPad with DriverKit" does not count because (as of beta 2) it is completely unreproducible. This document tells us that to test our system extensions we must disable SIP, so it's clearly only for macOS: https://developer.apple.com/documentation/driverkit/debugging_and_testing_system_extensions It would be nice if this document were updated with reproducible instructions for testing system extensions on iPadOS! FB10427776
2 Replies
1.7k Views
You don't seem to be able to provision USB or PCI iPad DriverKit drivers, so I tried the null driver, which requires only the driverkit entitlement and can be automatically managed by the Xcode 14 beta. I can get it to deploy, but when it is enabled or when it matches, I see errors like "job failed to spawn": Error Domain=RBSRequestErrorDomain Code=5 "Launch failed." UserInfo={NSLocalizedFailureReason=Launch failed., NSUnderlyingError=0x105706980 {Error Domain=NSPOSIXErrorDomain Code=1 "Operation not permitted" UserInfo={NSLocalizedDescription=Launchd job spawn failed}}} I wonder why that happens for me but didn't happen in the demo.
5 Replies
1.5k Views
Xcode 14.0 beta (14A5228q) refuses to build a DriverKit extension's installer app unless the dext's bundle ID is the bundling app's bundle ID with a single "."-separated suffix added. The error is: "error build: Embedded binary's bundle identifier is not prefixed with the parent app's bundle identifier." Is this behaviour intentional? Maybe it makes sense for non-DriverKit bundled things, but it breaks existing software, invalidates existing dext bundle IDs and leads to awkward bundle IDs like com.foo.installer.mydriver.
0 Replies
1.1k Views
There is currently no way to "Bring your driver to iPad with DriverKit", because it's impossible to develop a driver: developers can neither deploy to the device nor run there, since there is no way to procure development provisioning profiles with the required entitlements for actual hardware (USB, PCI). I bought an M1 iPad just to test this alleged new capability, and the only thing it's good for so far is theoretically transferring files quickly or attaching some kind of ethernet dongle. My 1-month no-questions-asked returns window is going to close soon! FB10152280 FB10244398 FB10244627 FB10160665 FB10152361 FB10160793 FB10152210 FB10160567 FB10244046 FB10427776
0 Replies
1k Views
I'm using USBDriverKit to write an audio driver for a High Speed USB device. In an attempt to understand the difference between DriverKit extension and kernel extension latencies, I'm dispatching individual isochronous microframes, which for this device each account for a duration of 125µs, or 6 samples at 48kHz. I don't yet know what kind of latency I'm actually getting, but I was surprised to see a high CPU usage of ~11% on a 512GB M1 Mac mini running Big Sur 11.6. That's 8000 IsochIO() calls and 8000 completion callbacks per second. Instruments.app tells me that most of my time (60%) is being spent inside mach_msg, as part of a remote procedure call. Multiple questions occur to me:
• is this normal? should I expect lower CPU usage?
• isn't mach_msg blocking? shouldn't CPU usage be low?
• don't low latency audio folks avoid things like remote procedure calls? is seeking low latency throughput with USBDriverKit futile?
• does Monterey's AudioDriverKit enable lower latency, or is it a convenience API?
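For reference, the rates quoted above follow directly from USB High Speed microframe timing (plain arithmetic, nothing assumed beyond the figures in the post):

```swift
// Sanity check of the numbers above.
let microframeDuration = 125e-6                               // seconds per High Speed microframe
let microframesPerSecond = 1.0 / microframeDuration           // 8000 IsochIO() calls per second
let samplesPerMicroframe = 48_000.0 * microframeDuration      // 6 samples at 48 kHz
print(microframesPerSecond, samplesPerMicroframe)             // 8000.0 6.0
```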
2 Replies
1.1k Views
I know AVAssetDownloadURLSession is for HLS content, but I tried it with remote mp3 assets and it successfully downloaded them, albeit without the "didLoad timeRange" and "didResolve mediaSelection" callbacks, and the resulting .movpkgs did indeed contain the audio data. However, when trying to load the packages for offline playback, the asset cache reports isPlayableOffline as false. I tried masquerading the mp3 as audio-only HLS, but during download received error -16655. Is there a way to make download and offline playback of audio-only HLS or non-HLS audio work?
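For reference, the flow being described is roughly the following — a minimal sketch with placeholder URLs and no delegate handling (a real implementation needs an AVAssetDownloadDelegate to learn where the .movpkg lands):

```swift
import AVFoundation

// Sketch of the download + offline-playability check described above.
// The URLs and identifier are placeholders.
let asset = AVURLAsset(url: URL(string: "https://example.com/audio/master.m3u8")!)

let configuration = URLSessionConfiguration.background(withIdentifier: "audio-download")
let session = AVAssetDownloadURLSession(configuration: configuration,
                                        assetDownloadDelegate: nil,   // real code needs a delegate
                                        delegateQueue: .main)

let task = session.makeAssetDownloadTask(asset: asset,
                                         assetTitle: "Example",
                                         assetArtworkData: nil,
                                         options: nil)
task?.resume()

// Later, with the location reported by urlSession(_:assetDownloadTask:didFinishDownloadingTo:):
let savedLocation = URL(fileURLWithPath: "/path/to/saved.movpkg")     // placeholder
let downloaded = AVURLAsset(url: savedLocation)
print(downloaded.assetCache?.isPlayableOffline ?? false)              // false in the mp3 case above
```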
3 Replies
2.3k Views
Hello - I'm having consistency problems with video CMSampleBuffers captured with iOS's ReplayKit2 on iOS 12-13 when used from a broadcast extension.

Some background information:
• broadcast extensions have an extremely tight memory limit: 50MB, even on iPad where screen capture frames are very large
• video sample buffers arrive at 30fps in 420YpCbCr8BiPlanarFullRange (YUV is an odd choice for a BGRA buffer, but I guess it's for that sweet 4/1.5 memory saving factor)
• I talk about copying/accessing the video sample buffers for simplicity - the original intention was to rotate portrait to landscape using vImage, for reasons outside this question.

I would describe the problem like so: if I try to access the video buffer memory using the CPU, I see inconsistent frames - it looks like tiles of the next frame are appearing in the current frame that I'm trying to copy. Because the data is planar, the effect can be quite pretty, with parts of the next frame's colour appearing on the current frame's hue.

Where it gets confusing is that the following uses all result in consistent output:
1. passing the sample buffers directly to an AVAssetWriter
2. copying/rotating using the GPU with a few hundred lines of Metal
3. copying/rotating using a CoreImage 3-liner

I'm not sure what to make of this. Is there a secret sauce for getting a consistent picture of a CMSampleBuffer's CVPixelBuffer that these three methods know about and I don't? Are these three methods somehow GPU-only and non-CPU? That's eyebrow-raising for AVAssetWriter, but I can also configure the CIContext to use a software renderer and the result is consistent.

So I have two workarounds (or three if I want the app to transcode a rotation of the portrait video file), and I should be ecstatic, right? But I'm not, because the memory requirements of my current Metal and CoreImage solutions can easily spike and take me over the 50MB limit.

I would love to get the CPU/vImage approach working. Here's what I've tried already:
• locking/unlocking the CVPixelBuffer as readOnly or with no flags (INCONSISTENT)
• locking/unlocking the CVPixelBuffer's IOSurface with all combinations of .readOnly and .avoidSync
• using the IOSurface seedIDs to discard frames that have been modified while locked (INCONSISTENT - not all modifications are reported this way)
• combinations of the above
• manually incrementing and decrementing the IOSurface's use count (lock/unlock incref/decref is what I hope CVPixelBuffer lock/unlock does anyway)
• copying via the "outer" base address and rowbytes of the CVPixelBuffer/IOSurface, and the inner planar ones too (INCONSISTENT)
• tip-toeing around padding bytes (neither explicitly reading nor writing them)
• still looking at vImageBuffer_InitForCopyFromCVPixelBuffer, although this seems married to CoreGraphics, where I don't think 420 will work; I would still like to know if this function is capable of getting a consistent snapshot of the buffer
• trying to set IOSurface purgeability (it seems to be already set to non-purgeable)

I would otherwise like to reduce the memory requirements of my Metal & CoreImage attempts. Things I have tried:
• asking the CoreImage context to not cache anything
• using software/non-software CIContext
• disabling CoreImage colour conversion with render(image:to:bounds:colorSpace:nil) (I just want to copy/rotate dumb components)

I haven't profiled the Metal version yet. Is there something smaller than half4?

If anyone has any suggestions, I'm all ears.

p.s. Some of these OOMs happen as the IOSurfaces get mapped into my address space. Having these shared memory buffers billed to my process seems kinda unfair.
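For context, the CPU copy path being described is roughly the following — a minimal sketch of the approach that produces inconsistent frames for the poster (plane copy only, no rotation, padding bytes included):

```swift
import CoreVideo

// Minimal sketch of the CPU copy path described above: lock the CVPixelBuffer,
// then copy each plane of the 420YpCbCr8BiPlanarFullRange buffer.
func copyPlanes(of pixelBuffer: CVPixelBuffer) -> [Data] {
    _ = CVPixelBufferLockBaseAddress(pixelBuffer, .readOnly)
    defer { _ = CVPixelBufferUnlockBaseAddress(pixelBuffer, .readOnly) }

    var planes: [Data] = []
    for plane in 0..<CVPixelBufferGetPlaneCount(pixelBuffer) {
        guard let base = CVPixelBufferGetBaseAddressOfPlane(pixelBuffer, plane) else { continue }
        let bytesPerRow = CVPixelBufferGetBytesPerRowOfPlane(pixelBuffer, plane)
        let height = CVPixelBufferGetHeightOfPlane(pixelBuffer, plane)
        // Copies padding bytes too; the post also describes a variant that tiptoes around them.
        planes.append(Data(bytes: base, count: bytesPerRow * height))
    }
    return planes
}
```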
2 Replies
1.5k Views
UPDATE: I don't want this to necessarily be an HLS question, it's about multitrack video, so I could rephrase it as: how do I tell AVPlayerItemVideoOutput which video track to use from a multitrack local file, e.g. an mp4? Do I need to make multiple single-track AVCompositions and play them in multiple AVPlayers? That might be fine, even though it's not clear that it would work with a non-local version...

PREVIOUSLY: I have an HLS stream with multiple video tracks - the tracks are independent, i.e. not different resolution versions of the same thing. I don't see any way to choose which one AVPlayerItemVideoOutput will use. Is there a way?
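The single-track-composition idea floated above would look roughly like this — a sketch under the assumption of a local multitrack mp4, with a placeholder path and an arbitrary choice of the second video track:

```swift
import AVFoundation

// Sketch: wrap one video track of a multitrack local asset in its own composition,
// so AVPlayerItemVideoOutput has only that track to vend.
let asset = AVURLAsset(url: URL(fileURLWithPath: "/path/to/multitrack.mp4"))   // placeholder path
let videoTracks = asset.tracks(withMediaType: .video)
let chosenTrack = videoTracks[1]   // hypothetical: pick the second video track

let composition = AVMutableComposition()
let compositionTrack = composition.addMutableTrack(withMediaType: .video,
                                                   preferredTrackID: kCMPersistentTrackID_Invalid)
try? compositionTrack?.insertTimeRange(CMTimeRange(start: .zero, duration: asset.duration),
                                       of: chosenTrack,
                                       at: .zero)

let item = AVPlayerItem(asset: composition)
let output = AVPlayerItemVideoOutput(pixelBufferAttributes: nil)
item.add(output)
let player = AVPlayer(playerItem: item)
```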
3 Replies
2.1k Views
We build a notarized Developer ID app as part of our CI process and everything works fine. The app is distributed inside a DMG. A customer requested an installer package (.pkg), so we made one from the notarized app. But now I'm assailed by doubts - should we sign the package? Should we notarize it? Why? The installer is very simple: it puts the notarized app in /Applications. Gatekeeper doesn't seem to have anything bad to say about it on Catalina, so case closed? It turns out that the DMG too is unsigned and not notarized, and no one has ever complained. Does this mean we should carry on this way? There seems to be a practice of "notarizing the outer container", but we distribute two separate containers and I don't understand if there's any advantage to notarizing anything beyond the app.
2 Replies
1.2k Views
In a normal macOS .app package, I can get localized text from various frameworks, like AVCaptureDevice.localizedName or NSError's NSLocalizedDescriptionKey, as long as I have added the system's current language as a localization in Xcode. This amounts to an .lproj directory being added inside the app package's Contents/Resources directory. But command line tools don't have any such directory. So how can I get the same behaviour from those frameworks in a command line tool?
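To make the situation concrete, here's a small diagnostic sketch — it only shows what Bundle.main reports for a bare executable and which framework strings are involved; it doesn't answer the question:

```swift
import Foundation
import AVFoundation

// For a command line tool, Bundle.main is the executable's directory,
// with no Contents/Resources/*.lproj to search.
print("bundle path:", Bundle.main.bundlePath)
print("bundle localizations:", Bundle.main.localizations)     // typically empty for a bare tool
print("preferred languages:", Locale.preferredLanguages)

// One of the framework-provided strings in question:
if let device = AVCaptureDevice.default(for: .video) {
    print("device name:", device.localizedName)
}
```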
0 Replies
714 Views
Hello, I'm getting occasional undocumented errors when appending video buffers to an AVAssetWriter. They're -16357 and -16356. Also sometimes -16364, but that was documented by @bford as being related to invalid timestamps. Can anyone shed some light on these? Are they also related to timestamps? Thanks, RF.
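To frame the timestamp angle, a defensive append might look something like this — a sketch built on the assumption that these codes, like -16364, are timestamp-related; `appendIfSane` is a hypothetical helper:

```swift
import AVFoundation
import CoreMedia

// Sketch: sanity-check a sample buffer's presentation timestamp before appending,
// on the assumption that the errors above are timestamp-related.
func appendIfSane(_ sampleBuffer: CMSampleBuffer,
                  to writerInput: AVAssetWriterInput,
                  lastPTS: inout CMTime) -> Bool {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    guard pts.isNumeric, pts > lastPTS else {
        return false   // non-numeric or non-increasing PTS: skip rather than append
    }
    lastPTS = pts
    return writerInput.append(sampleBuffer)
}
```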
0 Replies
900 Views
When I create a capture session on macOS Catalina with video from the FaceTime HD camera and audio from AirPods, the first video CMSampleBuffer presentation timestamp is zero, and the subsequent ones are the more usual host-time-style timestamps, which appear to be system uptime. If you use the first frame's PTS to start an AVAssetWriter session, then the duration of the first frame in the resulting file is your uptime! E.g. right now mine is 22 hours, but it could easily be weeks. Here's a repro: https://github.com/gchilds/airpod-pts-bug-repro A demo: https://www.loom.com/share/c5e6a8a122334bbd92437f87e8491a34 Is this a bug? Or is this somehow my fault? We're probably going to ignore these otherwise legitimate-seeming frames.
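The "ignore these frames" workaround might look roughly like this — a sketch, not a fix, and comparing against the host clock to spot the bogus zero PTS is an assumption:

```swift
import AVFoundation
import CoreMedia

// Sketch: drop capture frames whose PTS is wildly far from "now" on the host clock,
// such as the zero-PTS first frame described above, before starting the writer session.
func shouldDrop(_ sampleBuffer: CMSampleBuffer, tolerance: Double = 10.0) -> Bool {
    let pts = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
    let now = CMClockGetTime(CMClockGetHostTimeClock())
    let delta = abs(CMTimeGetSeconds(CMTimeSubtract(now, pts)))
    return !pts.isNumeric || delta > tolerance
}
```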