Yes, it does (assuming I'm correct that AVCaptureDevice is part of the AVFoundation framework). I have a USB-connected camera that I use to capture video and still images via the AVCaptureDevice APIs.
I am also having an issue with this change in my code. Why doesn't switching from .ascii to .utf8 work? Doesn't this allow for the high bit to be set? Sorry for the ignorance, but I look forward to Quinn's answer.
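For context while waiting on an answer: my understanding is that .utf8 does allow bytes with the high bit set, but only as part of a valid UTF-8 sequence, so arbitrary 8-bit data can still fail to decode. A quick playground illustration (the byte values are just an example):

    import Foundation

    // 0xC3 0xA9 is the UTF-8 encoding of "é". .ascii is strict 7-bit and
    // rejects any byte with the high bit set; .utf8 accepts high-bit bytes
    // only when they form a valid UTF-8 sequence.
    let bytes = Data([0x48, 0x69, 0xC3, 0xA9])            // "Hié" encoded as UTF-8
    print(String(data: bytes, encoding: .ascii) ?? "nil") // nil (high bit set)
    print(String(data: bytes, encoding: .utf8) ?? "nil")  // "Hié"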
This is now an issue with iPadOS 18. In iPadOS 17.7 my driver shows up in Settings just fine. After recompiling with Xcode 16 and installing my app (containing my driver) on iPadOS 18, the app shows up in Settings, but the driver-enable button is missing. When I plug in my custom USB device, the app cannot detect it.
Did you ever figure this out? I've been doing the same thing. I don't get any crashes, but when I hand the buffers off to LiveSwitch for playback, there is no audio signal. The pipeline is:
1. I receive the buffer from a tap on bus zero.
2. I send the buffer to the publisher.
3. The buffer is received, potentially by up to two consumers (currently one).
4. The buffer is converted with AVAudioConverter from Float32 to Int16, which is required for consumption by the LiveSwitch APIs (see the sketch below).
5. The buffer memory is converted to NSMutableData (required by LiveSwitch).
6. The buffer is wrapped in an FMLiveSwitchAudioFrame.
7. The buffer is raised to LiveSwitch for processing.
Result: No signal.
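For concreteness, here is a minimal sketch of the step-4 conversion, assuming the usual AVAudioConverter flow; convertToInt16 is a hypothetical helper, and the interleaved flag is an assumption about what LiveSwitch expects:

    import AVFoundation

    // Sketch: convert one Float32 buffer from the tap into interleaved Int16.
    func convertToInt16(_ input: AVAudioPCMBuffer) -> AVAudioPCMBuffer? {
        guard let int16Format = AVAudioFormat(commonFormat: .pcmFormatInt16,
                                              sampleRate: input.format.sampleRate,
                                              channels: input.format.channelCount,
                                              interleaved: true),
              let converter = AVAudioConverter(from: input.format, to: int16Format),
              let output = AVAudioPCMBuffer(pcmFormat: int16Format,
                                            frameCapacity: input.frameLength) else {
            return nil
        }
        var error: NSError?
        var delivered = false
        let status = converter.convert(to: output, error: &error) { _, outStatus in
            // Hand the source buffer over exactly once, then signal end of stream.
            if delivered {
                outStatus.pointee = .endOfStream
                return nil
            }
            delivered = true
            outStatus.pointee = .haveData
            return input
        }
        return (status == .error || error != nil) ? nil : output
    }

If the converted buffer still yields silence downstream, the interleaving and the byte count handed to NSMutableData are worth double-checking; both are easy to get subtly wrong.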
Can Apple weigh in on this issue? I'm currently receiving samples as Float values, and I also need signed 16-bit integer values. I don't want to convert these on the fly if I don't have to; ideally I could somehow get Int16 for free at the source. Apple: how do we make this happen in code?
@enodev Thank you for your response. However, the answer seems to skirt the question. I don't want to use both an Actor and a DispatchQueue, perhaps in some wrapped or nested form; I want to use one in place of the other. Your answer would seem to imply that this is not possible, and that using an Actor with AVFoundation APIs that must run off the main thread (for blocking reasons) would still need a DispatchQueue.
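For what it's worth, one way to get "an actor instead of a queue" on recent OS releases is a custom actor executor backed by a serial dispatch queue. This is a sketch assuming Swift 5.9 and macOS 14 / iOS 17 availability; CaptureController is a hypothetical name:

    import AVFoundation

    // Sketch: an actor whose isolated methods run on a private serial
    // DispatchQueue, so blocking AVFoundation work stays off the main
    // thread while callers still use plain async/await. Requires the
    // macOS 14 / iOS 17 SerialExecutor conformance on DispatchSerialQueue.
    actor CaptureController {
        private let queue = DispatchSerialQueue(label: "capture.session")

        // Route this actor's isolation onto the dispatch queue.
        nonisolated var unownedExecutor: UnownedSerialExecutor {
            queue.asUnownedSerialExecutor()
        }

        let session = AVCaptureSession()

        func start() {
            // Runs on `queue`, not the main thread.
            session.startRunning()
        }
    }

The upshot is that the queue becomes an implementation detail of the actor rather than something callers have to manage.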
Solved. Must call setNeedsDisplay from the main thread.
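In practice that just means hopping to the main queue before touching the view (view here stands for whatever view needs redrawing):

    DispatchQueue.main.async {
        view.setNeedsDisplay()  // UIKit drawing state must be updated on the main thread
    }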
It looks like the main culprit here is a lack of patience on our part. Another way of putting it is that the portal is slow when handling some operations and doesn't do the best job of reporting current status. In any case, coming back to the problem the next day showed the user and the portal in their appropriate states. We were able to continue with our test.
This problem has been solved. Please see this thread for details.
https://developer.apple.com/forums/thread/751490
Kevin, thank you so much for responding. My entire dev team just got together on a video call and spent 3 hours trying everything we could to figure this out. No personal reflection on you, but Apple MUST do better. The information you provided above jibes with most of what we discovered. The solution for us was to switch to automatic signing and also change the idVendor value from a String to a Number when moving from a wildcard value to a literal value. This appears to be necessary when putting a real Vendor ID in the entitlements file. This was a lot harder than it should have been.
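For anyone else hitting this, the working shape in the .entitlements file looks roughly like the following; 4660 is a placeholder for your assigned Vendor ID, and the key layout is the one documented for the DriverKit USB transport entitlement:

    <key>com.apple.developer.driverkit.transport.usb</key>
    <array>
        <dict>
            <key>idVendor</key>
            <!-- Placeholder; use your own Vendor ID, as a number rather than a string -->
            <integer>4660</integer>
        </dict>
    </array>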
I have the same question with regard to the USB transport entitlement. I've added my vendor ID to the appropriate entitlement files, but my binary fails validation when trying to upload it to the store for distribution. The embedded.mobileprovision file in the generated archive shows an asterisk instead of my approved Vendor ID. How can I make sure the embedded provisioning file has my Vendor ID?
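One way to see what actually got embedded (the path is illustrative; MyApp is a placeholder) is to decode the profile with the security tool and look for the com.apple.developer.driverkit.transport.usb entry under Entitlements:

    security cms -D -i MyApp.xcarchive/Products/Applications/MyApp.app/embedded.mobileprovision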
The iPad I'm trying to use is not an M1 or M2 iPad, which is a requirement for the DriverKit extensions. I have the generation just prior to the M1 iPad.
Ignore the API call I listed in the initial post; that is for macOS, not iPadOS. I actually support both platforms, but the issue I'm having is with the iPad. There we simply launch the Settings app using the following technique:
extension ExtensionManager {
    // On iPadOS there is no system-extension API to call directly; the user
    // enables the driver in the Settings app, so deep-link there.
    func activate() {
        guard let url = URL(string: UIApplication.openSettingsURLString) else {
            return
        }
        UIApplication.shared.open(url)
    }

    // Nothing to do programmatically; deactivation also happens in Settings.
    func deactivate() {
    }
}
Trial & error coupled with scouring the web has yielded at least a partial answer:
import AVFoundation
import Cocoa
import PlaygroundSupport

// Keep the playground alive so the engine can keep running.
PlaygroundPage.current.needsIndefiniteExecution = true

let engine = AVAudioEngine()
let inputNode = engine.inputNode
let format = inputNode.inputFormat(forBus: 0)
let outputNode = engine.outputNode

// Route the default input (microphone) straight to the output.
engine.connect(inputNode, to: outputNode, format: format)

// Optional: tap the input to confirm buffers are flowing.
//var bufferCount = 0
//inputNode.installTap(onBus: 0, bufferSize: 2048, format: nil) { audioBuffer, audioTime in
//    bufferCount += 1
//    print(bufferCount)
//}

engine.prepare()
do {
    try engine.start()
} catch {
    print(error)
}
Ridiculously simple but amazingly difficult to stumble upon. At least, that was my experience. Next, I need a way to enumerate the inputs so I can pick the microphone I want to use. Anyone have a clue on how that is accomplished on macOS? During my time researching this problem, I noticed that things are much easier and much better documented for iOS.
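Partially answering my own follow-up: one approach that should work, assuming AVCaptureDevice discovery covers audio devices on macOS, is a DiscoverySession. The device-type names below are the pre-macOS-14 spellings; newer SDKs rename them to .microphone and .external:

    import AVFoundation

    // Sketch: enumerate the audio capture devices macOS knows about.
    let discovery = AVCaptureDevice.DiscoverySession(
        deviceTypes: [.builtInMicrophone, .externalUnknown],
        mediaType: .audio,
        position: .unspecified)

    for device in discovery.devices {
        // uniqueID generally matches the Core Audio device UID, which is the
        // handle you need to point the engine's input at a specific device.
        print(device.localizedName, device.uniqueID)
    }

Actually selecting that device for AVAudioEngine is a separate step; on macOS it apparently means setting kAudioOutputUnitProperty_CurrentDevice on the inputNode's underlying audioUnit, though I haven't verified that end to end.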