AVFoundation has serious issues in iOS 16.1 beta 5. None of these issues appear in iOS 16.0 or earlier.
The following code fails regularly when switching between AVCaptureMultiCamSession and AVCaptureSession. It turns out that the assetWriter.canApply(outputSettings:forMediaType:) condition is false for no apparent reason.
if assetWriter?.canApply(outputSettings: audioSettings!, forMediaType: AVMediaType.audio) ?? false {
    // Never reached on iOS 16.1 beta 5 after switching sessions.
}
I dumped the audioSettings dictionary, and here it is:
It looks like the number of channels in AVAudioSession is 3, and that is the issue. But how did that happen? There is probably a bug where AVCaptureMultiCamSession teardown and deallocation corrupts the audio session.
When using AVAssetWriter with AVCaptureMultiCamSession, recordings frequently contain no audible audio under the same audio settings dictionary dumped above. The video has an audio track, but it is completely silent. The same code works perfectly on all other iOS versions. I verified that audio sample buffers are indeed vended during recording, so it is very likely they are silent buffers.
Is anyone aware of these issues?
We are getting this error on iOS 16.1 beta 5; we never saw it in any earlier iOS version.
[as_client] AVAudioSession_iOS.mm:2374 Failed to set category, error: 'what'
I wonder if there is any known workaround. iOS 16 has been a nightmare: a lot of AVFoundation code breaks or becomes unpredictable in behaviour. This issue is new in iOS 16.1.
It's very hard to tell from AVFoundation error codes what the exact error is. For instance, I see the following message and don't know what the error code means.
With a 4-channel audio input, I get an error while recording a movie using AVAssetWriter with the LPCM codec. Are 4 audio channels not supported by AVAssetWriterInput? Here are my compression settings:
var aclSize: size_t = 0
var currentChannelLayout: UnsafePointer<AudioChannelLayout>? = nil
/*
 * outputAudioFormat = CMSampleBufferGetFormatDescription(sampleBuffer)
 * for the latest sample buffer received in the captureOutput sample buffer delegate
 */
if let outputFormat = outputAudioFormat {
    currentChannelLayout = CMAudioFormatDescriptionGetChannelLayout(outputFormat, sizeOut: &aclSize)
}
var currentChannelLayoutData = Data()
if let currentChannelLayout = currentChannelLayout, aclSize > 0 {
    currentChannelLayoutData = Data(bytes: currentChannelLayout, count: aclSize)
}
let numChannels = AVAudioSession.sharedInstance().inputNumberOfChannels
var audioSettings: [String: Any] = [:]
audioSettings[AVSampleRateKey] = 48000.0
audioSettings[AVFormatIDKey] = kAudioFormatLinearPCM
audioSettings[AVLinearPCMIsBigEndianKey] = false
audioSettings[AVLinearPCMBitDepthKey] = 16
audioSettings[AVNumberOfChannelsKey] = numChannels
audioSettings[AVLinearPCMIsFloatKey] = false
audioSettings[AVLinearPCMIsNonInterleaved] = false
audioSettings[AVChannelLayoutKey] = currentChannelLayoutData
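As a sanity check, the settings can be validated with canApply(outputSettings:forMediaType:) before creating the AVAssetWriterInput, so an unsupported channel count fails early instead of mid-recording. This is a minimal sketch under assumptions: `makeAudioInput` is a hypothetical helper, and `writer` and `settings` stand in for the real objects above.

```swift
import AVFoundation

// Hypothetical helper: creates and attaches the audio input only if the
// writer accepts the settings; otherwise logs them for inspection.
func makeAudioInput(writer: AVAssetWriter,
                    settings: [String: Any]) -> AVAssetWriterInput? {
    guard writer.canApply(outputSettings: settings, forMediaType: .audio) else {
        print("Writer rejected audio settings: \(settings)")
        return nil
    }
    let input = AVAssetWriterInput(mediaType: .audio, outputSettings: settings)
    input.expectsMediaDataInRealTime = true
    guard writer.canAdd(input) else { return nil }
    writer.add(input)
    return input
}
```

If this returns nil for the 4-channel dictionary, the rejection is coming from the writer's settings validation itself rather than from the recording pipeline.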
I am unable to understand the sentence "Default is YES for all apps linked on or after iOS 15.4" found in Apple's documentation. What does it actually mean? Does it mean the default value is YES for apps running on iOS 15.4 or later? Or that it is YES if the app is built with the iOS 15.4 SDK or later? Or something else?
This is another strange issue on iOS 16, one among the many woes causing autorotation problems on the platform. At app startup there is a mismatch between -[UIViewController interfaceOrientation] (now deprecated), -[UIViewController supportedInterfaceOrientations] (which returns .landscapeRight), and -[UIWindowScene interfaceOrientation].
public var windowOrientation: UIInterfaceOrientation {
    return view.window?.windowScene?.interfaceOrientation ?? .unknown
}

override func viewDidLayoutSubviews() {
    super.viewDidLayoutSubviews()
    layoutInterfaceForOrientation(windowOrientation)
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    return .landscapeRight
}
Putting a breakpoint in viewDidLayoutSubviews reveals that windowOrientation is .portrait while interfaceOrientation is .landscapeRight.
Is there any way this can be fixed, or any workaround possible?
I get this error when I run my code on an iOS 14 device using Xcode 14.
dyld: Symbol not found: _$s7SwiftUI15GraphicsContextV4fill_4with5styleyAA4PathV_AC7ShadingVAA9FillStyleVtF
Referenced from: ******
Expected in: /System/Library/Frameworks/SwiftUI.framework/SwiftUI
I do not get any errors when I run the same code on an iOS 15 or iOS 16 device.
How do I resolve this issue?
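For context, the missing symbol demangles to SwiftUI.GraphicsContext.fill(_:with:style:), and GraphicsContext (with Canvas) is only available on iOS 15 and later, so a direct reference fails to load on iOS 14. A minimal sketch of gating the call site with an availability check (the MeterView body here is a hypothetical example, not the poster's code):

```swift
import SwiftUI

struct MeterView: View {
    var body: some View {
        if #available(iOS 15.0, *) {
            // GraphicsContext exists only on iOS 15+; inside an
            // availability check the symbol is weakly linked, so
            // iOS 14 takes the else branch instead of crashing at load.
            Canvas { context, size in
                let path = Path(CGRect(origin: .zero, size: size))
                context.fill(path, with: .color(.green))
            }
        } else {
            Color.green // fallback rendering for iOS 14
        }
    }
}
```

The same gating (or marking the enclosing type @available(iOS 15.0, *)) should apply anywhere GraphicsContext is referenced.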
iOS 16 has serious bugs in UIKit without any known workarounds, so much so that I had to remove my app from sale. Obviously I am affected and desperate to know if anyone has found a workaround, or if UIKit engineers can suggest one.
To summarise: calling setNeedsUpdateOfSupportedInterfaceOrientations shortly (within 0.5 seconds) after app launch doesn't trigger autorotation. It does trigger autorotation when the app is launched from the debugger, but not when the app is launched directly. Maybe the debugger incurs a delay that happens to be sufficient for autorotation to work? I have filed FB11516363 but desperately need a workaround to get my app back on the App Store.
I am seeing a strange autorotation issue on iOS 16 that is not seen in other iOS versions. Worse, the issue is NOT seen if I connect the device to Xcode and debug; it is ONLY seen when I launch the app directly on the device after installing it, which is why I am unable to identify a fix. Here is a summary of the issue.
I disable autorotation in the app until the camera session starts running. Once the camera session starts running, I fire a notification to force autorotation of the interface to the current device orientation.
var disableAutoRotation: Bool {
    return !cameraSessionRunning
}

override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
    var orientations: UIInterfaceOrientationMask = .landscapeRight
    if !self.disableAutoRotation {
        orientations = .all
    }
    return orientations
}
func cameraSessionStartedRunning(_ session: AVCaptureSession?) {
    DispatchQueue.main.asyncAfter(deadline: .now(), execute: {
        /*
         * HELP: this code takes effect only when debugging from Xcode,
         * not when launching the app directly on the device!
         */
        self.cameraSessionRunning = true
        if #available(iOS 16.0, *) {
            UIView.performWithoutAnimation {
                self.setNeedsUpdateOfSupportedInterfaceOrientations()
            }
        } else {
            // Fallback on earlier versions
            UIViewController.attemptRotationToDeviceOrientation()
        }
        self.layoutInterfaceForOrientation(self.windowOrientation)
    })
}
I see in Crashlytics that a few users are getting this exception when connecting the inputNode to the mainMixerNode in AVAudioEngine:
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: format.sampleRate == hwFormat.sampleRate
Here is my code:
self.engine = AVAudioEngine()
let format = engine.inputNode.inputFormat(forBus: 0)
// The main mixer node is connected to the output node by default.
engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
I just want to understand how this error can occur and what the right fix is.
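One plausible cause (an assumption, not something confirmed by the crash report): the input format was captured before an audio route change (for example, plugging in a headset), so its sample rate no longer matches the hardware format at connect time. A minimal defensive sketch that re-queries the hardware format immediately before connecting (`rebuildEngine` is a hypothetical name):

```swift
import AVFoundation

// Hypothetical defensive setup: query the input format at connect time,
// and bail out if the hardware reports an unusable format.
func rebuildEngine() -> AVAudioEngine? {
    let engine = AVAudioEngine()
    let hwFormat = engine.inputNode.inputFormat(forBus: 0)
    // A sample rate or channel count of 0 means no usable input
    // (e.g. a playback-only session); connecting would throw.
    guard hwFormat.sampleRate > 0, hwFormat.channelCount > 0 else {
        return nil
    }
    engine.connect(engine.inputNode, to: engine.mainMixerNode, format: hwFormat)
    return engine
}
```

Rebuilding the engine this way on every route change notification, rather than reusing a format captured earlier, keeps the connect format and the hardware format in sync.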
I have the following code to connect inputNode to mainMixerNode of AVAudioEngine:
public func setupAudioEngine() {
    self.engine = AVAudioEngine()
    let format = engine.inputNode.inputFormat(forBus: 0)
    // The main mixer node is connected to the output node by default.
    engine.connect(self.engine.inputNode, to: self.engine.mainMixerNode, format: format)
    do {
        engine.prepare()
        try self.engine.start()
    } catch {
        print("error couldn't start engine")
    }
    engineRunning = true
}
But I am seeing a crash in the Crashlytics dashboard which I can't reproduce.
Fatal Exception: com.apple.coreaudio.avfaudio
required condition is false: IsFormatSampleRateAndChannelCountValid(format)
Before calling setupAudioEngine, I make sure the AVAudioSession category is not playback (where the mic is unavailable). The function is called from the audio route change notification handler, and I check this condition specifically. Can someone tell me what I am doing wrong?
Fatal Exception: com.apple.coreaudio.avfaudio
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x17048c -[NSException initWithCoder:]
3 AVFAudio 0x9f64 AVAE_RaiseException(NSString*, ...)
4 AVFAudio 0x55738 AVAudioEngineGraph::_Connect(AVAudioNodeImplBase*, AVAudioNodeImplBase*, unsigned int, unsigned int, AVAudioFormat*)
5 AVFAudio 0x5cce0 AVAudioEngineGraph::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
6 AVFAudio 0xdf1a8 AVAudioEngineImpl::Connect(AVAudioNode*, AVAudioNode*, unsigned long, unsigned long, AVAudioFormat*)
7 AVFAudio 0xe0fc8 -[AVAudioEngine connect:to:format:]
8 MyApp 0xa6af8 setupAudioEngine + 701 (MicrophoneOutput.swift:701)
9 MyApp 0xa46f0 handleRouteChange + 378 (MicrophoneOutput.swift:378)
10 MyApp 0xa4f50 @objc MicrophoneOutput.handleRouteChange(note:)
11 CoreFoundation 0x2a834 __CFNOTIFICATIONCENTER_IS_CALLING_OUT_TO_AN_OBSERVER__
12 CoreFoundation 0xc6fd4 ___CFXRegistrationPost_block_invoke
13 CoreFoundation 0x9a1d0 _CFXRegistrationPost
14 CoreFoundation 0x408ac _CFXNotificationPost
15 Foundation 0x1b754 -[NSNotificationCenter postNotificationName:object:userInfo:]
16 AudioSession 0x56f0 (anonymous namespace)::HandleRouteChange(AVAudioSession*, NSDictionary*)
17 AudioSession 0x5cbc invocation function for block in avfaudio::AVAudioSessionPropertyListener(void*, unsigned int, unsigned int, void const*)
18 libdispatch.dylib 0x1e6c _dispatch_call_block_and_release
19 libdispatch.dylib 0x3a30 _dispatch_client_callout
20 libdispatch.dylib 0x11f48 _dispatch_main_queue_drain
21 libdispatch.dylib 0x11b98 _dispatch_main_queue_callback_4CF
22 CoreFoundation 0x51800 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__
23 CoreFoundation 0xb704 __CFRunLoopRun
24 CoreFoundation 0x1ebc8 CFRunLoopRunSpecific
25 GraphicsServices 0x1374 GSEventRunModal
26 UIKitCore 0x514648 -[UIApplication _run]
27 UIKitCore 0x295d90 UIApplicationMain
28 libswiftUIKit.dylib 0x30ecc UIApplicationMain(_:_:_:_:)
29 MyApp 0xc358 main (WhiteBalanceUI.swift)
30 ??? 0x104b1dce4 (Missing)
@AVFoundation engineers, my users are seeing the following crash, which I cannot reproduce at my end.
*** -[__NSArrayI objectAtIndexedSubscript:]: index 2 beyond bounds [0 .. 1]
Fatal Exception: NSRangeException
0 CoreFoundation 0x99288 __exceptionPreprocess
1 libobjc.A.dylib 0x16744 objc_exception_throw
2 CoreFoundation 0x1a431c -[__NSCFString characterAtIndex:].cold.1
3 CoreFoundation 0x4c96c CFDateFormatterCreateStringWithAbsoluteTime
4 AVFCapture 0x6cad4 -[AVCaptureConnection getAvgAudioLevelForChannel:]
All I am doing is this:
func updateMeters() {
    var channelCount = 0
    var decibels: [Float] = []
    if let audioConnection = self.audioConnection {
        for audioChannel in audioConnection.audioChannels {
            decibels.append(audioChannel.averagePowerLevel)
            channelCount = channelCount + 1
        }
    }
}
What am I doing wrong?
Can someone help me in identifying the source of this crash that I see in crash logs?
I have this code to adjust a UITextView when the keyboard is about to be displayed.
@objc func keyboardWillShow(notification: NSNotification) {
    if let rectValue = notification.userInfo?[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue {
        let keyboardSize = rectValue.cgRectValue.size
        let contentInsets = UIEdgeInsets(top: 0, left: 0, bottom: keyboardSize.height - view.safeAreaInsets.bottom, right: 0)
        textView.contentInset = contentInsets
        textView.scrollIndicatorInsets = contentInsets
    }
}
Setting contentInset automatically triggers a scrolling animation (after the keyboard fully appears) that makes the text at the cursor visible. The animation happens implicitly, and there seems to be no way to disable it. Sometimes I want to scroll to a given location in the text view when the keyboard is presented (preferably animated alongside the keyboard appearance). The implicit contentInset animation makes that impossible: first the keyboard animates, then contentInset animates, and only then can I animate to my chosen location. This is awkward. What is the right way to achieve this?
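One workaround worth trying (a sketch, not a confirmed fix: `desiredOffset` is a hypothetical property, and whether this fully suppresses UIKit's implicit scroll is not guaranteed) is to apply both the insets and the target content offset inside an animation that mirrors the keyboard's own duration and curve, so there is nothing left for the system to animate afterwards:

```swift
import UIKit

@objc func keyboardWillShow(notification: NSNotification) {
    guard let info = notification.userInfo,
          let endFrame = (info[UIResponder.keyboardFrameEndUserInfoKey] as? NSValue)?.cgRectValue,
          let duration = info[UIResponder.keyboardAnimationDurationUserInfoKey] as? Double,
          let curveRaw = info[UIResponder.keyboardAnimationCurveUserInfoKey] as? UInt
    else { return }

    let insets = UIEdgeInsets(top: 0, left: 0,
                              bottom: endFrame.height - view.safeAreaInsets.bottom,
                              right: 0)
    // Map the keyboard's animation curve into UIView.AnimationOptions.
    let options = UIView.AnimationOptions(rawValue: curveRaw << 16)
    UIView.animate(withDuration: duration, delay: 0, options: options) {
        self.textView.contentInset = insets
        self.textView.scrollIndicatorInsets = insets
        // Position the content explicitly, alongside the keyboard,
        // instead of letting the implicit scroll pick a target.
        self.textView.setContentOffset(self.desiredOffset, animated: false)
    }
}
```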
This issue is driving me crazy. I load an NSAttributedString into a UITextView, and moments after loading, the foregroundColor attribute of the text is erased (i.e. becomes white) without me doing anything. Here are the code and the NSLog dump. How do I debug this, I wonder?
class ScriptEditingView: UITextView, UITextViewDelegate {
    var defaultFont = UIFont.preferredFont(forTextStyle: .body)
    var defaultTextColor = UIColor.white

    private func commonInit() {
        self.font = UIFont.preferredFont(forTextStyle: .body)
        self.allowsEditingTextAttributes = true
        self.textColor = defaultTextColor
        self.backgroundColor = UIColor.black
        self.isOpaque = true
        self.isEditable = true
        self.isSelectable = true
        self.dataDetectorTypes = []
        self.showsHorizontalScrollIndicator = false
    }
}
And then in my ViewController that contains the UITextView, I have this code:
textView = ScriptEditingView(frame: newTextViewRect, textContainer: nil)
textView.delegate = self
view.addSubview(textView)
textView.allowsEditingTextAttributes = true

let guide = view.safeAreaLayoutGuide
textView.translatesAutoresizingMaskIntoConstraints = false
NSLayoutConstraint.activate([
    textView.leadingAnchor.constraint(equalTo: guide.leadingAnchor),
    textView.trailingAnchor.constraint(equalTo: guide.trailingAnchor),
    textView.topAnchor.constraint(equalTo: view.topAnchor),
    textView.bottomAnchor.constraint(equalTo: view.bottomAnchor)
])
textView.attributedText = attributedString

NSLog("Attributed now")
dumpAttributesOfText()
DispatchQueue.main.asyncAfter(deadline: .now() + 1.0) {
    NSLog("Attributes after 1 sec")
    self.dumpAttributesOfText()
}
And here is code to dump attributes of text:
private func dumpAttributesOfText() {
    textView.attributedText?.enumerateAttributes(in: NSRange(location: 0, length: textView.attributedText!.length), options: .longestEffectiveRangeNotRequired, using: { dictionary, range, stop in
        NSLog("range \(range)")
        if let font = dictionary[.font] as? UIFont {
            NSLog("Font at range \(range) - \(font.fontName), \(font.pointSize)")
        }
        if let foregroundColor = dictionary[.foregroundColor] as? UIColor {
            NSLog("Foreground color \(foregroundColor) at range \(range)")
        }
        if let underline = dictionary[.underlineStyle] as? Int {
            NSLog("Underline \(underline) at range \(range)")
        }
    })
}
The logs show this:
2022-07-02 13:16:02.841199+0400 MyApp[12054:922491] Attributed now
2022-07-02 13:16:02.841370+0400 MyApp[12054:922491] range {0, 14}
2022-07-02 13:16:02.841486+0400 MyApp[12054:922491] Font at range {0, 14} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841586+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 14}
2022-07-02 13:16:02.841681+0400 MyApp[12054:922491] range {14, 6}
2022-07-02 13:16:02.841770+0400 MyApp[12054:922491] Font at range {14, 6} - HelveticaNeue, 30.0
2022-07-02 13:16:02.841855+0400 MyApp[12054:922491] Foregroundcolor kCGColorSpaceModelRGB 0.96863 0.80784 0.27451 1 at range {14, 6}
2022-07-02 13:16:03.934816+0400 MyApp[12054:922491] Attributes after 1 sec
2022-07-02 13:16:03.935087+0400 MyApp[12054:922491] range {0, 20}
2022-07-02 13:16:03.935183+0400 MyApp[12054:922491] Font at range {0, 20} - HelveticaNeue, 30.0
2022-07-02 13:16:03.935255+0400 MyApp[12054:922491] Foregroundcolor UIExtendedGrayColorSpace 1 1 at range {0, 20}