On iOS 14, when we have an AVCaptureSession set up with AVCaptureVideoDataOutput, AVCaptureAudioDataOutput, and a RemoteIO unit configured, AVCaptureSession sometimes takes up to 3-4 seconds to start, especially when also changing the AVAudioSession category from playback to playAndRecord. Is this because the initial iOS 14 release is slow and buggy overall, or is it something specific to AVFoundation?
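For reference, a minimal sketch of the setup pattern that usually keeps the start latency off the main thread: configure the AVAudioSession category once up front rather than toggling playback <-> playAndRecord around each start (category changes can trigger an expensive audio-route reconfiguration), and call `startRunning()` on a serial background queue. This is a hedged sketch, not a confirmed fix for the 3-4 second delay:

```swift
import AVFoundation

let sessionQueue = DispatchQueue(label: "capture.session")
let captureSession = AVCaptureSession()

func startCapture() {
    sessionQueue.async {
        // Set the category once, before starting the session, instead of
        // switching categories around every start/stop.
        try? AVAudioSession.sharedInstance().setCategory(
            .playAndRecord, mode: .videoRecording, options: [])
        // startRunning() is blocking; keep it off the main thread.
        captureSession.startRunning()
    }
}
```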
I see a weird bug. I have the following code:
	func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
		application.isIdleTimerDisabled = true
		...
However, this has no effect on some iOS devices. This was not the case when the code was built with Xcode 11.x. Is this a known bug?
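A hedged workaround sketch: in some reports the flag set this early in launch appears to get ignored, and re-asserting it once the app is active is a common mitigation. The delegate method below is standard UIKit, but the workaround itself is an assumption, not a confirmed fix:

```swift
import UIKit

// Re-assert the idle timer setting once the app is fully active,
// in case the value set in didFinishLaunching was not honored.
func applicationDidBecomeActive(_ application: UIApplication) {
    application.isIdleTimerDisabled = true
}
```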
Dear AVFoundation Engineers & other AVFoundation developers,
In the context of a multilayer video editing timeline with 4 or more layers, I want to know if it is a problem to have just one AVVideoCompositionInstruction for the entire time range of the timeline. Its requiredSourceTrackIDs would list all the tracks added to the AVMutableComposition, containsTweening would be true, etc. Then at any frame time, the custom compositor could consult its own internal data structures and blend the video frames of the different tracks as required. Is there anything wrong with this approach from a performance perspective, especially on newer iOS devices (iPhone 7 or later)?
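For concreteness, a sketch of the single-instruction approach. Because requiredSourceTrackIDs and containsTweening are read-only on AVMutableVideoCompositionInstruction, the usual route with a custom compositor is a custom conformer to AVVideoCompositionInstructionProtocol (the class name here is hypothetical):

```swift
import AVFoundation

// One instruction spanning the whole timeline; the custom compositor
// decides per-frame how to blend the listed source tracks.
final class WholeTimelineInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true                  // re-render every frame
    let requiredSourceTrackIDs: [NSValue]?       // all composition tracks
    let passthroughTrackID = kCMPersistentTrackID_Invalid

    init(timeRange: CMTimeRange, trackIDs: [CMPersistentTrackID]) {
        self.timeRange = timeRange
        self.requiredSourceTrackIDs = trackIDs.map { NSNumber(value: $0) }
    }
}
```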
Ours is a video camera app. Our app displays the following prompt for the Microphone Usage permission (Privacy - Microphone Usage Description): "The app needs Microphone Access to record Audio"
But now Apple seems to be rejecting it.
Guideline 5.1.1 - Legal - Privacy - Data Collection and Storage
"We noticed that your app requests the user’s consent to access their microphone but does not clarify the use of the microphone in the applicable purpose string."
Wondering what is not obvious here. A video recorder app needs microphone access to record audio, nothing more, nothing less! How do I fix it? The app has used this string for 8 years and no such concern was raised before.
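One possible reading of the rejection is that the reviewer wants the purpose string to name the user-visible feature rather than restate that access is needed. A suggested rewording (the exact text is an assumption, not confirmed guidance from review):

```xml
<key>NSMicrophoneUsageDescription</key>
<string>The microphone records the audio track of the videos you capture with this app.</string>
</xml fragment shown for the Info.plist entry only>
```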
MTLCommandBuffer's present(_:afterMinimumDuration:) is not found in Xcode 12 GM. The following call on MTLCommandBuffer throws a compilation error:
	commandBuffer.present(drawable, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))
Incorrect argument label in call (have ':afterMinimumDuration:', expected ':atTime:')
What is the fix?
I am converting HEVC video to H.264 video using AVFoundation APIs. However, I have two problems:
How do I detect the encoder used in the input file using AVFoundation?
How do I calculate a bitrate for the output H.264 file that matches the quality of the input HEVC file? I need to pass this bitrate to the AVAssetWriter compression settings dictionary.
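A hedged sketch of how both questions can be approached with real AVFoundation/CoreMedia calls: the codec FourCC is carried in the track's format descriptions, and estimatedDataRate gives a starting point for the output bitrate (the 1.5-2x scaling heuristic is an assumption, not an Apple recommendation):

```swift
import AVFoundation

func inspectVideoTrack(url: URL) {
    let asset = AVAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else { return }

    // Each format description carries the codec FourCC ('hvc1', 'avc1', ...).
    for untyped in track.formatDescriptions {
        let desc = untyped as! CMFormatDescription
        if CMFormatDescriptionGetMediaSubType(desc) == kCMVideoCodecType_HEVC {
            print("Input is HEVC")
        }
    }

    // estimatedDataRate is in bits per second. H.264 generally needs a
    // higher bitrate than HEVC for comparable quality; ~1.5-2x is a
    // common rule of thumb, not a guarantee.
    let suggestedH264Bitrate = track.estimatedDataRate * 1.8
    print("Suggested H.264 bitrate: \(suggestedH264Bitrate) bps")
}
```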
This may sound dumb, but if the app applies 100 different CIFilters to video frames at different times during playback, should all the CIFilters be preallocated and initialized, or is it okay to create CIFilters as and when needed? I have always created them up front before playback begins, but I need to know if there is any performance difference if I create them on demand during playback.
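A middle-ground sketch between the two options: create filters lazily but cache them by name. CIFilter objects are lightweight parameter holders; the expensive shared state lives in the CIContext, so on-demand creation is usually cheap, and a cache keeps repeated lookups off the per-frame path (the class here is hypothetical):

```swift
import CoreImage

// Lazily create and cache CIFilters by name instead of
// preallocating all of them before playback begins.
final class FilterCache {
    private var cache: [String: CIFilter] = [:]

    func filter(named name: String) -> CIFilter? {
        if let cached = cache[name] { return cached }
        let filter = CIFilter(name: name)  // nil for unknown names
        cache[name] = filter
        return filter
    }
}
```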
When using AVCaptureVideoDataOutput/AVCaptureAudioDataOutput and AVAssetWriter to record video with cinematic extended video stabilization, the audio lags the video by up to 1-1.5 seconds, and as a result the video playback is frozen for the last 1-1.5 seconds of the recording. This does not happen when using AVCaptureMovieFileOutput. Can this be fixed, or is there a workaround to synchronize the audio/video frames? How does AVCaptureMovieFileOutput handle it?
This is a strange bug I am encountering on iOS 14, and it is very hard to replicate in a standalone project. I sometimes get the following glitch when my app loads on the iOS 14 betas that never happened on previous iOS versions. The attached GIF shows the issue.
GIF - https://i.stack.imgur.com/qvaE8.gif
What I have is a subview called topPanel.
@IBOutlet weak var topPanel: UIView!
It is set to white color with alpha 1 in the Storyboard.
Then in viewDidLoad, I set it to clear color as follows:
topPanel.backgroundColor = UIColor.white.withAlphaComponent(0.0)
Next, in other places in code, such as viewDidLayoutSubviews:
	override func viewDidLayoutSubviews() {
		super.viewDidLayoutSubviews()
		self.cameraUI = CameraUIType.singleCam
	}

	private var cameraUI = CameraUIType.singleCam {
		didSet {
			DispatchQueue.main.async {
				self.topPanelState = .none
			}
		}
	}

	private var topPanelState = TopPanelState.none {
		didSet {
			switch topPanelState {
			case .none:
				topPanel.isHidden = false //Setting this to true doesn't cause bug
			...
			}
		}
	}
Any ideas where this glitch could be coming from?
I need to know if there is a hard limit on the number of tracks that can be added to AVMutableComposition using addMutableTrack(withMediaType:preferredTrackID:). That is, if I want to make a collage of 9 videos, I understand I will need to add 9 tracks to the AVMutableComposition and use appropriate AVCompositionInstructions to compose the video. How do I know the maximum number of tracks that can be added simultaneously and is supported by the system? Or is this number device dependent? I have read that the number of simultaneous decoders that can run on iOS is 16, but that is not stated in the official documents.
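For reference, a sketch of the track-adding step for the 9-video collage. addMutableTrack(withMediaType:preferredTrackID:) returns an optional, so a nil result is at least one detectable failure mode, though it does not reveal any decoder limit:

```swift
import AVFoundation

let composition = AVMutableComposition()
var videoTracks: [AVMutableCompositionTrack] = []

// One video track per clip in the 9-video collage.
for _ in 0..<9 {
    if let track = composition.addMutableTrack(
        withMediaType: .video,
        preferredTrackID: kCMPersistentTrackID_Invalid) {
        videoTracks.append(track)
    } else {
        print("Track could not be added")
    }
}
```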
Is there a way to remove background noise from audio samples coming from the microphone (RemoteIO unit), like Apple does with AirPods during a phone call? Which AudioUnit is most effective for this? Does the Phone app use the voice-processing audio unit rather than RemoteIO?
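One concrete option to try: the voice-processing variant of the I/O unit (kAudioUnitSubType_VoiceProcessingIO) provides Apple's echo cancellation and noise suppression, and since iOS 13 it can be enabled through AVAudioEngine without dropping down to raw AudioUnit calls. A hedged sketch; whether it matches what the Phone app does is an assumption:

```swift
import AVFoundation

let engine = AVAudioEngine()
do {
    // Switches the input node to the voice-processing I/O unit,
    // enabling built-in echo cancellation and noise suppression.
    try engine.inputNode.setVoiceProcessingEnabled(true)
} catch {
    print("Could not enable voice processing: \(error)")
}
```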
Dear AVFoundation Engineers, I get this error, which is sometimes reproducible, while using AVAssetWriter to record in a multicamera session.
Error -11800
Localized Description: "The operation could not be completed"
Localized Failure Reason: "An unknown error occurred -16364"
And I checked the timestamps for consistency. Here are the timestamps logged, in order, before the frames are appended to the AVAssetWriter; this is the trailing log of the last few frames' timestamps.
2020-08-12 14:41:29.255527+0530 MyApp[1474:1750168] Frame time 126866.62201475
2020-08-12 14:41:29.282100+0530 MyApp[1474:1750174] Frame time 126866.655337833
2020-08-12 14:41:29.313364+0530 MyApp[1474:1750168] Frame time 126866.688661083
2020-08-12 14:41:29.355942+0530 MyApp[1474:1750231] Frame time 126866.721984166
2020-08-12 14:41:29.387164+0530 MyApp[1474:1750174] Frame time 126866.755307333
2020-08-12 14:41:29.423126+0530 MyApp[1474:1750174] Frame time 126866.788630375
2020-08-12 14:41:29.452698+0530 MyApp[1474:1750172] Frame time 126866.821953791
2020-08-12 14:41:29.490752+0530 MyApp[1474:1750175] Frame time 126866.855276791
2020-08-12 14:41:29.518082+0530 MyApp[1474:1750168] Frame time 126866.888599958
2020-08-12 14:41:29.553148+0530 MyApp[1474:1750175] Frame time 126866.921922916
2020-08-12 14:41:29.584810+0530 MyApp[1474:1750168] Frame time 126866.955246333
2020-08-12 14:41:29.620997+0530 MyApp[1474:1750174] Frame time 126866.988569333
2020-08-12 14:41:29.650584+0530 MyApp[1474:1750168] Frame time 126867.021892291
2020-08-12 14:41:29.684804+0530 MyApp[1474:1750175] Frame time 126867.05521575
2020-08-12 14:41:29.724279+0530 MyApp[1474:1750172] Frame time 126867.08853875
2020-08-12 14:41:29.747264+0530 MyApp[1474:1750175] Frame time 126867.121862208
2020-08-12 14:41:29.781779+0530 MyApp[1474:1750172] Frame time 126867.155184916
2020-08-12 14:41:29.817972+0530 MyApp[1474:1750175] Frame time 126867.18850825
2020-08-12 14:41:29.851997+0530 MyApp[1474:1750172] Frame time 126867.221831458
2020-08-12 14:41:29.885758+0530 MyApp[1474:1750175] Frame time 126867.2551545
2020-08-12 14:41:29.916920+0530 MyApp[1474:1750172] Frame time 126867.288477791
2020-08-12 14:41:29.963321+0530 MyApp[1474:1750172] Frame time 126867.31711775
2020-08-12 14:41:29.992338+0530 MyApp[1474:1750172] Frame time 126867.350480875
2020-08-12 14:41:30.029164+0530 MyApp[1474:1750168] Frame time 126867.383843583
2020-08-12 14:41:30.068155+0530 MyApp[1474:1750175] Frame time 126867.417207541
2020-08-12 14:41:30.095844+0530 MyApp[1474:1750175] Frame time 126867.450570833
2020-08-12 14:41:30.133841+0530 MyApp[1474:1750168] Frame time 126867.483933791
2020-08-12 14:41:30.168765+0530 MyApp[1474:1750175] Frame time 126867.517297208
2020-08-12 14:41:30.198893+0530 MyApp[1474:1750168] Frame time 126867.550660416
2020-08-12 14:41:30.235167+0530 MyApp[1474:1750168] Frame time 126867.584023666
2020-08-12 14:41:30.257920+0530 MyApp[1474:1750168] Frame time 126867.617387208
2020-08-12 14:41:30.259389+0530 MyApp[1474:1750009] Error Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-16364), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x11eb6f2d0 {Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"}}, Info ["NSLocalizedDescription": The operation could not be completed, "NSLocalizedFailureReason": An unknown error occurred (-16364), "NSUnderlyingError": Error Domain=NSOSStatusErrorDomain Code=-16364 "(null)"]
This is strange and I can't find anything about it on the internet. I have integrated Firebase Crashlytics and Analytics in my camera app, which uses AVFoundation. Once the available space on the device drops below 500 MB, weird things start happening. The moment I debug my app, within minutes the disk usage on the device starts shooting up until there is no space left. On an iOS 14 beta 4 device, this caused iOS to crash and get stuck in a boot loop forever. But today I was able to reproduce the issue even on another device running iOS 13.5, which is strange. I managed to kill the app and started deleting files and apps to clear space. Is anyone aware of this issue? The behaviour occurs when there is less than 500 MB of storage left on the device. For the iOS 14 beta, I have filed a bug with Apple, but now that I am seeing the same issue on another device running iOS 13, it feels strange. Is there any way to debug why disk space is being consumed when the app is opened? Although it's an iOS issue that occurs when available disk space is below 500 MB, I still need to find out what is eating disk space on the device in the background.
EDIT: I see I had Malloc Guard and Malloc Stack enabled, along with Zombie Objects, in the scheme. Does that cause issues with disk space?
Dear Apple Engineers,
This has happened twice with an iPhone 11 Pro running iOS 14 beta 4. I have now filed bug FB8334182 with recorded videos and photos attached, because once the device enters this state, nothing works: sysdiagnose fails to capture, the iPhone can no longer be synced to a Mac, and all apps either crash or freeze. Deleting files from the system has no effect. And honestly, it is not clear why the iPhone's storage usage is increasing. The app I was debugging started showing disk usage increases in the GBs when the app was not saving anything anywhere. Now the iPhone is stuck in a boot loop at the Apple logo, and the only option is a hard reset with a fresh install of iOS.
There is no option in Feedback Assistant to report that the issue is with iOS itself; I had to report it against UIKit instead!