Posts

Post not yet marked as solved
2 Replies
1.2k Views
Is it possible to have a horizontally scrolling 2D UICollectionView with cells that vary in width but share the same height, without subclassing UICollectionViewLayout? Some items may even be wider than the collection view's frame, no two items are the same size in general, and the gap between successive items may also vary. The WWDC videos say compositional layout can achieve anything we can imagine, but I'm not sure how to do this.
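One possible direction, offered as a minimal sketch rather than a confirmed answer: a compositional layout section with .estimated item widths and continuous orthogonal scrolling lets self-sizing cells drive the width while the group height stays fixed. The estimated dimension, the 44-point height, and the spacing value below are placeholder assumptions.

import UIKit

// Sketch: one horizontally scrolling section whose items self-size in width
// but share a fixed height. Estimated width, height, and spacing are placeholders.
func makeHorizontalVariableWidthLayout() -> UICollectionViewCompositionalLayout {
    let itemSize = NSCollectionLayoutSize(widthDimension: .estimated(100),
                                          heightDimension: .fractionalHeight(1.0))
    let item = NSCollectionLayoutItem(layoutSize: itemSize)

    // The group also uses an estimated width so a cell can grow,
    // even beyond the collection view's frame width.
    let groupSize = NSCollectionLayoutSize(widthDimension: .estimated(100),
                                           heightDimension: .absolute(44))
    let group = NSCollectionLayoutGroup.horizontal(layoutSize: groupSize, subitems: [item])

    let section = NSCollectionLayoutSection(group: group)
    section.interGroupSpacing = 8 // variable per-item gaps would need per-cell insets instead
    section.orthogonalScrollingBehavior = .continuous
    return UICollectionViewCompositionalLayout(section: section)
}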
Posted by Devgeek. Last updated.
Post not yet marked as solved
1 Reply
447 Views
Dear Apple Engineers, there seems to be incomplete documentation for these new AVFoundation methods:

@available(iOS 13.0, *)
optional func anticipateRendering(using renderHint: AVVideoCompositionRenderHint)

@available(iOS 13.0, *)
optional func prerollForRendering(using renderHint: AVVideoCompositionRenderHint)

How exactly do we use these methods, when exactly are they called, how much time can we take to load textures from memory in them, and what is the correct usage? Can anyone explain?
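For context, here is a minimal sketch of how a custom compositor might adopt these hints. The class name, the pixel format, and the idea of warming caches inside the hint callbacks are assumptions for illustration, not documented guidance.

import AVFoundation

// Hypothetical custom compositor showing where the render-hint callbacks sit
// relative to startRequest(_:). The preloading idea is an assumption, not documented behavior.
class MyCustomCompositor: NSObject, AVVideoCompositing {
    var sourcePixelBufferAttributes: [String: Any]? =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]
    var requiredPixelBufferAttributesForRenderContext: [String: Any] =
        [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA]

    func renderContextChanged(_ newRenderContext: AVVideoCompositionRenderContext) {}

    func anticipateRendering(using renderHint: AVVideoCompositionRenderHint) {
        // The hint carries the composition time range that is about to be rendered;
        // presumably a chance to warm caches (e.g. upload textures) off the critical path.
        _ = renderHint.startCompositionTime
        _ = renderHint.endCompositionTime
    }

    func prerollForRendering(using renderHint: AVVideoCompositionRenderHint) {
        // Similar hint delivered before playback begins.
    }

    func startRequest(_ request: AVAsynchronousVideoCompositionRequest) {
        // Real blending work would happen here; this sketch just returns a blank buffer.
        guard let buffer = request.renderContext.newPixelBuffer() else {
            request.finishCancelledRequest()
            return
        }
        request.finish(withComposedVideoFrame: buffer)
    }
}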
Posted by Devgeek. Last updated.
Post not yet marked as solved
2 Replies
1.2k Views
When in multi-selection mode (selectionLimit = 0), PHPickerViewController almost every time interprets a scroll or swipe gesture as a touch and selects the entire row of photos. It is so buggy that users do not know which photos they inadvertently selected! While it is easy to use third-party frameworks for this job, the biggest advantage of this picker is that it doesn't need the complex Photo Library permissions that are confusing users on iOS 14. Are Apple engineers even aware of this serious bug, which makes the picker almost useless in multi-selection mode?
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
972 Views
PHPickerViewController allows access to copies of photo library assets as well as returning PHAssets in the results. To get PHAssets instead of file copies, I do:

let photolibrary = PHPhotoLibrary.shared()
var configuration = PHPickerConfiguration(photoLibrary: photolibrary)
configuration.filter = .videos
configuration.selectionLimit = 0
let picker = PHPickerViewController(configuration: configuration)
picker.delegate = self
self.present(picker, animated: true, completion: nil)

And then:

func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]) {
    picker.dismiss(animated: true) {
        let identifiers: [String] = results.compactMap(\.assetIdentifier)
        let fetchResult = PHAsset.fetchAssets(withLocalIdentifiers: identifiers, options: nil)
        NSLog("\(identifiers), \(fetchResult)")
    }
}

But the problem is that once the photo picker is dismissed, it prompts for Photo Library access, which is confusing. Since the user has already implicitly granted access to the selected assets in PHPickerViewController, PHPhotoLibrary should load those assets directly. Is there any way to avoid the Photo Library permission prompt? The other option, copying the assets into the app, is a waste of space for editing applications.
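For what it's worth, the prompt appears to come from the PHAsset.fetchAssets call rather than from the picker itself, since any fetch through PHPhotoLibrary needs its own authorization. A minimal sketch of gating the fetch on an explicit authorization check; the .readWrite access level, the helper name, and the item-provider fallback are assumptions for illustration:

import Photos
import PhotosUI
import UniformTypeIdentifiers

// Hypothetical helper: fetch PHAssets only when library access is already granted,
// otherwise fall back to loading item providers from the picker results.
func handle(results: [PHPickerResult]) {
    let identifiers = results.compactMap(\.assetIdentifier)
    let status = PHPhotoLibrary.authorizationStatus(for: .readWrite)
    if status == .authorized || status == .limited {
        let assets = PHAsset.fetchAssets(withLocalIdentifiers: identifiers, options: nil)
        NSLog("Fetched \(assets.count) assets without a new prompt")
    } else {
        // Without library authorization, the item providers are the only
        // prompt-free path, at the cost of copying the media.
        for result in results {
            result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { url, error in
                NSLog("Copied file: \(String(describing: url)), error: \(String(describing: error))")
            }
        }
    }
}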
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
415 Views
I am using the following code to request Photo Library access with the add-only prompt:

PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
    handler(status)
}

However, this still shows the wrong prompt, "App would Like to Access Your Photos". Why is that, and what can I do to show "App would like to Add to Your Photos", as shown in the WWDC video?
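A minimal sketch of the add-only flow for comparison; the assumption here (not confirmed as the cause of the wrong prompt) is that the add-only prompt text is driven by the NSPhotoLibraryAddUsageDescription key in Info.plist, so both the access level and that key would need to be in place:

import Photos

// Sketch: request add-only access and inspect the returned status.
// Assumes NSPhotoLibraryAddUsageDescription is present in Info.plist
// and that no earlier code path already requested .readWrite access.
func requestAddOnlyAccess(completion: @escaping (Bool) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .addOnly) { status in
        DispatchQueue.main.async {
            switch status {
            case .authorized, .limited:
                completion(true)
            default:
                completion(false)
            }
        }
    }
}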
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
311 Views
PHPhotoLibrary gives an AVComposition rather than an AVURLAsset for a video recorded in Slo-mo. I want to insert slo-mo videos into another AVMutableComposition, which means I need to insert this AVComposition into the AVMutableComposition that is my editing timeline. The hack I used before was to load the tracks and segments and find the media URL of the asset:

AVCompositionTrack *track = [avAsset tracks][0];
AVCompositionTrackSegment *segment = track.segments[0];
mediaURL = [segment sourceURL];

Once I had mediaURL, I was able to create a new AVAsset that could be inserted into the AVMutableComposition. But I wonder if there is a cleaner approach that allows the slo-mo video composition to be inserted directly into the timeline AVMutableComposition?
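One possibility worth trying, offered only as a sketch: AVMutableComposition's insertTimeRange(_:of:at:) is declared to take any AVAsset, so passing the slo-mo AVComposition directly might work; whether it preserves the variable-speed track segments is an assumption that would need testing.

import AVFoundation

// Sketch: insert a slo-mo AVComposition straight into an editing timeline.
// Whether the retimed segments survive this path is untested here.
func append(sloMoAsset: AVAsset, to timeline: AVMutableComposition) throws {
    let fullRange = CMTimeRange(start: .zero, duration: sloMoAsset.duration)
    try timeline.insertTimeRange(fullRange,
                                 of: sloMoAsset,
                                 at: timeline.duration)
}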
Posted by Devgeek. Last updated.
Post not yet marked as solved
3 Replies
397 Views
Dear AVFoundation engineers and other AVFoundation developers, in the context of a multilayer video editing timeline with 4 or more layers, is it a problem to have just one AVVideoCompositionInstruction covering the entire time range of the timeline? requiredSourceTrackIDs would contain all the tracks added to the AVMutableComposition, containsTweening would be true, and so on. Then, at any frame time, the custom compositor could consult its own internal data structures and blend the video frames of the different tracks as required. Is there anything wrong with this approach from a performance perspective, especially on newer iOS devices (iPhone 7 or later)?
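For reference, a minimal sketch of what such a single full-timeline instruction might look like; the class name and the way the track IDs are gathered are assumptions for illustration:

import AVFoundation

// Hypothetical single instruction spanning the whole timeline for a custom compositor.
class FullTimelineInstruction: NSObject, AVVideoCompositionInstructionProtocol {
    let timeRange: CMTimeRange
    let enablePostProcessing = false
    let containsTweening = true
    let requiredSourceTrackIDs: [NSValue]?
    let passthroughTrackID = kCMPersistentTrackID_Invalid // never passthrough; always composited

    init(composition: AVMutableComposition) {
        timeRange = CMTimeRange(start: .zero, duration: composition.duration)
        requiredSourceTrackIDs = composition.tracks(withMediaType: .video)
            .map { NSNumber(value: $0.trackID) }
        super.init()
    }
}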
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
255 Views
I see a weird bug. I have the following code:

func application(_ application: UIApplication, didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?) -> Bool {
    application.isIdleTimerDisabled = true
    ...

However, this has no effect on some iOS devices. This was not the case before, when the code was built with Xcode 11.x. Is this a known bug?
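One workaround worth trying, offered only as a sketch: re-assert the flag once the app becomes active, on the assumption (not documented) that the value set during launch is being reset by the system afterwards.

import UIKit

// Sketch: re-assert the idle timer flag in the existing app delegate
// after the app becomes active, in case the value set in didFinishLaunching is lost.
class AppDelegate: UIResponder, UIApplicationDelegate {
    var window: UIWindow?

    func applicationDidBecomeActive(_ application: UIApplication) {
        application.isIdleTimerDisabled = true
    }
}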
Posted by Devgeek. Last updated.
Post not yet marked as solved
2 Replies
1.8k Views
Ours is a video camera app. Our app displays the following microphone usage prompt (Privacy - Microphone Usage Description): "The app needs Microphone Access to record Audio". But now Apple seems to be rejecting it under Guideline 5.1.1 - Legal - Privacy - Data Collection and Storage: "We noticed that your app requests the user's consent to access their microphone but does not clarify the use of the microphone in the applicable purpose string." I'm wondering what is not obvious here. A video recorder app needs microphone access to record audio, nothing more, nothing less! How do I fix it? It has been 8 years now, and no such concern was raised before.
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
274 Views
MTLCommandBuffer's present(_:afterMinimumDuration:) is not found in Xcode 12 GM. The following call on MTLCommandBuffer throws a compilation error:

commandBuffer.present(drawable, afterMinimumDuration: 1.0 / Double(self.preferredFramesPerSecond))

Incorrect argument label in call (have ':afterMinimumDuration:', expected ':atTime:')

What is the fix?
Posted by Devgeek. Last updated.
Post not yet marked as solved
0 Replies
595 Views
I am converting HEVC video to H.264 video using AVFoundation APIs. However, I have two problems: How do I detect the codec used in the input file using AVFoundation? And how do I calculate a bitrate for the output H.264 file that matches the quality of the input HEVC file? I need to pass this bitrate to the AVAssetWriter compression settings dictionary.
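A minimal sketch of one way to approach both questions, under the assumption that the track's format description and estimatedDataRate are acceptable proxies for "codec" and "quality"; the 1.8x HEVC-to-H.264 bitrate multiplier is a rule-of-thumb assumption, not an Apple recommendation:

import AVFoundation
import CoreMedia

// Sketch: read the input track's codec and data rate, then derive
// H.264 output settings. The 1.8x multiplier is only an assumption.
func h264OutputSettings(for asset: AVAsset) -> [String: Any]? {
    guard let track = asset.tracks(withMediaType: .video).first,
          let formatDescription = track.formatDescriptions.first else { return nil }

    let codec = CMFormatDescriptionGetMediaSubType(formatDescription as! CMFormatDescription)
    let isHEVC = (codec == kCMVideoCodecType_HEVC)

    // estimatedDataRate is in bits per second.
    let inputBitrate = Double(track.estimatedDataRate)
    let targetBitrate = isHEVC ? inputBitrate * 1.8 : inputBitrate

    let size = track.naturalSize
    return [
        AVVideoCodecKey: AVVideoCodecType.h264,
        AVVideoWidthKey: Int(size.width),
        AVVideoHeightKey: Int(size.height),
        AVVideoCompressionPropertiesKey: [
            AVVideoAverageBitRateKey: Int(targetBitrate)
        ]
    ]
}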
Posted by Devgeek. Last updated.
Post not yet marked as solved
12 Replies
11k Views
iOS 13/13.1 autorotation seems to behave differently than iOS 12. For instance, my app allows the user to lock the interface orientation to portrait or landscape mode in settings. If the device's portrait orientation lock is on and I return only .landscape as supportedInterfaceOrientations, the interface stays in portrait mode until I disable the portrait orientation lock on the device. This does not seem to be the case on iOS 12. In fact, supportedInterfaceOrientations is not even called! UIViewController.attemptRotationToDeviceOrientation() also does not work in such cases. The root of the problem is that I temporarily return false from shouldAutorotate while the app is initializing, and once everything is initialized, I call UIViewController.attemptRotationToDeviceOrientation() to trigger autorotation. That triggers autorotation on iOS 12, but on iOS 13.1 it doesn't work. This looks like a bug in iOS 13.1, I believe. Is there a known workaround?
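To make the setup concrete, here is a sketch of the pattern described above; the class name, the flag, and where the rotation attempt happens are assumptions for illustration:

import UIKit

// Sketch of the described setup: autorotation is held off until
// initialization finishes, then re-triggered manually.
class PlayerViewController: UIViewController {
    private var isReady = false

    override var shouldAutorotate: Bool { isReady }

    override var supportedInterfaceOrientations: UIInterfaceOrientationMask {
        .landscape // honored on iOS 12; reportedly ignored under a device portrait lock on iOS 13.1
    }

    func finishInitialization() {
        isReady = true
        // Triggers rotation on iOS 12; reportedly has no effect here on iOS 13.1.
        UIViewController.attemptRotationToDeviceOrientation()
    }
}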
Posted by Devgeek. Last updated.
Post not yet marked as solved
1 Reply
333 Views
This may sound dumb, but if the app applies 100 CIFilters to video frames at different times during video playback, should all the CIFilters be preallocated and initialised up front, or is it okay to create CIFilters as and when needed? I have always created them all before video playback begins, but I need to know whether there is any difference in performance if I create them dynamically, as and when needed, during playback.
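For concreteness, a sketch of the two approaches being compared; the filter choice and call sites are illustrative assumptions:

import CoreImage

// Approach 1: preallocate once and reuse, mutating parameters per frame.
// Note: a shared CIFilter should only be mutated from one thread at a time.
let cachedBlur = CIFilter(name: "CIGaussianBlur")!

func renderPreallocated(_ frame: CIImage, radius: Double) -> CIImage? {
    cachedBlur.setValue(frame, forKey: kCIInputImageKey)
    cachedBlur.setValue(radius, forKey: kCIInputRadiusKey)
    return cachedBlur.outputImage
}

// Approach 2: create the filter on demand for each frame.
func renderOnDemand(_ frame: CIImage, radius: Double) -> CIImage? {
    let blur = CIFilter(name: "CIGaussianBlur")!
    blur.setValue(frame, forKey: kCIInputImageKey)
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    return blur.outputImage
}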
Posted by Devgeek. Last updated.
Post not yet marked as solved
2 Replies
607 Views
Please help. I was trying to fill up the disk space on my iPhone 11 Pro with video recordings, and then I found that no apps would open. I rebooted the device, and now iOS is stuck in a boot loop. How do I break out of this boot loop?
Posted by Devgeek. Last updated.
Post not yet marked as solved
4 Replies
1.1k Views
This is strange, and I can't find anything about it on the internet. I have integrated Firebase Crashlytics and Analytics in my camera app, which uses AVFoundation. Once the available space on the device drops below 500 MB, weird things start happening: the moment I debug my app, within minutes the disk usage on the device starts shooting up until there is no space left. On a device running iOS 14 beta 4, it caused iOS to crash and get stuck in a boot loop forever. But today I was able to reproduce the issue on another device running iOS 13.5, which is strange. I managed to kill the app and started deleting files and apps to clear space. Is anyone aware of this issue? It feels very strange, and the behaviour occurs only when there is less than 500 MB of storage left on the device. For the iOS 14 beta, I have filed a bug with Apple, but now that I'm seeing the same issue on another device running iOS 13, it feels strange. Is there any way to debug why disk space is going up when the app is opened? Although it's an issue with iOS that occurs when the available disk space is less than 500 MB, I still need to figure out what is eating disk space on the device in the background. EDIT: I see I had Malloc Guard and Malloc Stack logging enabled, along with Zombie Objects, in the scheme. Could that cause issues with disk space?
Posted by Devgeek. Last updated.