Dive into the world of video on Apple platforms, exploring ways to integrate video functionality within your iOS, iPadOS, macOS, tvOS, visionOS, or watchOS app.

Video Documentation

Why does AVAssetWriter add PTS drift when writing fMP4?
Hi, I'm working on a project that requires video frame PTS to be consistent between the original video and a transcoded one. This works fairly well for regular MP4, but if I set preferredOutputSegmentInterval to generate fMP4 output, a one-frame PTS offset is added to all frames even though I specified initialSegmentStartTime as zero.

For example, if I use the code sample provided by Apple (https://developer.apple.com/videos/play/wwdc2020/10011/?time=406) and run

    ffprobe -select_streams v:0 -show_entries packet=pts_time -of csv ~/Downloads/fmp4/prog_index.m3u8

to display the PTS of the output, it doesn't start from 0 but carries a one-frame offset. I also tried opening the file with MP4Box; it likewise shows that the first frame's DTS and CTS do not start from 0. However, if I use AVAssetReader to read the same output video, the PTS of the first frame comes back as 0, so I can't use it to calculate the PTS difference between the two videos either.

Can I get some help understanding why the PTS reported by AVAssetWriter/AVAssetReader for fMP4 differs from what tools like ffprobe and MP4Box report?
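For context, a minimal sketch of the fMP4 writer configuration being discussed, in the style of the WWDC20 session 10011 sample; the passthrough settings and the 6-second segment interval are assumptions, not the poster's actual code:

    import AVFoundation
    import UniformTypeIdentifiers

    // Minimal fMP4 segment-writing setup. Segments are delivered to the delegate
    // via assetWriter(_:didOutputSegmentData:segmentType:segmentReport:).
    func makeSegmentWriter(delegate: AVAssetWriterDelegate) -> (AVAssetWriter, AVAssetWriterInput) {
        let writer = AVAssetWriter(contentType: .mpeg4Movie)
        writer.outputFileTypeProfile = .mpeg4AppleHLS
        writer.preferredOutputSegmentInterval = CMTime(seconds: 6, preferredTimescale: 1)
        writer.initialSegmentStartTime = .zero   // the setting expected to anchor the first PTS at 0
        writer.delegate = delegate

        let videoInput = AVAssetWriterInput(mediaType: .video, outputSettings: nil) // passthrough
        videoInput.expectsMediaDataInRealTime = false
        writer.add(videoInput)
        return (writer, videoInput)
    }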
Replies: 0 · Boosts: 0 · Views: 97 · Activity: 2d

AVPlayer unpredictable range requests on iOS when streaming *.mov file
Hi all, I'm trying to diagnose and resolve an issue with stuttering video playback using the standard AVPlayer. The video in question is a 4K, 39-second file in *.mov format, being played on an iOS device. It's served via a local HTTP server that proxies requests to a backend to fetch and process the content. The project uses end-to-end encrypted storage, which necessitates the proxy for handling data processing.

While playback in offline scenarios is smooth, we are encountering issues with smooth playback during streaming. The same video streams smoothly on other platforms using the same connection, so network limitations are not a factor. On iOS, playback is consistently choppy, with pauses every 1-3 seconds. The video does not appear to buffer adequately for smooth playback.

One particularly curious aspect is the seemingly random pattern of Content-Range requests made by the AVPlayer when streaming the video. Below is an example of the range requests:
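One possible contributor, offered as an assumption rather than a diagnosis: QuickTime files whose moov atom sits at the end of the file force a player to probe with scattered range requests before it can stream linearly. A minimal Swift sketch of re-muxing such a file for "fast start" (the URLs are placeholders):

    import AVFoundation

    /// Re-muxes a .mov so the moov atom is written up front (fast start).
    func remuxForFastStart(input: URL, output: URL,
                           completion: @escaping (Error?) -> Void) {
        let asset = AVURLAsset(url: input)
        guard let export = AVAssetExportSession(asset: asset,
                                                presetName: AVAssetExportPresetPassthrough) else {
            completion(CocoaError(.fileReadUnknown)); return
        }
        export.outputURL = output
        export.outputFileType = .mov
        export.shouldOptimizeForNetworkUse = true  // places the moov atom first
        export.exportAsynchronously {
            completion(export.status == .completed ? nil : export.error)
        }
    }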
Replies: 2 · Boosts: 1 · Views: 90 · Activity: 4d

AVSampleBufferDisplayLayerContentLayer memory leaks.
I noticed that AVSampleBufferDisplayLayerContentLayer is not released when the AVSampleBufferDisplayLayer is removed and released. The issue can be reproduced with this simple code:

    import AVFoundation
    import UIKit

    class ViewController: UIViewController {
        var displayBufferLayer: AVSampleBufferDisplayLayer?

        override func viewDidLoad() {
            super.viewDidLoad()

            let displayBufferLayer = AVSampleBufferDisplayLayer()
            displayBufferLayer.videoGravity = .resizeAspectFill
            displayBufferLayer.frame = view.bounds
            view.layer.insertSublayer(displayBufferLayer, at: 0)
            self.displayBufferLayer = displayBufferLayer

            DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
                self.displayBufferLayer?.flush()
                self.displayBufferLayer?.removeFromSuperlayer()
                self.displayBufferLayer = nil
            }
        }
    }

In my real project I have multiple AVSampleBufferDisplayLayers created and removed in different view controllers, which is problematic because the number of leaked AVSampleBufferDisplayLayerContentLayers keeps increasing. I wonder whether I should use a pool of AVSampleBufferDisplayLayers and reuse them, but I'm slightly afraid that this could also lead to strange bugs.

Edit: it doesn't leak on an iOS 18 device, but it does leak on an iPad Pro running iOS 17.5.1.
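A minimal sketch of the reuse-pool workaround the poster is considering, assuming each layer is detached before being returned to the pool; this illustrates the idea, it is not a confirmed fix for the leak:

    import AVFoundation

    /// A tiny reuse pool for AVSampleBufferDisplayLayer instances.
    final class DisplayLayerPool {
        private var idle: [AVSampleBufferDisplayLayer] = []

        func checkout() -> AVSampleBufferDisplayLayer {
            return idle.popLast() ?? AVSampleBufferDisplayLayer()
        }

        func checkin(_ layer: AVSampleBufferDisplayLayer) {
            layer.flushAndRemoveImage()   // drop any queued frames
            layer.removeFromSuperlayer()
            idle.append(layer)
        }
    }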
Replies: 2 · Boosts: 0 · Views: 179 · Activity: 1w

Debug MediaExtension plugin in system executable?
I am developing a macOS 15 MediaExtension plugin to enable additional codecs and container formats in AVFoundation. My plugin is sort of working, but I'd like to debug the XPC process that AVFoundation 'hoists' for me from the calling app (i.e., the process hosting my plugin instance that manages the MESampleBuffer protocol calls, for example). Is there a way to configure Xcode to attach to this background process for interactive debugging? Right now I have to use Console + print, which is not fun or productive.

Does Apple have a working example of a MediaExtension anywhere? This is an exciting API that is very under-documented. I'm willing to spend a Code Review 'credit' for this, but my issues are not quite focused. Any assistance is highly appreciated!
Replies: 0 · Boosts: 0 · Views: 160 · Activity: 1w

Does tvOS Support the Multiview Feature?
I've seen the Multiview feature on tvOS that displays a small grid icon when available. However, I only see this functionality in visionOS, using AVMultiviewManager. Does a different name refer to this feature on tvOS?

Relevant links:
https://www.reddit.com/r/appletv/comments/12opy5f/handson_with_the_new_multiview_split_screen/
https://www.pocket-lint.com/how-to-use-multiview-apple-tv/#:~:text=You'll%20see%20a%20grid,running%20at%20the%20same%20time.
Replies: 1 · Boosts: 0 · Views: 131 · Activity: 1w

AVPlayer uses too much memory when playing an AVComposition with multiple videos
I am creating an AVComposition and using it with an AVPlayer. The player works fine and doesn't consume much memory when I do not set playerItem.videoComposition. Here is the code that works without excessive memory usage:

    func configurePlayer(composition: AVMutableComposition, videoComposition: AVVideoComposition) {
        player.pause()
        player.replaceCurrentItem(with: nil)
        let playerItem = AVPlayerItem(asset: composition)
        player.replaceCurrentItem(with: playerItem) // presumably intended; missing from the original snippet
        player.play()
    }

However, when I add playerItem.videoComposition = videoComposition, as in the code below, the memory usage becomes excessive:

    func configurePlayer(composition: AVMutableComposition, videoComposition: AVVideoComposition) {
        player.pause()
        player.replaceCurrentItem(with: nil)
        let playerItem = AVPlayerItem(asset: composition)
        playerItem.videoComposition = videoComposition
        player.replaceCurrentItem(with: playerItem) // presumably intended; missing from the original snippet
        player.play()
    }

Issue details: the memory usage seems to depend on the number of video tracks in the composition rather than their duration. For instance, two videos of 30 minutes each consume less memory than 40 videos of just 2 seconds each. The excessive memory usage shows up in the Other Processes section of Xcode's debug panel. For reference, 42 videos, each less than 30 seconds, use around 1.4 GB of memory.

I'm struggling to understand why adding videoComposition causes such high memory consumption, especially since it happens even when no layer instructions are applied. Any insights on how to address this would be greatly appreciated.

[Screenshots: Xcode memory gauge before and after setting videoComposition]

I initially thought the problem might be due to having too many layer instructions in the video composition, but this doesn't seem to be the case. Even when I set a videoComposition without any layer instructions, the memory consumption remains high.
Replies: 2 · Boosts: 0 · Views: 122 · Activity: 1w

How to implement Picture-in-Picture (PiP) in Flutter for iOS using LiveKit without a video URL?
I am building a video conferencing app using LiveKit in Flutter and want to implement Picture-in-Picture (PiP) mode on iOS. My goal is to display a view showing the speaker's initials or avatar during PiP mode. I successfully implemented this functionality on Android but am struggling to achieve it on iOS.

I am using a MethodChannel to communicate with the native iOS code. Here's the Flutter-side code:

    import 'package:flutter/foundation.dart';
    import 'package:flutter/services.dart';

    class PipController {
      static const _channel = MethodChannel('pip_channel');

      static Future<void> startPiP() async {
        try {
          await _channel.invokeMethod('enterPiP');
        } catch (e) {
          if (kDebugMode) {
            print("Error starting PiP: $e");
          }
        }
      }

      static Future<void> stopPiP() async {
        try {
          await _channel.invokeMethod('exitPiP');
        } catch (e) {
          if (kDebugMode) {
            print("Error stopping PiP: $e");
          }
        }
      }
    }

On the iOS side, I am using AVPictureInPictureController. Since it requires an AVPlayerLayer, I had to include a dummy video URL to initialize the AVPlayer. However, this results in the dummy video's audio playing in the background, but no view is displayed in PiP mode. Here's my iOS code:

    import Flutter
    import UIKit
    import AVKit

    @main
    @objc class AppDelegate: FlutterAppDelegate {
        var pipController: AVPictureInPictureController?
        var playerLayer: AVPlayerLayer?

        override func application(
            _ application: UIApplication,
            didFinishLaunchingWithOptions launchOptions: [UIApplication.LaunchOptionsKey: Any]?
        ) -> Bool {
            let controller: FlutterViewController = window?.rootViewController as! FlutterViewController
            let pipChannel = FlutterMethodChannel(name: "pip_channel",
                                                  binaryMessenger: controller.binaryMessenger)
            pipChannel.setMethodCallHandler { [weak self] (call: FlutterMethodCall, result: @escaping FlutterResult) in
                if call.method == "enterPiP" {
                    self?.startPictureInPicture(result: result)
                } else if call.method == "exitPiP" {
                    self?.stopPictureInPicture(result: result)
                } else {
                    result(FlutterMethodNotImplemented)
                }
            }
            GeneratedPluginRegistrant.register(with: self)
            return super.application(application, didFinishLaunchingWithOptions: launchOptions)
        }

        private func startPictureInPicture(result: @escaping FlutterResult) {
            guard AVPictureInPictureController.isPictureInPictureSupported() else {
                result(FlutterError(code: "UNSUPPORTED", message: "PiP is not supported on this device.", details: nil))
                return
            }

            // Set up the AVPlayer
            let player = AVPlayer(url: URL(string: "http://commondatastorage.googleapis.com/gtv-videos-bucket/sample/BigBuckBunny.mp4")!)
            let playerLayer = AVPlayerLayer(player: player)
            self.playerLayer = playerLayer

            // Create a dummy view
            let dummyView = UIView(frame: CGRect(x: 0, y: 0, width: 1, height: 1))
            dummyView.isHidden = true
            window?.rootViewController?.view.addSubview(dummyView)
            dummyView.layer.addSublayer(playerLayer)
            playerLayer.frame = dummyView.bounds

            // Initialize PiP Controller
            pipController = AVPictureInPictureController(playerLayer: playerLayer)
            pipController?.delegate = self

            // Start playback and PiP
            player.play()
            pipController?.startPictureInPicture()
            print("Picture-in-Picture started")
            result(nil)
        }

        private func stopPictureInPicture(result: @escaping FlutterResult) {
            guard let pipController = pipController, pipController.isPictureInPictureActive else {
                result(FlutterError(code: "NOT_ACTIVE", message: "PiP is not currently active.", details: nil))
                return
            }
            pipController.stopPictureInPicture()
            playerLayer = nil
            self.pipController = nil
            result(nil)
        }
    }

    extension AppDelegate: AVPictureInPictureControllerDelegate {
        func pictureInPictureControllerDidStartPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
            print("PiP started")
        }

        func pictureInPictureControllerDidStopPictureInPicture(_ pictureInPictureController: AVPictureInPictureController) {
            print("PiP stopped")
        }
    }

Questions:
1. How can I implement PiP mode on iOS without using a video URL (or AVPlayerLayer)?
2. Is there a way to display a custom UIView (like a speaker's initials or an avatar) in PiP mode instead of requiring a video?
3. Why does PiP not display any view, even though the dummy video URL is playing in the background?

I am new to iOS development and would greatly appreciate any guidance or alternative approaches to achieve this functionality. Thank you!
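Regarding question 2: iOS 15 added a PiP content source designed for video-call apps that takes a custom view controller instead of an AVPlayerLayer. Whether a LiveKit/Flutter app qualifies for the associated background modes and entitlements is an assumption to verify. A minimal sketch, where AvatarViewController and sourceView are placeholder names:

    import AVKit
    import UIKit

    // A view controller whose view (e.g. initials/avatar) is shown inside the PiP window.
    final class AvatarViewController: AVPictureInPictureVideoCallViewController {
        override func viewDidLoad() {
            super.viewDidLoad()
            let label = UILabel(frame: view.bounds)
            label.text = "AB"   // speaker initials (placeholder)
            label.textAlignment = .center
            label.autoresizingMask = [.flexibleWidth, .flexibleHeight]
            view.addSubview(label)
            preferredContentSize = CGSize(width: 120, height: 90)
        }
    }

    func makePiPController(sourceView: UIView) -> AVPictureInPictureController {
        let contentSource = AVPictureInPictureController.ContentSource(
            activeVideoCallSourceView: sourceView,   // the in-app view PiP animates from
            contentViewController: AvatarViewController())
        return AVPictureInPictureController(contentSource: contentSource)
    }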
Replies: 1 · Boosts: 0 · Views: 155 · Activity: 2w

CoreMediaErrorDomain -12035 error when playing a FairPlay-protected HLS stream on iOS 18+ through the Apple Lightning AV Adapter
Our iOS/Apple TV video content playback app uses AVPlayer to play HLS video streams and supports both custom and system playback UIs. The FairPlay content key is retrieved using AVContentKeySession. AirPlay is supported too. When the iPhone is connected to a TV through the Lightning Apple Digital AV Adapter (A1438), the app is mirrored as expected.

Problem: when using an iPhone or iPad on iOS 18.1.1, FairPlay-protected HLS streams are not played and a CoreMediaErrorDomain -12035 error is received by the AVPlayerItem. Also, once the issue has occurred, the mirroring freezes (the TV indefinitely displays the app playback screen) although the app works fine on the iOS device. The content key retrieval works as expected (I can see that two content key requests are made by the system, by the way, probably one for the local playback and one for the adapter, as when AirPlaying) and the error is thrown after providing the AVContentKeyResponse. Unfortunately, and as far as I know, there is no documentation on CoreMediaErrorDomain errors, so I don't know what -12035 means.

The issue does not occur:
on an iPhone on iOS 17.7 (even with FairPlay-protected HLS streams)
when playing DRM-free video content (whatever the iOS version)
when using the USB-C AV Adapter (whatever the iOS version)

Also worth noting: the issue does not occur with other video playback apps such as Apple TV or Netflix, although I don't have any details on the kind of streams these apps play or the way the FairPlay content key is retrieved (if any), so I don't know if that is relevant.
Replies: 2 · Boosts: 0 · Views: 189 · Activity: 2w

iPhone Video Upload Error - reason unknown
Hi everyone, I'm developing a customization tool in which our customers can upload an MP3 or MP4 file that will be scannable through our AR application. On desktop and Android this works perfectly. For some reason, however, on iPhone we're unable to load most of the video files. I've checked the clips, and they are .mov/H.264 files, which are supported by iPhone. We're currently not sure how to fix this issue so that customers who own an iPhone can upload clips to our website. Any tips in the right direction are more than welcome. Thanks in advance!
Replies: 1 · Boosts: 0 · Views: 154 · Activity: 2w

AVPlayer fails
When I play a local video (downloaded into the sandbox) and observe the AVPlayerItem status via KVO, the status is AVPlayerItemStatusFailed and the error is:

    Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (24), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x3004137e0 {Error Domain=NSPOSIXErrorDomain Code=24 "Too many open files"}}

Why?
Replies: 2 · Boosts: 0 · Views: 122 · Activity: 2w

How to save 4K60 ProRes Log video to iPhone internal storage?
Hello Apple Engineers,

Specific issue: I am working on a video recording feature in my SwiftUI app, and I am trying to record 4K60 video in ProRes Log format to the iPhone's internal storage. Here's what I have tried so far: I am using AVCaptureSession with AVCaptureMovieFileOutput and configuring the session to support 4K resolution and the ProRes codec. The sessionPreset is set to .inputPriority, and the video device is configured with settings such as disabling HDR to prepare for Log. However, when attempting to record 4K60 ProRes video, I get the error: "Capturing 4k60 with ProRes codec on this device is supported only on external storage device."

This error seems to imply that 4K60 ProRes recording is restricted to external storage devices, but I am trying to achieve this internally on devices such as the iPhone 15 Pro Max, which has native support for ProRes encoding. Here are my questions:

Is it technically possible to record 4K60 ProRes Log video internally on supported iPhones (for example, the iPhone 15 Pro Max)? There are some third-party apps (e.g. Blackmagic 👍🏻) that can save 4K60 ProRes Log video to iPhone internal storage.

If internal saving is supported, what additional configuration is needed for the AVCaptureSession, or what other technique can bypass this limitation?

If anyone has successfully saved 4K60 ProRes Log video to iPhone internal storage, your guidance would be highly appreciated. Thank you for your help!
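For reference, a minimal sketch of the kind of configuration described above: selecting a 4K60 device format that supports Apple Log and requesting ProRes from the movie output. Whether this records internally at 4K60 is exactly the open question, so treat it as an illustration under assumptions, not a confirmed fix:

    import AVFoundation

    func configureProResLog(session: AVCaptureSession,
                            device: AVCaptureDevice,
                            movieOutput: AVCaptureMovieFileOutput) throws {
        session.sessionPreset = .inputPriority

        // Pick a 4K format that reaches 60 fps and supports Apple Log.
        guard let format = device.formats.first(where: { f in
            let dims = CMVideoFormatDescriptionGetDimensions(f.formatDescription)
            return dims.width == 3840 && dims.height == 2160
                && f.videoSupportedFrameRateRanges.contains { $0.maxFrameRate >= 60 }
                && f.supportedColorSpaces.contains(.appleLog)
        }) else { throw CocoaError(.featureUnsupported) }

        try device.lockForConfiguration()
        device.activeFormat = format
        device.activeColorSpace = .appleLog
        device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 60)
        device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 60)
        device.unlockForConfiguration()

        // Request ProRes if the output advertises it for this connection.
        if let connection = movieOutput.connection(with: .video),
           movieOutput.availableVideoCodecTypes.contains(.proRes422) {
            movieOutput.setOutputSettings([AVVideoCodecKey: AVVideoCodecType.proRes422],
                                          for: connection)
        }
    }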
Replies: 0 · Boosts: 0 · Views: 195 · Activity: 3w

M3 chip reverse video playback performance
We have developed a simple video player Swift application for macOS, which uses the AVFoundation framework. A special feature of this app is the ability to play video backward at speeds like -0.25x, -0.5x, and -1.0x. The MP4 video file is played directly from the local file system; the video codec is H.264 and the audio is AAC. The video files are huge: around 10 GB, with a length of 3 hours.

Playing video in the reverse direction works well on a MacBook Air with an M1 or M2 chip. When we run the same app with the same video on a MacBook Air with an M3 chip, reverse playback is much worse. Playback might stutter badly, especially in the latter part of the video. The same behavior also happens in Apple's QuickTime Player when playing in the reverse direction at -1x speed.

What's even stranger is that at one point in time the playback is totally smooth, but after a while it stutters again. For example, this morning reverse playback worked 100% smoothly; then I rebooted the Mac and tried again, and the result was stuttering. After the Mac stayed idle for several hours, I tried reverse playing the video again: smooth performance! My conclusion: M3 playback works fine if the stars in the sky are aligned correctly. :-)

So it's not only our app: QuickTime Player shows exactly the same behavior, and only with the M3 chip. The same symptom appears on another similar M3 Mac, so it can't be a single faulty unit. At the same time, the open-source video player IINA can reverse play the video well on the same Mac. All Macs have otherwise identical configurations: 16 GB RAM and macOS 15.1.1.

Have you experienced the same problem? Any chance to solve it? I really hope the M4 chip Macs behave better here.
Replies: 3 · Boosts: 0 · Views: 249 · Activity: 3w

Best way to cache an infinite scroll view of videos
Hi, I'm working on an app with an infinite scrollable video feed, similar to TikTok or Instagram Reels. I initially thought it would be a good idea to cache videos in the file system, but after reading this post it seems that caching videos on the file system is not recommended: https://forums.developer.apple.com/forums/thread/649810#:~:text=If%20the%20videos%20can%20be%20reasonably%20cached%20in%20RAM%20then%20we%20would%20recommend%20that.%20Regularly%20caching%20video%20to%20disk%20contributes%20to%20NAND%20wear

The reason I am hesitant to cache videos in memory is that this adds up pretty quickly and increases memory pressure for my app. After seeing the amount of Documents & Data storage that Instagram uses, it's obvious they are caching videos on the file system. So I was wondering: what is the updated best practice for caching in these kinds of apps?
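A minimal sketch of the in-RAM approach recommended in the linked thread, using NSCache so the system can evict entries under memory pressure; the size limit is an illustrative guess, not a recommendation:

    import AVFoundation

    /// Caches downloaded video data in RAM, keyed by URL.
    final class VideoMemoryCache {
        private let cache = NSCache<NSURL, NSData>()

        init(limitBytes: Int = 200 * 1024 * 1024) {   // illustrative cap
            cache.totalCostLimit = limitBytes
        }

        func store(_ data: Data, for url: URL) {
            cache.setObject(data as NSData, forKey: url as NSURL, cost: data.count)
        }

        func data(for url: URL) -> Data? {
            cache.object(forKey: url as NSURL) as Data?
        }
    }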
Replies: 2 · Boosts: 0 · Views: 212 · Activity: 3w

AVAssetWriter & AVTimedMetadataGroup in AVMultiCamPiP
I'm trying to add metadata every second during video capture in the Swift sample app "AVMultiCamPiP": a simple string that changes every second, with a write function triggered by a Timer. I can't get it to work no matter how I arrange it; it always ends up with the error "Cannot create a new metadata adaptor with an asset writer input that has already started writing".

This is the setup section:

    // Add a metadata input
    let assetWriterMetaDataInput = AVAssetWriterInput(mediaType: .metadata,
                                                      outputSettings: nil,
                                                      sourceFormatHint: AVTimedMetadataGroup().copyFormatDescription())
    assetWriterMetaDataInput.expectsMediaDataInRealTime = true
    assetWriter.add(assetWriterMetaDataInput)
    self.assetWriterMetaDataInput = assetWriterMetaDataInput

This is the timed metadata creation, which gets triggered every second:

    let newNoteMetadataItem = AVMutableMetadataItem()
    newNoteMetadataItem.value = "Some string" as (NSCopying & NSObjectProtocol)?
    let metadataItemGroup = AVTimedMetadataGroup(items: [newNoteMetadataItem],
                                                 timeRange: CMTimeRangeMake(start: CMClockGetTime(CMClockGetHostTimeClock()),
                                                                            duration: CMTime.invalid))
    movieRecorder?.recordMetaData(meta: metadataItemGroup)

This function is supposed to add the metadata to the track:

    func recordMetaData(meta: AVTimedMetadataGroup) {
        guard isRecording,
              let assetWriter = assetWriter,
              assetWriter.status == .writing,
              let input = assetWriterMetaDataInput,
              input.isReadyForMoreMediaData else {
            return
        }
        let metadataAdaptor = AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
        metadataAdaptor.append(meta)
    }

I have an older code example in Objective-C which works OK, but it uses "AVCaptureMetadataInput appendTimedMetadataGroup" and writes to an identifier called "quickTimeMetadataLocationNote". I'd like to do something similar in the above Swift code. All suggestions are appreciated!
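The error text suggests the adaptor has to be created once, before startWriting() is called, rather than inside the per-second write function. A minimal sketch of that ordering, using an explicit metadata format description; the quickTimeMetadataTitle identifier is an assumption, substitute whichever identifier you actually write:

    import AVFoundation
    import CoreMedia

    // Create the metadata input and its adaptor BEFORE assetWriter.startWriting().
    func addMetadataAdaptor(to assetWriter: AVAssetWriter) -> AVAssetWriterInputMetadataAdaptor {
        let spec: [CFString: Any] = [
            kCMMetadataFormatDescriptionMetadataSpecificationKey_Identifier:
                AVMetadataIdentifier.quickTimeMetadataTitle.rawValue,   // assumed identifier
            kCMMetadataFormatDescriptionMetadataSpecificationKey_DataType:
                kCMMetadataBaseDataType_UTF8
        ]
        var desc: CMMetadataFormatDescription?
        CMMetadataFormatDescriptionCreateWithMetadataSpecifications(
            allocator: kCFAllocatorDefault,
            metadataType: kCMMetadataFormatType_Boxed,
            metadataSpecifications: [spec] as CFArray,
            formatDescriptionOut: &desc)

        let input = AVAssetWriterInput(mediaType: .metadata,
                                       outputSettings: nil,
                                       sourceFormatHint: desc)
        input.expectsMediaDataInRealTime = true
        assetWriter.add(input)
        // Creating the adaptor after startWriting() raises the
        // "Cannot create a new metadata adaptor..." error, so do it here,
        // then reuse the returned adaptor for every append.
        return AVAssetWriterInputMetadataAdaptor(assetWriterInput: input)
    }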
Replies: 5 · Boosts: 0 · Views: 246 · Activity: 3w

Control Center Video Effects bug on iOS 18.0 and above
My app is not a VoIP application. I use devices that support "Character Centering" (Center Stage), such as the iPad 10 or iPad 13,18, running iOS 18.0, 18.1, or 18.1.1. When entering live classes, the "Character Centering" button does not appear in Control Center, as shown in the following picture. However, if Voice over IP is selected under Background Modes in the project and the app is run again, the issue no longer reproduces, even after uninstalling and reinstalling the app. Could you please help me investigate the reason? Thank you!
Replies: 0 · Boosts: 0 · Views: 169 · Activity: 3w

Selecting an appropriate AVCaptureDeviceFormat
My app currently captures video using an AVCaptureSession set with the AVCaptureSessionPreset1920x1080 preset. However, I'd like to update this behavior such that video can be recorded at a range of different resolutions. There isn't a preset aligning to each desired resolution, so I thought I'd instead directly set the AVCaptureDeviceFormat. For any desired resolution, I would find the format that is closest without going under the desired resolution, and then crop it down as a post-processing step.

However, what I've observed is that there can be a range of available formats for a device at each resolution, with various differing settings. Presumably there is logic within AVCaptureSession that selects a reasonable default based on all these different settings, but since I am applying the format directly, I think I don't have a way to make use of that default logic? And it is undocumented?

Does this mean that the only way to select a format is to implement a comparison function that considers all the different values of all the different properties on AVCaptureDeviceFormat, and then sort the formats according to this comparator? If so, what if some new property is added to AVCaptureDeviceFormat in the future? The sort would not take this new property into account, and the function might select a format with some new undesired property.

Are there any guarantees about what types of formats will be supported on a device? For example, can I take for granted that a '420v' format will exist at each resolution? If so, I could filter the formats down to only those with this setting without risking filtering out all of the supported formats.

I suspect I may be missing something obvious. Any help would be greatly appreciated!
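A minimal sketch of the filtering approach described above: keep only formats that cover the target resolution, prefer '420v' ones, and take the smallest. The fallback branch exists precisely because the availability of a 420v format at every resolution is not guaranteed:

    import AVFoundation
    import CoreMedia

    /// Picks the smallest format at least as large as `target`, preferring '420v'.
    func pickFormat(for device: AVCaptureDevice,
                    target: CMVideoDimensions) -> AVCaptureDevice.Format? {
        let fourTwentyV = kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange  // '420v'

        // Keep only formats that cover the target resolution.
        let candidates = device.formats.filter { format in
            let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
            return dims.width >= target.width && dims.height >= target.height
        }
        // Prefer 420v; fall back to any covering format if none matches.
        let preferred = candidates.filter {
            CMFormatDescriptionGetMediaSubType($0.formatDescription) == fourTwentyV
        }
        return (preferred.isEmpty ? candidates : preferred).min { a, b in
            let da = CMVideoFormatDescriptionGetDimensions(a.formatDescription)
            let db = CMVideoFormatDescriptionGetDimensions(b.formatDescription)
            return Int(da.width) * Int(da.height) < Int(db.width) * Int(db.height)
        }
    }

The chosen format would then be applied via lockForConfiguration() / activeFormat, with cropping handled downstream as the post describes.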
Replies: 3 · Boosts: 0 · Views: 275 · Activity: Nov ’24