Create view-level services for media playback, complete with user controls, chapter navigation, and support for subtitles and closed captioning using AVKit.

AVKit Documentation

Posts under AVKit tag

89 Posts
Post not yet marked as solved
0 Replies
422 Views
Hi, we are using AVFoundation's capture system to take photos in our app, and we need to zoom the camera up to a certain limit. If we set a zoom factor on the AVCaptureDevice, we receive awkward video frames (blurred images) from the camera. Our app works fine on all iPhone/iPad devices except those that support Center Stage. Looking into Apple's default Camera app, we understood that it is implemented using UIImagePickerController. We tried multiple combinations of AVCaptureDevice.Format and AVCaptureSession.Preset, but nothing helped. We want to achieve zoom (front camera) through the capture APIs. The code snippet we use is below; please help with this.

```swift
session.sessionPreset = AVCaptureSession.Preset.photo

var bestFormat: AVCaptureDevice.Format?
var bestFrameRateRange: AVFrameRateRange?
for format in device.formats {
    for range in format.videoSupportedFrameRateRanges {
        if range.maxFrameRate > bestFrameRateRange?.maxFrameRate ?? 0 {
            bestFormat = format
            bestFrameRateRange = range
        }
    }
}

if let bestFormat = bestFormat, let bestFrameRateRange = bestFrameRateRange {
    do {
        try device.lockForConfiguration()
        // Set the device's active format.
        device.activeFormat = bestFormat
        // Set the device's min/max frame duration.
        let duration = bestFrameRateRange.minFrameDuration
        device.activeVideoMinFrameDuration = duration
        device.activeVideoMaxFrameDuration = duration
        device.videoZoomFactor = 2.0
        device.unlockForConfiguration()
    } catch {
        // Handle error.
    }
}
```
Post not yet marked as solved
0 Replies
448 Views
Hey Apple! I'm just wondering whether there are any recommendations on best practices for supporting AV experiences in SwiftUI. As far as I know, VideoPlayer is the only API in AVKit (https://developer.apple.com/documentation/avkit) directly supported in SwiftUI without the need to bridge AVPlayer through UIViewRepresentable / UIViewControllerRepresentable. However, there are many core video and audio experiences that a modern audience expects which are not supported in VideoPlayer, e.g. PiP. Is there a roadmap for direct SwiftUI support? Thanks!
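For context, the basic VideoPlayer usage the post refers to looks roughly like this (a minimal sketch; the URL is a placeholder):

```swift
import SwiftUI
import AVKit

struct ContentView: View {
    // Placeholder URL — replace with a real stream or file URL.
    @State private var player = AVPlayer(
        url: URL(string: "https://example.com/sample.m3u8")!
    )

    var body: some View {
        // VideoPlayer provides the system transport controls, but no
        // built-in PiP toggle as of the iOS version discussed here.
        VideoPlayer(player: player)
            .onAppear { player.play() }
            .onDisappear { player.pause() }
    }
}
```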
Posted by em_walks.
Post not yet marked as solved
0 Replies
507 Views
In the app, I have VoIP functionality along with AVPlayer for playing videos from remote URLs. Once a VoIP call is established, AVPlayer receives AVPlayerRateDidChangeReasonSetRateFailed right after AVPlayerRateDidChangeReasonSetRateCalled in the AVPlayer.rateDidChangeNotification observer when trying to start a video using the play() method. As a result, the video does not start. I checked AVAudioSession.interruptionNotification; it does not fire. AVPlayer works as expected before and after the call. The issue is observable on iOS 17 only. Any help would be appreciated.
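For anyone reproducing this, the rate-change observer in question looks roughly like this (a sketch; the URL is a placeholder):

```swift
import AVFoundation

// Hedged sketch: log why the player's rate changed, to confirm whether
// play() is failing while a VoIP call holds the audio session.
let player = AVPlayer(url: URL(string: "https://example.com/video.mp4")!)

let token = NotificationCenter.default.addObserver(
    forName: AVPlayer.rateDidChangeNotification,
    object: player,
    queue: .main
) { note in
    guard let reason = note.userInfo?[AVPlayer.rateDidChangeReasonKey]
            as? AVPlayer.RateDidChangeReason else { return }
    switch reason {
    case .setRateCalled:           print("rate change: setRate/play() called")
    case .setRateFailed:           print("rate change: setRate failed")
    case .audioSessionInterrupted: print("rate change: audio session interrupted")
    case .appBackgrounded:         print("rate change: app backgrounded")
    default:                       print("rate change: \(reason.rawValue)")
    }
}
player.play()
```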
Post not yet marked as solved
0 Replies
380 Views
I have designed a media player app in which I need to implement live TV, movies, and series, so the URL can be of any type: .ts format for live TV, .mp4, .mov, etc. I am also going to work with M3U, but AVPlayer does not support all of these URLs. Can I get some suggestions and solutions for this? What would be the best practice for working with all these kinds of URLs?
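One way to triage mixed sources (a sketch, not a complete answer): check whether the asset is playable before handing the URL to AVPlayer. AVPlayer natively handles HLS (.m3u8) and container formats such as .mp4/.mov; raw MPEG-TS segments and plain M3U text playlists generally are not playable directly and need a different pipeline or server-side repackaging.

```swift
import AVFoundation

// Sketch: asynchronously load the asset's "playable" property before
// creating an AVPlayerItem for it.
func isPlayable(_ url: URL) async -> Bool {
    let asset = AVURLAsset(url: url)
    return (try? await asset.load(.isPlayable)) ?? false
}
```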
Post marked as solved
1 Reply
1.2k Views
I'm using SwiftUI to play videos. I'm trying to use a NavigationStack to show a list of videos; when I click on one, it goes to a new view to play that video. I see lots of examples that keep the underlying AVPlayer in a @State or @StateObject, but as soon as I reassign it to a new AVPlayer the VideoPlayer stops working; I just get a black screen. Is there a better way to change the URL that VideoPlayer is playing? I'm trying to avoid creating a bunch of VideoPlayer objects ahead of time and keeping them all in memory, as this might involve a large number of videos. More details: the app is tvOS 17, using Xcode 15.
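One common pattern for this (a sketch, not necessarily the poster's eventual solution) is to keep a single long-lived AVPlayer and swap the item instead of recreating the player:

```swift
import SwiftUI
import AVKit

struct PlayerView: View {
    // One long-lived player; items are swapped in as the selection changes.
    @State private var player = AVPlayer()
    let url: URL

    var body: some View {
        VideoPlayer(player: player)
            .onAppear {
                // Replace the item rather than the player, so the
                // VideoPlayer's binding to the player stays valid.
                player.replaceCurrentItem(with: AVPlayerItem(url: url))
                player.play()
            }
            .onDisappear { player.pause() }
    }
}
```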
Posted by EKLynx.
Post marked as solved
4 Replies
6.2k Views
Hello there, our team was asked to add the possibility of manually selecting the video quality. I know that HLS is an adaptive stream and that, depending on network conditions, it chooses the best quality that fits the current situation. I also tried some settings with preferredMaximumResolution and preferredPeakBitRate, but none of them worked once the user was watching the stream. I also tried replacing the current player item with the new configuration, but this only allowed me to downgrade the quality of the video. When I wanted to set it to 4K, for example, it did not switch to that track even when I set very high values for both parameters mentioned above. My question is whether there is any method that would allow me to force a certain quality from the manifest file. I already have an extraction step that parses the manifest and provides all the available information, but I still couldn't figure out how to make the player play a specific stream with my desired quality from the available playlist.
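For reference, capping quality via the public knobs the post mentions looks like this (a sketch; as the post observes, these only bound the top of the ladder, they cannot force an upward switch to one exact variant):

```swift
import AVFoundation

// Sketch: bound adaptive bitrate selection from above. AVPlayer will
// not exceed these caps, but there is no public API to pin playback
// to a single specific variant from the manifest.
func capQuality(of item: AVPlayerItem, peakBitRate: Double, maxResolution: CGSize) {
    item.preferredPeakBitRate = peakBitRate          // e.g. 2_000_000 for ~2 Mbps
    item.preferredMaximumResolution = maxResolution  // e.g. CGSize(width: 1280, height: 720)
}
```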
Post not yet marked as solved
0 Replies
402 Views
displayManager.isDisplayCriteriaMatchingEnabled returns true if either (or both) of refresh rate and dynamic range is set to match the content in the Apple TV settings. There's no way to distinguish between them and enable only one of them accordingly. It looks to me like Apple failed to update their APIs. What am I missing?
Posted by iijjj.
Post not yet marked as solved
2 Replies
877 Views
Hello 👋 I'm trying to implement Picture in Picture on iOS with WebRTC, but I have an issue. I started by following this Apple article: https://developer.apple.com/documentation/avkit/adopting_picture_in_picture_for_video_calls. When my app is in the background, the Picture in Picture view appears, but nothing is displayed within it. Searching on the internet, I found this Stack Overflow post (https://stackoverflow.com/questions/71419635/how-to-add-picture-in-picture-pip-for-webrtc-video-calls-in-ios-swift), which is interesting, but unfortunately I don't know what I have to do. Here is my PictureInPictureManager:

```swift
final class VideoBufferView: UIView {
    override class var layerClass: AnyClass { AVSampleBufferDisplayLayer.self }

    var sampleBufferDisplayLayer: AVSampleBufferDisplayLayer {
        layer as! AVSampleBufferDisplayLayer
    }
}

final class PictureInPictureManager: NSObject {
    static let shared: PictureInPictureManager = .init()

    private override init() { }

    private var pipController: AVPictureInPictureController?
    private var bufferView: VideoBufferView = .init()

    func configure(for videoView: UIView) {
        if AVPictureInPictureController.isPictureInPictureSupported() {
            let bufferView: VideoBufferView = .init()
            let pipVideoCallViewController: AVPictureInPictureVideoCallViewController = .init()
            pipVideoCallViewController.preferredContentSize = CGSize(width: 108, height: 192)
            pipVideoCallViewController.view.addSubview(bufferView)
            let pipContentSource: AVPictureInPictureController.ContentSource = .init(
                activeVideoCallSourceView: videoView,
                contentViewController: pipVideoCallViewController
            )
            pipController = .init(contentSource: pipContentSource)
            pipController?.canStartPictureInPictureAutomaticallyFromInline = true
            pipController?.delegate = self
        } else {
            print("❌ PIP not supported...")
        }
    }
}
```

With this code, the Picture in Picture view appears empty. I read multiple articles that talk about using the buffer, but I'm not sure how to do it with WebRTC. I tried adding this function to my PictureInPictureManager:

```swift
func updateBuffer(with pixelBuffer: CVPixelBuffer) {
    if let sampleBuffer = createSampleBufferFrom(pixelBuffer: pixelBuffer) {
        bufferView.sampleBufferDisplayLayer.enqueue(sampleBuffer)
    } else {
        print("❌ Sample buffer error...")
    }
}

private func createSampleBufferFrom(pixelBuffer: CVPixelBuffer) -> CMSampleBuffer? {
    var presentationTime = CMSampleTimingInfo()

    // Create a format description for the pixel buffer.
    var formatDescription: CMVideoFormatDescription?
    let formatDescriptionError = CMVideoFormatDescriptionCreateForImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescriptionOut: &formatDescription
    )
    guard formatDescriptionError == noErr else {
        print("❌ Error creating format description: \(formatDescriptionError)")
        return nil
    }

    // Create a sample buffer.
    var sampleBuffer: CMSampleBuffer?
    let sampleBufferError = CMSampleBufferCreateReadyWithImageBuffer(
        allocator: kCFAllocatorDefault,
        imageBuffer: pixelBuffer,
        formatDescription: formatDescription!,
        sampleTiming: &presentationTime,
        sampleBufferOut: &sampleBuffer
    )
    guard sampleBufferError == noErr else {
        print("❌ Error creating sample buffer: \(sampleBufferError)")
        return nil
    }
    return sampleBuffer
}
```

but by doing that, I get an error message. Any help is welcome! 🙏 Thanks, Alexandre
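Two details worth double-checking in code like the above (hedged observations, not a confirmed diagnosis): first, `configure(for:)` creates a local `bufferView` that shadows the stored property, so `updateBuffer(with:)` enqueues frames into a view that was never added to the PiP view controller; second, an empty `CMSampleTimingInfo` gives the display layer no valid presentation time. A sketch of both adjustments:

```swift
// 1. In configure(for:), add the stored property instead of a shadowing local:
// pipVideoCallViewController.view.addSubview(self.bufferView)

// 2. Give each sample buffer a host-time presentation timestamp so the
//    display layer treats it as a current frame:
var timing = CMSampleTimingInfo(
    duration: .invalid,
    presentationTimeStamp: CMClockGetTime(CMClockGetHostTimeClock()),
    decodeTimeStamp: .invalid
)
// Pass `timing` as sampleTiming: when calling
// CMSampleBufferCreateReadyWithImageBuffer.
```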
Post marked as solved
3 Replies
2k Views
I have a Catalyst application that uses (as expected) MPNowPlayingInfoCenter to set the now-playing info and MPRemoteCommandCenter to get the media events for play/pause/stop/favorite/etc. The code is shared on iOS, tvOS, and watchOS, and it works correctly there. It seems not to work on macOS (the app is compiled as a Catalyst application) on Big Sur (and Monterey, fwiw). Media keys on the keyboard start the Music app, and the music part of Control Center does not show now-playing info (nor do the media controls there send messages to the app). I seem to remember that it used to work on Catalina (at least the media-key part), and it definitely worked in a previous version of the same app that was a UIKit one. Is this a bug (worth a feedback to Apple) or something wrong on my side? Did I forget some magic capability for macOS? The app is sandboxed and uses the hardened runtime, in case that is significant. Thank you for any hint!
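For context, the shared setup the post describes typically looks like this (a minimal sketch; whether Catalyst on Big Sur actually routes media keys and Control Center events to it is exactly the open question):

```swift
import MediaPlayer

// Sketch: register remote commands and publish now-playing info.
// The title string is a placeholder.
func setUpNowPlaying(play: @escaping () -> Void, pause: @escaping () -> Void) {
    let commands = MPRemoteCommandCenter.shared()
    commands.playCommand.addTarget { _ in play(); return .success }
    commands.pauseCommand.addTarget { _ in pause(); return .success }

    MPNowPlayingInfoCenter.default().nowPlayingInfo = [
        MPMediaItemPropertyTitle: "Track title",
        MPNowPlayingInfoPropertyPlaybackRate: 1.0
    ]
}
```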
Posted by gtufano.
Post not yet marked as solved
0 Replies
615 Views
In a VoIP application, when CallKit is enabled, if we try playing a video through AVPlayer the video content updates frame by frame and the audio of the content is not audible. This issue is observed only in iOS 17. Any idea how we can resolve this?
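One thing often involved in CallKit-plus-AVPlayer issues (an assumption here, not a verified fix for this report) is the app's audio session state after the call: a sketch of reasserting a playback session before resuming the player:

```swift
import AVFoundation

// Sketch: once the CallKit call ends, reconfigure and reactivate the
// audio session for movie playback before calling player.play().
func reactivatePlaybackSession() {
    let session = AVAudioSession.sharedInstance()
    do {
        try session.setCategory(.playback, mode: .moviePlayback)
        try session.setActive(true)
    } catch {
        print("Audio session error: \(error)")
    }
}
```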
Posted by dhilipr.
Post not yet marked as solved
2 Replies
1.2k Views
We have a QR-scanner feature implemented in a web view (WKWebView). If it's dark, we want to light the QR code using the iPhone's flashlight. In general, the feature works, but not with the flashlight. We have these problems: if we turn on the torch, the camera preview disappears; if we turn the torch off, the camera preview appears again. Do you have any idea why that is? And how can we sort it out? Thanks
Posted by Taras_C.
Post not yet marked as solved
0 Replies
376 Views
I have a Mac Catalyst app configured like so: the root view controller on the window is a triple-column UISplitViewController, and the secondary view controller in the split view controller is a view controller that uses WKWebView. Load a website in the WKWebView that has a video, then expand the video to "full screen" (on Mac Catalyst this is only "full window," because the window does not enter full screen the way AppKit apps do). The NSToolbar overlaps the "full screen" video. On a triple split view controller, only the portions of the toolbar in the secondary and supplementary columns show through (the video actually covers the toolbar area in the primary column). Expected results: the video covers the entire window, including the NSToolbar. Actual results: the NSToolbar draws on top of the video. Anyone know of a workaround? I filed FB13229032.
Post not yet marked as solved
3 Replies
1.2k Views
When trying to present an AVPlayerViewController I am getting this error: -AVSystemController- +[AVSystemController sharedInstance]: Failed to allocate AVSystemController, numberOfAttempts=1. And when setting the AVPlayer on it, I get: <<<< AVError >>>> AVLocalizedErrorWithUnderlyingOSStatus: Returning error (AVFoundationErrorDomain / -11800) status (-12746). None of this happens on iOS 16 or lower.
Post not yet marked as solved
1 Reply
434 Views
I want to show the user the actual start and end times of the video on the AVPlayer time slider, instead of the video duration. I would like to show something like 09:00:00 ... 12:00:00 (indicating that the video started at 09:00:00 CET and ended at 12:00:00 CET) instead of 00:00:00 ... 02:59:59. I would appreciate any pointers in this direction.
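One way to sketch this (an assumption about the approach, since AVPlayerViewController's built-in slider labels aren't directly customizable, so this presumes custom overlay labels): map the player's elapsed offset onto a known wall-clock start date and format it.

```swift
import Foundation

// Sketch: given a known wall-clock start Date, convert an elapsed
// playback offset (in seconds) into an HH:mm:ss label for the overlay.
func wallClockLabel(start: Date,
                    elapsed: TimeInterval,
                    timeZone: TimeZone = .current) -> String {
    let formatter = DateFormatter()
    formatter.dateFormat = "HH:mm:ss"
    formatter.timeZone = timeZone
    return formatter.string(from: start.addingTimeInterval(elapsed))
}
```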
Posted by tomas.bek.
Post not yet marked as solved
0 Replies
319 Views
Hi, we have a tvOS app with a custom player and we're getting some crashes when trying to remove a periodicTimeObserver on a player instance:

```
Incident Identifier: 3FE68C1C-126D-4A16-BBF2-9F8D1E395548
Hardware Model:      AppleTV6,2
Process:             MyApp [2516]
Path:                /private/var/containers/Bundle/Application/B99FEAB0-0753-48FE-A7FC-7AEB8E2361C1/MyApp.app/MyApp
Identifier:          pt.appletv.bundleid
Version:             4.9.5 (2559)
AppStoreTools:       15A240a
AppVariant:          1:AppleTV6,2:16
Beta:                YES
Code Type:           ARM-64 (Native)
Role:                Foreground
Parent Process:      launchd [1]
Coalition:           pt.appletv.bundleid [317]
Date/Time:           2023-09-21 18:49:39.0241 +0100
Launch Time:         2023-09-21 18:38:34.6957 +0100
OS Version:          Apple TVOS 16.6 (20M73)
Release Type:        User
Report Version:      104
Exception Type:      EXC_CRASH (SIGABRT)
Exception Codes:     0x0000000000000000, 0x0000000000000000
Termination Reason:  SIGNAL 6 Abort trap: 6
Terminating Process: MyApp [2516]
Triggered by Thread: 0

Last Exception Backtrace:
0  CoreFoundation    0x1914c12c8 __exceptionPreprocess + 160 (NSException.m:202)
1  libobjc.A.dylib   0x190cfc114 objc_exception_throw + 56 (objc-exception.mm:356)
2  AVFCore           0x1c432b89c -[AVPlayer removeTimeObserver:] + 176 (AVPlayer.m:0)
3  CustomPlayer      0x10549f670 MyPlayerViewController.removePlayerObservers(_:) + 248 (MyPlayerViewController.swift:252)
4  CustomPlayer      0x10549c978 closure #1 in MyPlayerViewController.player.didset + 68 (MyPlayerViewController.swift:98)
5  CustomPlayer      0x10549be60 thunk for @escaping @callee_guaranteed () -> () + 28 (<compiler-generated>:0)
6  libdispatch.dylib 0x190e5eef4 _dispatch_call_block_and_release + 24 (init.c:1518)
7  libdispatch.dylib 0x190e60784 _dispatch_client_callout + 16 (object.m:560)
8  libdispatch.dylib 0x190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
9  libdispatch.dylib 0x190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
10 CoreFoundation    0x19142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
11 CoreFoundation    0x19142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
12 CoreFoundation    0x191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
13 GraphicsServices  0x1980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
14 UIKitCore         0x1da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
15 UIKitCore         0x1da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
16 MyApp             0x104418268 main + 176 (main.swift:12)
17 dyld              0x1ddd81744 start + 1832 (dyldMain.cpp:1165)

Thread 0 Crashed:
0  libsystem_kernel.dylib  0x0000000190fe69a8 __pthread_kill + 8 (:-1)
1  libsystem_pthread.dylib 0x000000019109e440 pthread_kill + 208 (pthread.c:1670)
2  libsystem_c.dylib       0x0000000190f5f8dc __abort + 124 (abort.c:155)
3  libsystem_c.dylib       0x0000000190f5f860 abort + 132 (abort.c:126)
4  libc++abi.dylib         0x0000000190da1fe0 abort_message + 128 (:-1)
5  libc++abi.dylib         0x0000000190d92be8 demangling_terminate_handler() + 300
6  libobjc.A.dylib         0x0000000190cda7d4 _objc_terminate() + 124 (objc-exception.mm:498)
7  FirebaseCrashlytics     0x0000000105118754 FIRCLSTerminateHandler() + 340 (FIRCLSException.mm:452)
8  libc++abi.dylib         0x0000000190da15c0 std::__terminate(void (*)()) + 12 (:-1)
9  libc++abi.dylib         0x0000000190da1570 std::terminate() + 52
10 libdispatch.dylib       0x0000000190e60798 _dispatch_client_callout + 36 (object.m:563)
11 libdispatch.dylib       0x0000000190e6dd34 _dispatch_main_queue_drain + 892 (queue.c:7794)
12 libdispatch.dylib       0x0000000190e6d9a8 _dispatch_main_queue_callback_4CF + 40 (queue.c:7954)
13 CoreFoundation          0x000000019142b038 __CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE__ + 12 (CFRunLoop.c:1780)
14 CoreFoundation          0x000000019142569c __CFRunLoopRun + 2080 (CFRunLoop.c:3147)
15 CoreFoundation          0x0000000191424a3c CFRunLoopRunSpecific + 584 (CFRunLoop.c:3418)
16 GraphicsServices        0x00000001980cab0c GSEventRunModal + 160 (GSEvent.c:2196)
17 UIKitCore               0x00000001da6fe6ec -[UIApplication _run] + 868 (UIApplication.m:3782)
18 UIKitCore               0x00000001da702bc4 UIApplicationMain + 148 (UIApplication.m:5372)
19 MyApp                   0x0000000104418268 main + 176 (main.swift:12)
20 dyld                    0x00000001ddd81744 start + 1832 (dyldMain.cpp:1165)
```

The code is:

```swift
@objc public dynamic var player: AVPlayer? {
    willSet { removeThumbnails() }
    didSet {
        DispatchQueue.main.async { [weak self] in
            guard let self else { return }
            self.removePlayerObservers(oldValue)
            self.addPlayerObservers(self.player)
        }
    }
}

func removePlayerObservers(_ player: AVPlayer?) {
    if let periodicTimeObserver = periodicTimeObserver {
        player?.removeTimeObserver(periodicTimeObserver)
        self.periodicTimeObserver = nil
    }
}
```

What could be the problem? Thank you
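A common cause of this class of crash (an assumption, not a confirmed diagnosis of this report) is removing a time observer from a different AVPlayer instance than the one it was registered on, or removing it twice: dispatching the removal asynchronously from didSet means the old player's observer can be removed after the property has already been repopulated. A sketch that removes synchronously from the owning player:

```swift
// Sketch: tie each observer token to the player that created it, and
// remove it synchronously before the player reference changes.
var player: AVPlayer? {
    willSet {
        // Remove from the *current* player before it is replaced.
        if let token = periodicTimeObserver {
            player?.removeTimeObserver(token)
            periodicTimeObserver = nil
        }
    }
    didSet {
        if let player {
            periodicTimeObserver = player.addPeriodicTimeObserver(
                forInterval: CMTime(seconds: 0.5, preferredTimescale: 600),
                queue: .main
            ) { time in
                // Update UI with `time`.
            }
        }
    }
}
```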
Post not yet marked as solved
0 Replies
590 Views
Hey guys! I have a question about PiP (Picture in Picture) mode. Is there a possible way to hide controls like the play/pause and step-forward buttons when using AVPictureInPictureController in UIKit? I know we have the option to set requiresLinearPlayback = true; using it, we just disable our controls. I found a possible solution by setting pipController.setValue(1, forKey: "requiresLinearPlayback"), but that seems to be part of a private API, and I'm not sure it'll pass App Store review. I'm looking forward to some advice on this case and how I can handle it.
Post not yet marked as solved
1 Reply
588 Views
Hi guys, setting AVPlayerViewController.transportBarCustomMenuItems is not working on tvOS. I still see 2 icons for Audio and Subtitles.

```swift
let menuItemAudioAndSubtitles = UIMenu(
    image: UIImage(systemName: "heart")
)
playerViewController.transportBarCustomMenuItems = [menuItemAudioAndSubtitles]
```

The WWDC 2021 video is insufficient to make this work: https://developer.apple.com/videos/play/wwdc2021/10191/. The video doesn't say what exactly I need to do. Do I need to disable subtitle options?

```swift
viewController.allowedSubtitleOptionLanguages = []
```

This didn't work, and I still see the default icon loaded by the player. Do I need to create a subclass of AVPlayerViewController? I just want to replace those 2 default icons with 1 icon as a test, but I was unsuccessful after many hours of work. Is it mandatory to define child menu items on the main item? Or do I perhaps need to define a UIAction? The documentation and video are insufficient in providing guidance on how to do that. I did something like this before, but that was more than 3 years ago, and audio and subtitles were showing at the top of the player screen as tabs, if I remember correctly. Is transportBarCustomMenuItems perhaps deprecated? Is it possible that when loading an AVPlayerItem that contains audio and subtitle tracks, the player automatically resets the AVPlayerViewController menu? How do I suppress this behavior? I'm currently loading AVPlayerViewController into a SwiftUI interface. Is that perhaps the problem? Should I write a SwiftUI player overlay from scratch? Thanks, Robert
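For what it's worth, a version with explicit child actions (a guess at what the menu API expects, not a verified fix for this tvOS behavior) would look like:

```swift
// Hypothetical custom transport-bar menu with one child action.
let qualityAction = UIAction(title: "720p") { _ in
    // Switch quality here.
}
let customMenu = UIMenu(
    title: "Quality",
    image: UIImage(systemName: "heart"),
    children: [qualityAction]
)
playerViewController.transportBarCustomMenuItems = [customMenu]
```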
Post not yet marked as solved
1 Reply
1.1k Views
Due to legal restrictions I need to prevent my app's users from skipping and fast-forwarding the content played by AVPlayerViewController. I use the playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:) and playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:) delegate methods to control the skipping behaviour. However, those delegate methods are only triggered for skip +/- 10, not for fast-forwarding/rewinding. Is there a way to prevent fast-forwarding in addition to skipping in AVPlayerViewController? Here is an example of the code I use:

```swift
class ViewController: UIViewController {
    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)
        setUpPlayerViewController()
    }

    private func setUpPlayerViewController() {
        let playerViewController = AVPlayerViewController()
        playerViewController.delegate = self
        guard let url = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8") else {
            debugPrint("URL is not found")
            return
        }
        let playerItem = AVPlayerItem(url: url)
        let player = AVPlayer(playerItem: playerItem)
        playerViewController.player = player
        present(playerViewController, animated: true) {
            playerViewController.player?.play()
        }
    }
}

extension ViewController: AVPlayerViewControllerDelegate {
    public func playerViewController(_ playerViewController: AVPlayerViewController, willResumePlaybackAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:willResumePlaybackAfterUserNavigatedFrom:to:)")
    }

    public func playerViewController(_ playerViewController: AVPlayerViewController, timeToSeekAfterUserNavigatedFrom oldTime: CMTime, to targetTime: CMTime) -> CMTime {
        // Triggered on skip +/- 10, but not on fast-forwarding/rewinding
        print("playerViewController(_:timeToSeekAfterUserNavigatedFrom:to:)")
        return targetTime
    }
}
```
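If every form of seeking must be blocked, one option worth checking (a suggestion, not confirmed as sufficient for the fast-forward case described above) is requiresLinearPlayback, which removes the scrubbing and skipping UI entirely:

```swift
// Sketch: disable all user-initiated seeking on the player UI.
let playerViewController = AVPlayerViewController()
playerViewController.requiresLinearPlayback = true
```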
Post not yet marked as solved
0 Replies
636 Views
In our application, we play video-on-demand (VOD) content and display subtitles in different languages. The format we prefer for subtitles is WebVTT. We are planning to enhance caption styling (text color, background color, font weight, etc.) in WebVTT files. In our current flow, subtitles and images are loaded in 6-second chunks. Below is an example of one of the subtitle parts we use:

```
WEBVTT
X-TIMESTAMP-MAP=MPEGTS:0,LOCAL:00:00:00.000
```
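For WebVTT styling, one route (a sketch based on the WebVTT specification, not necessarily honored in full by every player; AVPlayer applies only a subset of cue styling) is an inline STYLE block with ::cue rules:

```
WEBVTT
X-TIMESTAMP-MAP=MPEGTS:0,LOCAL:00:00:00.000

STYLE
::cue {
  color: yellow;
  background-color: rgba(0, 0, 0, 0.75);
  font-weight: bold;
}

00:00:01.000 --> 00:00:04.000
Example caption line
```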
Posted by rushly.