Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

AVPlayer can't resume after stalling
I have repeatedly confirmed that if you throttle the connection speed to a host serving a video file (mp4), AVPlayer stalls, but after you restore a fast connection to the host, the player does not resume playback. If you check the status there are no errors, just an empty buffer: AVPlayer.error is nil, AVPlayerItem.error is nil, AVPlayerItem.isPlaybackBufferEmpty is true, and AVPlayerItem.isPlaybackLikelyToKeepUp is false. Even if you wait a long time nothing happens, and tapping the play button doesn't help either; the player is frozen forever. Only calling "seek" or "playImmediately" unfreezes the player and resumes playback. It doesn't happen every time, maybe one time in four. It seems like AVPlayer has a bug. What do you think?
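A minimal sketch of the workaround described above, assuming you hold references to the player and its item; the KVO trigger is illustrative, and in the reported failure mode (where the flags never change) a periodic nudge with playImmediately(atRate:) may be needed instead:

import AVFoundation

final class StallRecovery {
    private var observation: NSKeyValueObservation?

    // Watch the item's buffer state; when playback looks viable again after
    // a stall, kick the player with playImmediately(atRate:) as a workaround.
    func install(on player: AVPlayer, item: AVPlayerItem) {
        observation = item.observe(\.isPlaybackLikelyToKeepUp, options: [.new]) { _, change in
            if change.newValue == true, player.timeControlStatus != .playing {
                player.playImmediately(atRate: 1.0)
            }
        }
    }
}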
0 replies · 0 boosts · 548 views · Dec ’23
H.264/H.265 encoding with VideoToolbox
After encoding with VideoToolbox and sending the stream over the network via NDI, I can successfully receive the NDI source on a Mac and the picture displays, but on Windows only the NDI source name shows up, with no picture. I'd like to know whether VideoToolbox is unable to produce an encoding that Windows can decode correctly, and how to solve this problem.
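A minimal sketch of a VideoToolbox H.264 session tuned for broad decoder compatibility (real-time, no frame reordering, Main profile); the frame size and property choices are assumptions for illustration, and NDI-specific packaging (e.g. AVCC-to-Annex-B conversion) is out of scope here:

import VideoToolbox

var session: VTCompressionSession?
VTCompressionSessionCreate(allocator: kCFAllocatorDefault,
                           width: 1920, height: 1080,    // assumed frame size
                           codecType: kCMVideoCodecType_H264,
                           encoderSpecification: nil,
                           imageBufferAttributes: nil,
                           compressedDataAllocator: nil,
                           outputCallback: nil,          // use the output-handler encode variant
                           refcon: nil,
                           compressionSessionOut: &session)

if let session {
    // Conservative settings that most third-party receivers decode correctly.
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_RealTime, value: kCFBooleanTrue)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_AllowFrameReordering, value: kCFBooleanFalse)
    VTSessionSetProperty(session, key: kVTCompressionPropertyKey_ProfileLevel,
                         value: kVTProfileLevel_H264_Main_AutoLevel)
}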
0 replies · 0 boosts · 583 views · Dec ’23
Stereo video HLS
I am trying to set up HLS with MV-HEVC. I have an MV-HEVC MP4 converted with AVAssetWriter that plays as a "spatial video" in Photos in the simulator. I've used ffmpeg to fragment the video for HLS (sample m3u8 file below). The HLS version of the mp4 plays on a VideoMaterial with an AVPlayer in the simulator, but it is hard to determine whether the streamed video is stereo. Is there any guidance on confirming that the streamed mp4 video is properly being read as stereo? Additionally, I see that REQ-VIDEO-LAYOUT is required for multivariant HLS. However, if there is ONLY stereo video in the playlist, is it needed? Are there any other configurations needed to make the device read the stream as stereo? Sample m3u8 playlist:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-TARGETDURATION:13
#EXT-X-MEDIA-SEQUENCE:0
#EXTINF:12.512500,
sample_video0.ts
#EXTINF:8.341667,
sample_video1.ts
#EXTINF:12.512500,
sample_video2.ts
#EXTINF:8.341667,
sample_video3.ts
#EXTINF:8.341667,
sample_video4.ts
#EXTINF:12.433222,
sample_video5.ts
#EXT-X-ENDLIST
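For reference, a hedged sketch of a multivariant (master) playlist declaring a stereo layout via REQ-VIDEO-LAYOUT, the attribute the HLS specification defines for this; the bandwidth, codec string, resolution, and URI below are placeholder assumptions, and the required EXT-X-VERSION should be checked against the current spec:

#EXTM3U
#EXT-X-VERSION:12
#EXT-X-STREAM-INF:BANDWIDTH=20000000,RESOLUTION=1920x1080,CODECS="hvc1.2.4.L153.B0",REQ-VIDEO-LAYOUT="CH-STEREO"
stereo/sample_video.m3u8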
5 replies · 1 boost · 1.8k views · Dec ’23
Photos sample app can't access full-resolution photos on iOS
I'm running this SwiftUI sample app for photos without any modifications except for adding my developer profile, which is necessary to build it. When I tap on the thumbnail to see the photo library (after granting access to my photo library), I see that some of the thumbnails are stuck in a loading state, and when I tap on thumbnails, I only see a low-resolution image (the thumbnail), not the full-resolution image that should load. In the console I can see this error that occurs when tapping on a thumbnail to see the full-resolution image: CachedImageManager requestImage error: The operation couldn’t be completed. (PHPhotosErrorDomain error 3164.) When I make a few modifications necessary to run the app as a native macOS app, all the thumbnails load immediately, and clicking on them reveals the full-resolution images.
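PHPhotosErrorDomain error 3164 appears to correspond to PHPhotosError.networkAccessRequired, i.e. the full-resolution asset lives in iCloud and the request wasn't allowed to hit the network. A minimal sketch of opting in, assuming a PHAsset in hand; the delivery mode and target size are illustrative:

import Photos
import UIKit

// Request the full-resolution image, letting Photos download it from
// iCloud if it isn't cached locally (the default disallows network access).
func requestFullImage(for asset: PHAsset, completion: @escaping (UIImage?) -> Void) {
    let options = PHImageRequestOptions()
    options.isNetworkAccessAllowed = true      // the key change for error 3164
    options.deliveryMode = .highQualityFormat
    PHImageManager.default().requestImage(for: asset,
                                          targetSize: PHImageManagerMaximumSize,
                                          contentMode: .aspectFit,
                                          options: options) { image, _ in
        completion(image)
    }
}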
2 replies · 0 boosts · 1.3k views · Dec ’23
I need a way to permanently disable Reactions from my app, ideally the universe too
So I've spent the last five years optimizing my video AI system so that it runs with less than 5% CPU while processing a 30fps video feed on a MacBook Pro M2, and everything is great, until Sonoma comes out, and I find myself consuming 40% CPU for the exact same workload. So I fire up Instruments, and the "heaviest stack trace" (see screenshot) turns out to be Espresso doing some completely unasked-for and absolutely useless processing on my video frames. I turn off Reactions, but nothing helps; the CPU consumption stays at 40%. "Reactions" is nothing but a useless toy to please some WWDC keynote fanboys. I don't want it anywhere near my app or my users, and I especially do not want to take the blame for it pissing away the user's CPU cycles and battery. Now, how do I make it go away, forever? Best regards, Jacob
4 replies · 1 boost · 852 views · Dec ’23
Issue playing FairPlay video
We have to play some encrypted videos from a server. In AVAssetResourceLoaderDelegate we get the CKC data correctly and respond with it. Then the video just starts playing and stops immediately. When we check the playerItem error description we get: Error Domain=AVFoundationErrorDomain Code=-11819 "Cannot Complete Action" UserInfo={NSLocalizedDescription=Cannot Complete Action, NSLocalizedRecoverySuggestion=Try again later.}. Has anyone encountered this?
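Code -11819 matches AVError.mediaServicesWereReset, which points at the media services daemon being torn down rather than a DRM failure as such. A hedged sketch of the customary recovery, assuming the playback stack can simply be rebuilt (rebuildPlayer() is a hypothetical helper):

import AVFoundation

// A media-services reset invalidates every existing AVPlayer/AVPlayerItem,
// so listen for the reset and rebuild the player, the resource loader
// delegate, and the key request from scratch.
NotificationCenter.default.addObserver(
    forName: AVAudioSession.mediaServicesWereResetNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main) { _ in
        rebuildPlayer()   // hypothetical: recreate player/item, re-request the CKC
}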
0 replies · 0 boosts · 559 views · Dec ’23
No audio in mpeg4AppleHLS output generated by AVAssetWriter
I'm trying to use AVCaptureSession and AVAssetWriter to convert video and audio from an iPhone's camera and microphone into a fragmented video file in Apple HLS format. Below is part of the code. The capture seems successful, and I have confirmed that the data received in captureOutput() can be appended to videoWriterInput and audioWriterInput using append(). When executing audioWriterInput!.append(sampleBuffer), sampleBuffer has the following values, so it looks like the audio data has been passed to the asset writer:

sampleBuffer.duration: CMTime(value: 941, timescale: 44100, flags: __C.CMTimeFlags(rawValue: 1), epoch: 0)
sampleBuffer.totalSampleSize: 1882

However, the final output init.mp4 and *.m4s files contain no audio (the video plays without any problems). Could you tell me what might be wrong, or give any hints as to why the audio is not included?

/// Capture session
let captureSession = AVCaptureSession()

/// Capture devices
var videoDevice: AVCaptureDevice?
var audioDevice: AVCaptureDevice?

/// Configure and start the capture session.
func startCapture() {
    captureSession.beginConfiguration()

    // Video input
    videoDevice = self.defaultCamera(cameraSide: cameraSide)
    videoDevice!.activeVideoMinFrameDuration = CMTimeMake(value: 1, timescale: 30)
    let videoInput = try! AVCaptureeDeviceInput(device: videoDevice!)
    captureSession.addInput(videoInput)

    // Audio input
    audioDevice = AVCaptureDevice.default(for: .audio)
    let audioInput = try! AVCaptureDeviceInput(device: audioDevice!)
    captureSession.addInput(audioInput)

    // Video output
    let videoDataOutput = AVCaptureVideoDataOutput()
    videoDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    videoDataOutput.alwaysDiscardsLateVideoFrames = true
    videoDataOutput.videoSettings = [
        kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange)
    ]
    captureSession.addOutput(videoDataOutput)

    // Audio output
    let audioDataOutput = AVCaptureAudioDataOutput()
    audioDataOutput.setSampleBufferDelegate(self, queue: recordingQueue)
    captureSession.addOutput(audioDataOutput)

    captureSession.commitConfiguration()
    captureSession.startRunning()
}

private var assetWriter: AVAssetWriter?
private var startTimeOffset: CMTime = .zero
private var startTime: CMTime = .zero
private var pixelBuffer: AVAssetWriterInputPixelBufferAdaptor?
private var audioWriterInput: AVAssetWriterInput?
private var videoWriterInput: AVAssetWriterInput?

func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
    if assetWriter == nil {
        assetWriter = AVAssetWriter(contentType: UTType(AVFileType.mp4.rawValue)!)
        self.startTimeOffset = CMTime(value: 1, timescale: 1)

        // Audio input settings
        let audioCompressionSettings: [String: Any] = [
            AVFormatIDKey: kAudioFormatMPEG4AAC,
            AVSampleRateKey: 44_100,
            AVNumberOfChannelsKey: 1,
            AVEncoderBitRateKey: 128_000
        ]
        audioWriterInput = AVAssetWriterInput(mediaType: .audio, outputSettings: audioCompressionSettings)
        audioWriterInput!.expectsMediaDataInRealTime = true
        assetWriter!.add(audioWriterInput!)

        // Video input settings
        let videoCompressionSettings: [String: Any] = [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720,
            AVVideoCompressionPropertiesKey: [
                kVTCompressionPropertyKey_AverageBitRate: 1_024_000,
                kVTCompressionPropertyKey_ProfileLevel: kVTProfileLevel_H264_Baseline_AutoLevel
            ]
        ]
        videoWriterInput = AVAssetWriterInput(mediaType: .video, outputSettings: videoCompressionSettings)
        videoWriterInput!.expectsMediaDataInRealTime = true
        pixelBuffer = AVAssetWriterInputPixelBufferAdaptor(assetWriterInput: videoWriterInput!,
                                                           sourcePixelBufferAttributes: [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)])
        assetWriter!.add(videoWriterInput!)

        // Configure the asset writer for fragmented MPEG-4 (Apple HLS) output.
        assetWriter!.outputFileTypeProfile = .mpeg4AppleHLS
        assetWriter!.preferredOutputSegmentInterval = CMTime(seconds: 1.0, preferredTimescale: 1)
        assetWriter!.initialSegmentStartTime = startTimeOffset
        assetWriter!.delegate = self

        // Start the asset writer at the first buffer's timestamp.
        startTime = CMSampleBufferGetPresentationTimeStamp(sampleBuffer)
        assetWriter!.startWriting()
        assetWriter!.startSession(atSourceTime: startTime)
    }

    let isVideo = output is AVCaptureVideoDataOutput
    if isVideo {
        if videoWriterInput!.isReadyForMoreMediaData {
            videoWriterInput!.append(sampleBuffer)
        }
    } else {
        if audioWriterInput!.isReadyForMoreMediaData {
            audioWriterInput!.append(sampleBuffer)
        }
    }
}

func assetWriter(_ writer: AVAssetWriter, didOutputSegmentData segmentData: Data, segmentType: AVAssetSegmentType, segmentReport: AVAssetSegmentReport?) {
    // ...
}
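A small diagnostic sketch that may help localize the audio problem; checking the writer's state around each append is an assumption-driven debugging aid, not a known fix. One thing worth ruling out: buffers timestamped before the session start time are trimmed, so comparing the first audio PTS against the startSession(atSourceTime:) value can show whether early audio is being silently discarded:

// Hypothetical helper: append a buffer and surface silent failures.
func checkedAppend(_ sampleBuffer: CMSampleBuffer,
                   to input: AVAssetWriterInput,
                   writer: AVAssetWriter,
                   label: String) {
    guard input.isReadyForMoreMediaData else {
        print("\(label): buffer dropped, input not ready")
        return
    }
    if !input.append(sampleBuffer) || writer.status == .failed {
        print("\(label): append failed, error = \(String(describing: writer.error))")
    }
}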
0 replies · 0 boosts · 614 views · Dec ’23
Seeking Guidance on Creating a Playlist Navigation App with OAuth Authentication
Hello Apple Community, First, I'm asking this over the holidays, so happy holidays. I have some "fun coding time" because of the holiday break. I'm new to this area and would greatly appreciate your expertise and guidance. Have mercy. I'm attempting to develop a simple macOS application to browse my playlists. However, I've encountered a few roadblocks that I'm struggling to navigate. I understand I need to implement two-factor authentication to access my playlists, for which an OAuth setup is required. This involves obtaining an Apple ID and a service ID and dealing with other complex elements. One particular challenge is the redirect URI in the OAuth process: it seems it needs to be a valid domain, and I'm unsure whether a local server address like https://localhost:5000 would work. My goal is to create a basic Flask application that can interact with Apple's web authentication system, but I'm uncertain about the feasibility of this approach given the domain restrictions. I would appreciate advice or step-by-step guidance on one point in particular: what would be the simplest way to create an application (Swift, Python, or JavaScript) that can authenticate and browse my playlists? Any insights, tips, or examples would be immensely helpful. Anything step-by-step would be great, but I like to dream. Thank you so much for your time and assistance.
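If the goal is just browsing your own playlists from a native app, the MusicKit Swift framework sidesteps the OAuth redirect-URI machinery entirely; authorization is a single async call. A minimal sketch, assuming an app target with the MusicKit app service enabled and a media library usage description set (the error type is a placeholder):

import MusicKit

enum PlaylistError: Error { case notAuthorized }   // placeholder

// Ask the user for Apple Music permission, then fetch the library playlists.
func fetchLibraryPlaylists() async throws -> MusicItemCollection<Playlist> {
    guard await MusicAuthorization.request() == .authorized else {
        throw PlaylistError.notAuthorized
    }
    let request = MusicLibraryRequest<Playlist>()
    let response = try await request.response()
    return response.items
}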
1 reply · 0 boosts · 618 views · Dec ’23
Rejection for Guideline 2.5.1 - Performance - Software Requirements
Hello, We have a Photo Vault app. We were hiding users' photos behind a semi-functional calculator. After rejection, we assumed "decoy functionality" meant we needed to remove this fake calculator feature. We removed it and tried many things to resolve the issue, but we can't understand what Apple wants us to change. We've been trying to contact Apple for more details, but they keep sending the same message every time. Help appreciated. Here is the rejection message: Your app uses public APIs in an unapproved manner, which does not comply with guideline 2.5.1 of the App Store Review Guidelines. Specifically, we found that your app uses a decoy functionality to hide a user's photos, which is not an appropriate use of the Photos API. Since there is no accurate way of predicting how an API may be modified and what effects those modifications may have, Apple does not permit unapproved uses of public APIs in App Store apps. Next Steps: Please revise your app to ensure that documented APIs are used in the manner prescribed by Apple. It would be appropriate to remove any features in your app that use a decoy functionality to hide a user's photos from your app. If there are no alternatives for providing the functionality your app requires, you can use Feedback Assistant to submit an enhancement request.
1 reply · 1 boost · 655 views · Dec ’23
MusicKit API issues
I'm trying to test the MusicMarathon and MusicAlbums tutorial/demo apps for MusicKit, and half the endpoints do not work. As an example, the MusicMarathon call to MusicRecentlyPlayedContainerRequest() just returns a 401. Everything I've done seems correct: I've got a fully authorized session, and I have all the development keys successfully set up. Also, it's not all APIs, as I can access the user's library; just none of the recommendation and search endpoints seem to be working correctly. I'm running iOS 17.2 and Xcode 15.1. I'm pretty certain this is easily repeatable by just running the demo applications from the MusicKit documentation.
0 replies · 0 boosts · 417 views · Dec ’23
Requesting Assistance with MusicKit Authorization and Account Monitor Registration Issue
2 days and I am frustrated. I've crossed my T's and dotted my I's. Using MusicKit.

Error:
Attempted to register account monitor for types client is not authorized to access: {( "com.apple.account.iTunesStore" )}

Offending code:
var request = MusicLibraryRequest<MusicKit.Playlist>()
request.sort(by: .lastPlayedDate, ascending: false)
let response = try await request.response()

Verified:
Custom iOS target properties: Privacy - Media Library Usage Description
Correct bundle identifier
Checked App Services/MusicKit for the App ID

Please help! 2 days of racking my brain, and I just can't get past this error. MusicKit does ask me to authorize. Other code works:
let request = MusicRecentlyPlayedContainerRequest()
let response = try await request.response()

See image.
2 replies · 2 boosts · 944 views · Dec ’23
Some album artwork from MPMediaItem displays as nil
Hey there, I'm trying to display all of a user's albums using the MediaPlayer library. Many albums return nil artwork, but I know the artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, but only some cloud album artwork displays. Here's the code I'm using to debug this:

let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}
for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}

Any help would be greatly appreciated. Thanks!
2 replies · 1 boost · 606 views · Jan ’24
ClassInfo Audio Unit Property not being set
I have a music player that is able to save and restore AU parameters using the kAudioUnitProperty_ClassInfo property. For non-Apple AUs, this works fine. But for any of the Apple units, the class info can be set only the first time after the audio graph is built; subsequent sets of the property do not stick, even though the returned OSStatus code is 0. Previously this worked fine, but at some point, I'm not sure when, the Apple-provided AUs changed their behavior, and this is now causing me problems. Can anyone help shed light on this? Thanks in advance for the help. Jeff Frey
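A minimal sketch of verifying whether a set actually stuck, by reading kAudioUnitProperty_ClassInfo back immediately afterwards; the raw-pointer plumbing follows the usual AudioToolbox C-API pattern and is an illustration for debugging, not a fix for the behavior change:

import AudioToolbox

// Restore a saved preset dictionary, then read ClassInfo back to check
// whether the Apple AU silently ignored the set (the status alone is noErr).
func restoreAndVerifyClassInfo(_ saved: CFPropertyList, on unit: AudioUnit) -> Bool {
    var plist: CFPropertyList? = saved
    var status = AudioUnitSetProperty(unit, kAudioUnitProperty_ClassInfo,
                                      kAudioUnitScope_Global, 0,
                                      &plist, UInt32(MemoryLayout<CFPropertyList?>.size))
    guard status == noErr else { return false }

    var readBack: Unmanaged<CFPropertyList>?
    var size = UInt32(MemoryLayout<Unmanaged<CFPropertyList>?>.size)
    status = AudioUnitGetProperty(unit, kAudioUnitProperty_ClassInfo,
                                  kAudioUnitScope_Global, 0, &readBack, &size)
    guard status == noErr, let current = readBack?.takeRetainedValue() else { return false }
    return CFEqual(current, saved)
}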
0 replies · 0 boosts · 725 views · Jan ’24
MusicKit: Error when performing a MusicLibrarySectionedRequest using some MusicLibraryRequestable types
When making a library sectioned request, some MusicLibraryRequestable types result in a MusicKit.MusicLibraryRequestError being thrown. When Playlist is used as the MusicLibrarySectionRequestable type, no MusicLibraryRequestable type other than Track can be used for the request. For others, Artist & Genre cannot be used. Is there a way to work around this issue? The (seemingly) equivalent functionality in MediaPlayer (MPMediaQuery and MPMediaGrouping) was very consistent and reliable. Full error info: MusicKit.MusicLibraryRequestError.invalidType, The operation couldn't be completed. (MusicKit.MusicLibraryRequestError error 1.) Device and OS: iPhone 13 Pro, iOS 17.2.1
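For context, a sketch of the one combination reported to work (Playlist sections containing Track items); the exact shape of the sectioned response below is an assumption based on MusicKit naming conventions, not confirmed API:

import MusicKit

// Group library Tracks into Playlist sections; other ItemTypes reportedly
// throw MusicLibraryRequestError.invalidType with Playlist sections.
let request = MusicLibrarySectionedRequest<Playlist, Track>()
let response = try await request.response()
for section in response.sections {            // assumed response shape
    print(section.items.count)
}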
0 replies · 0 boosts · 352 views · Jan ’24