Explore the integration of media technologies within your app. Discuss working with audio, video, camera, and other media functionalities.

How does mediafilesegmenter generate AES-128 HLS?
I am using the command line below to generate AES-128 HLS:

mediafilesegmenter -iso-fragmented --encrypt-key-file=my.key -S -f /Volumes/Samsung/pattern/vision_pro/hls/*** /Volumes/Samsung/pattern/vision_pro/***.mov

but it always generates SAMPLE-AES, even after I removed -S:

#EXT-X-KEY:METHOD=SAMPLE-AES,URI="enc.key",IV=0x7316166d6a85f56f3d4606eaebc3aa44

How can I generate AES-128 HLS? Thanks.
0 replies · 0 boosts · 509 views · Apr ’24
Old ScreenCaptureKit sample only shows black screen on Sonoma
I have an old ScreenCaptureKit sample that I downloaded in Oct 2022. That sample worked in Oct 2022, but as of Apr 2024 it does not work on Sonoma 14.4.1 on an M1 MacBook: it only shows a black screen. I also downloaded the updated ScreenCaptureKit sample and tested it; it works on Sonoma 14.4.1 on the M1 MacBook. I noticed the latest sample has SCContentSharingPicker and other changes. My screen capture application is based on the old ScreenCaptureKit sample, and it only shows a black screen. Do I have to add SCContentSharingPicker and SCContentSharingPickerObserver to my application to capture the screen on Sonoma? Is the old way of capturing the screen without SCContentSharingPicker no longer supported on Sonoma?
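For reference, a minimal sketch of what adopting the picker looks like, assuming the SCContentSharingPicker / SCContentSharingPickerObserver API shape from the macOS 14 SDK (verify the exact signatures against your SDK); whether the picker is strictly required on Sonoma is the open question above:

```swift
import ScreenCaptureKit

// A minimal adoption sketch: the picker asks the user what to share and hands
// back an SCContentFilter you can use to create or update your SCStream.
final class PickerCoordinator: NSObject, SCContentSharingPickerObserver {
    func activate() {
        let picker = SCContentSharingPicker.shared
        picker.add(self)
        picker.isActive = true
        picker.present()
    }

    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didUpdateWith filter: SCContentFilter,
                              for stream: SCStream?) {
        // Build or update your SCStream with the user-selected filter here.
    }

    func contentSharingPicker(_ picker: SCContentSharingPicker,
                              didCancelFor stream: SCStream?) {
        // The user dismissed the picker without choosing content.
    }

    func contentSharingPickerStartDidFailWithError(_ error: Error) {
        print("Content sharing picker failed: \(error)")
    }
}
```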
0 replies · 0 boosts · 450 views · Apr ’24
"Remote call timed out" error when trying to play large collection of music items with MusicKit's ApplicationMusicPlayer
I am using MusicKit's ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error saying:

"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))

It doesn't matter whether the songs are downloaded to the device or not. I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, which makes that initializer unusable for me. I tried everything I could think of, including falling back on MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same.

typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared
let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
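One possible direction, sketched as a workaround rather than a confirmed fix: start playback from a smaller batch of entries and append the rest once playback is running. This assumes MusicPlayer.Queue's insert(_:position:) is available on the deployment target; the batch size of 100 is an arbitrary choice.

```swift
import MusicKit

// A workaround sketch: enqueue a small first batch, start playback, then
// append the remaining songs afterwards instead of preparing everything
// up front.
func playInBatches(_ songs: [Song]) async {
    let player = ApplicationMusicPlayer.shared
    let firstBatch = songs.prefix(100).map { ApplicationMusicPlayer.Queue.Entry($0) }
    player.queue = .init(firstBatch, startingAt: nil)
    do {
        try await player.play()
        for song in songs.dropFirst(100) {
            // Assumes the insert(_:position:) API; appends to the end of the queue.
            try await player.queue.insert(song, position: .tail)
        }
    } catch {
        print("Playback setup failed: \(error)")
    }
}
```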
0 replies · 0 boosts · 475 views · Apr ’24
Crash in QuartzCore - CA::Render::Encoder::grow(unsigned long) + 288
Hi team, some of our users are getting a crash in QuartzCore, but we are not sure of the exact reason for it. Can you please help us with it? The app is crashing in production. Xcode version: 15.0. Platform: iOS. Below is the crash stack trace.

Crashed: com.apple.main-thread
0 libsystem_kernel.dylib 0xa974 __pthread_kill + 8
1 libsystem_pthread.dylib 0x60ec pthread_kill + 268
2 libsystem_c.dylib 0x75b80 abort + 180
3 QuartzCore 0x98ba8 CA::Render::Encoder::grow(unsigned long) + 288
4 QuartzCore 0x97e50 CA::Render::Vector::encode(CA::Render::Encoder*) const + 112
5 QuartzCore 0x10a76c CA::Render::KeyframeAnimation::encode(CA::Render::Encoder*) const + 68
6 QuartzCore 0x975ec CA::Render::Array::encode(CA::Render::Encoder*) const + 172
7 QuartzCore 0x75204 CA::Context::commit_animation(CA::Layer*, CA::Render::Animation*, void*) + 236
8 QuartzCore 0x72998 CA::Layer::commit_animations(CA::Transaction*, double ()(CA::Layer, double, void*), void ()(CA::Layer, CA::Render::Animation*, void*), void ()(CA::Layer, __CFString const*, void*), CA::Render::TimingList* ()(CA::Layer, void*), void*) + 956
9 QuartzCore 0x2b930 invocation function for block in CA::Context::commit_transaction(CA::Transaction*, double, double*) + 148
10 QuartzCore 0x2b838 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 368
11 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
12 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
13 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
14 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
15 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
16 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
17 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
18 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
19 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
20 QuartzCore 0x2b7c4 CA::Layer::commit_if_needed(CA::Transaction*, void (CA::Layer*, unsigned int, unsigned int) block_pointer) + 252
21 QuartzCore 0x6f5b0 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 11212
22 QuartzCore 0x661bc CA::Transaction::commit() + 648
23 QuartzCore 0x65e64 CA::Transaction::flush_as_runloop_observer(bool) + 88
24 CoreFoundation 0x35d3c CFRUNLOOP_IS_CALLING_OUT_TO_AN_OBSERVER_CALLBACK_FUNCTION + 36
25 CoreFoundation 0x34738 __CFRunLoopDoObservers + 552
26 CoreFoundation 0x33e50 __CFRunLoopRun + 1028
27 CoreFoundation 0x33968 CFRunLoopRunSpecific + 608
28 GraphicsServices 0x34e0 GSEventRunModal + 164
29 UIKitCore 0x22aedc -[UIApplication _run] + 888
30 UIKitCore 0x22a518 UIApplicationMain + 340
31 SwiftUI 0x1033860 OUTLINED_FUNCTION_39 + 600
32 SwiftUI 0x10336a8 OUTLINED_FUNCTION_39 + 160
33 SwiftUI 0xc4f9fc get_witness_table 7SwiftUI4ViewRzlAA15ModifiedContentVyxAA30_EnvironmentKeyWritingModifierVySbGGAaBHPxAaBHD1__AgA0cI0HPyHCHCTm + 364
34 Evie Ring 0x324620 main + 10 (MovanoRingApp.swift:10)
35 ??? 0x1ad632d84 (Missing)
0 replies · 0 boosts · 469 views · Apr ’24
Compile error on MyTarget-Swift.h, unknown class name SCContentSharingPickerObserver
I am working on a ScreenCaptureKit sample with SCContentSharingPickerObserver. My target is SwiftUI based and calls an Objective-C class method. I added [MyTarget]-Bridging.h and [MyTarget]-Swift.h. I get a compile error of unknown class name SCContentSharingPickerObserver in [MyTarget]-Swift.h, but I do not know how to fix this error since [MyTarget]-Swift.h is an Xcode-generated file. My macOS deployment target is 14.0 and the Swift language version is 5. Does anyone know how to fix this error, or do I need to wait for an Xcode update?
0 replies · 0 boosts · 428 views · Apr ’24
Using "include=albums" on the catalog/<storefront>/songs endpoint with a filter causes 504, gateway timeouts.
When accessing the REST API, if you apply "include=albums" to a catalog/<storefront>/songs request with a filter on ISRC, the API will, without fail, return a 504 error status. If you remove "include=albums" and/or replace it with something like "include=artists", it works fine. It has been like this for months, and we need to get album details back with these requests. Could the Apple team please respond and verify the issue, as it's blocking production for us. Thanks.
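For anyone trying to reproduce this, a minimal request sketch with URLSession; the storefront ("us"), the ISRC value, and the developer token are placeholders, not values from the report above.

```swift
import Foundation

// Reproduction sketch: same endpoint shape as described above, with and
// without the "include=albums" parameter.
func fetchSongWithAlbums(developerToken: String) async throws {
    var components = URLComponents(string: "https://api.music.apple.com/v1/catalog/us/songs")!
    components.queryItems = [
        URLQueryItem(name: "filter[isrc]", value: "USUM71703861"),   // placeholder ISRC
        URLQueryItem(name: "include", value: "albums")               // removing this avoids the 504 described above
    ]
    var request = URLRequest(url: components.url!)
    request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")

    let (_, response) = try await URLSession.shared.data(for: request)
    if let http = response as? HTTPURLResponse {
        print("Status: \(http.statusCode)")   // reported as 504 when include=albums is present
    }
}
```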
0 replies · 0 boosts · 488 views · Apr ’24
Empty space with ScreenCaptureKit full-screen window recording
I am using ScreenCaptureKit to capture windows and record them, but from macOS Sonoma onwards I see weird behaviour when I try to capture a window that is in full-screen mode. The CMSampleBuffer returned by ScreenCaptureKit has empty space at the top of the full-screen window content, and the contentRect attachment in the CMSampleBuffer includes this empty space, so there is no way to know what the actual window content in the CMSampleBuffer is. The CaptureSample code provided by Apple does not enumerate full-screen windows; I made a change to it to enumerate full-screen windows, and the issue is reproduced there as well. Attaching an image showing the empty space. Has anybody encountered this issue?
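For reference, a minimal sketch of reading the per-frame contentRect attachment the post refers to, following the attachment-dictionary pattern used in Apple's CaptureSample code; it returns nil if the attachments are missing.

```swift
import ScreenCaptureKit
import CoreMedia

// Reads the contentRect metadata that ScreenCaptureKit attaches to each
// CMSampleBuffer it delivers.
func contentRect(of sampleBuffer: CMSampleBuffer) -> CGRect? {
    guard let attachmentsArray = CMSampleBufferGetSampleAttachmentsArray(sampleBuffer,
                                                                         createIfNecessary: false)
            as? [[SCStreamFrameInfo: Any]],
          let frameInfo = attachmentsArray.first,
          let rectDict = frameInfo[.contentRect] as? NSDictionary
    else { return nil }
    return CGRect(dictionaryRepresentation: rectDict as CFDictionary)
}
```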
4 replies · 0 boosts · 680 views · Apr ’24
HDR10 MVHEVC cannot play on Safari
Hi, I just generated an HDR10 MVHEVC file; mediainfo is below:

Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC

Then I generated the segment files with the command below:

mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov

and uploaded the segment files and prog_index.m3u8 to a web server. I find that I cannot play the HLS stream on Safari; the URL is http://ip/vod/prog_index.m3u8. I checked that if I remove the tag "Transfer characteristics : PQ" when generating the MVHEVC file, then run the same mediafilesegmenter command and upload the files to the web server, the new version of the HLS stream can be played on Safari. Is there any way to play HLS PQ video on Safari? Thanks.
1 reply · 1 boost · 509 views · May ’24
iOS 17 MusicKit Song lastPlayedDate is always nil
Hey, I've been trying to fetch my Apple Music recently played songs for an app I'm working on, and I want to access the lastPlayedDate field. If I'm not mistaken, this field should exist for a Song according to Apple's documentation: https://developer.apple.com/documentation/musickit/song/lastplayeddate However, whenever I try to fetch this data, the lastPlayedDate field is always nil. All the other data I'm looking for seems to fetch without issue. Here's the code I'm using:

// Request as described in Apple MusicKit:
// https://developer.apple.com/documentation/musickit/musicrecentlyplayedrequestable
var request = MusicRecentlyPlayedRequest<Song>()
request.limit = 30

do {
    let response = try await request.response()
    let songs = response.items.compactMap { song -> RecentlyPlayedSong? in
        let songName = song.title
        let songArtist = song.artistName
        let songAlbum = song.albumTitle
        let artwork: MusicArtworkType
        let preview_url = song.previewAssets?.first?.url?.absoluteString
        if let appleMusicArtwork = song.artwork {
            print("Found a song, \(song) with lastPlayedDate \(song.lastPlayedDate)")
            artwork = .AppleMusic(appleMusicArtwork)
            return RecentlyPlayedSong(name: songName,
                                      artist: songArtist,
                                      album: songAlbum,
                                      artwork: artwork,
                                      preview_url: preview_url,
                                      lastPlayedDate: song.lastPlayedDate ?? Date())
        }
        return nil
    }
} catch {
    print(error)
}

I'm trying to map the response into a custom struct I made, but here's a sample of what's getting printed to the logs:

Found a song, Song(id: "1676362342", title: "pwdr Blu (feat. Brother.)", artistName: "Kx5, deadmau5 & Kaskade") with lastPlayedDate nil
Found a song, Song(id: "881289980", title: "Worlds Apart (feat. Kerli)", artistName: "Seven Lions") with lastPlayedDate nil
Found a song, Song(id: "1501540431", title: "What’s Done Is Done", artistName: "Seven Lions & HALIENE") with lastPlayedDate nil

even though I just listened to these songs a few minutes ago. Has anyone ever run into this issue before? Are there any settings I need to look at changing to get this to show?
1 reply · 0 boosts · 529 views · May ’24
Generating an Apple Music API Developer Token
I'm trying to generate a developer token with the following code in Node.js:

const jwt = require('jsonwebtoken')
const { TEAM_ID, KID, APPLE_PRIVATE_KEY } = require('./secret')

const expiration = 36000;
const currentTime = Math.floor(Date.now() / 1000);
const expirationTime = currentTime + expiration;

const options = {
    algorithm: 'ES256',
    header: {
        alg: "ES256",
        kid: KID
    }
}

const payload = {
    iss: TEAM_ID,
    iat: currentTime,
    exp: expirationTime
}

const newToken = jwt.sign(payload, APPLE_PRIVATE_KEY, options)
console.log('1111111111111111111111', newToken)

When testing my newToken with curl, I'm getting a 401 response. Please help.
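For comparison, a minimal Swift sketch of the same ES256 signing using CryptoKit; the team ID, key ID, and PEM key are placeholders, and the 36000-second lifetime mirrors the snippet above. A 401 from the Apple Music API generally means the token's signature, key ID, or team ID did not validate, so producing the token a second way can help narrow down which input is wrong.

```swift
import Foundation
import CryptoKit

// Builds an ES256-signed developer token from a .p8 key pasted in PEM form.
func makeDeveloperToken(teamID: String, keyID: String, privateKeyPEM: String) throws -> String {
    func base64URL(_ data: Data) -> String {
        data.base64EncodedString()
            .replacingOccurrences(of: "+", with: "-")
            .replacingOccurrences(of: "/", with: "_")
            .replacingOccurrences(of: "=", with: "")
    }

    let now = Int(Date().timeIntervalSince1970)
    let header: [String: Any] = ["alg": "ES256", "kid": keyID]
    let payload: [String: Any] = ["iss": teamID, "iat": now, "exp": now + 36000]

    let signingInput = try base64URL(JSONSerialization.data(withJSONObject: header))
        + "." + base64URL(JSONSerialization.data(withJSONObject: payload))

    let key = try P256.Signing.PrivateKey(pemRepresentation: privateKeyPEM)
    // CryptoKit's rawRepresentation is the 64-byte r||s form that JWT ES256 expects.
    let signature = try key.signature(for: Data(signingInput.utf8))
    return signingInput + "." + base64URL(signature.rawRepresentation)
}
```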
0 replies · 0 boosts · 521 views · May ’24
How to addObserver for currentPlaybackTime for the system music player?
I'm using the systemMusicPlayer to play music and want to update the playback time using addObserver forKeyPath.

[self setMusicPlayer: [MPMusicPlayerController systemMusicPlayer]];

I've tried these two methods:

[self addObserver:self
       forKeyPath:@"musicPlayer.currentPlaybackTime"
          options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial
          context:&musicPlayer];

[self.musicPlayer addObserver:self
                   forKeyPath:@"currentPlaybackTime"
                      options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial
                      context:&musicPlayer];

I do get the initial values for currentPlaybackTime in:

-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context

but I never get any calls while the player is playing the song (the whole point). If I set currentPlaybackTime to a specific value (locating manually using a slider), I get calls with the values I set (useless, since I know what I am setting them to). How are we supposed to track the playback time without just polling it constantly?
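For what it's worth, a minimal polling sketch in Swift; currentPlaybackTime does not appear to be KVO-observable, which matches the behaviour described above, so this simply samples it on a timer while the player reports a playing state. The 0.5-second interval is an arbitrary choice.

```swift
import MediaPlayer

// Polls the system music player's position on a timer; call stop() when the
// observing UI goes away.
final class PlaybackTimeMonitor {
    private let player = MPMusicPlayerController.systemMusicPlayer
    private var timer: Timer?

    func start(onTick: @escaping (TimeInterval) -> Void) {
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self, self.player.playbackState == .playing else { return }
            onTick(self.player.currentPlaybackTime)
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}
```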
1 reply · 0 boosts · 593 views · May ’24
How to detect the end of playback with the system music player?
Since iOS 12 it has become difficult to detect the end of playback using the system music player. In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player stopped. In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused. Pressing pause in my car (or from any remote control) generates the same conditions, making it difficult to correctly detect the difference. It would be nice if they added a notification that playback finished (similar to the other players). Any suggestions?
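One heuristic sketch, not an official API: when the state-change notification reports a pause, compare the playback position with the now-playing item's duration to guess whether the song simply ran out rather than being paused by the user. The 1-second tolerance is an arbitrary assumption, and this may not cover every case the post describes.

```swift
import MediaPlayer

// Guesses "playback finished" from a pause that lands at the end of the item.
final class PlaybackEndDetector {
    private let player = MPMusicPlayerController.systemMusicPlayer

    func startObserving() {
        player.beginGeneratingPlaybackNotifications()
        NotificationCenter.default.addObserver(
            forName: .MPMusicPlayerControllerPlaybackStateDidChange,
            object: player,
            queue: .main
        ) { [weak self] _ in
            self?.checkForEndOfPlayback()
        }
    }

    private func checkForEndOfPlayback() {
        guard player.playbackState == .paused,
              let item = player.nowPlayingItem else { return }
        let remaining = item.playbackDuration - player.currentPlaybackTime
        if remaining < 1.0 {
            // Position is at (or very near) the item's end: likely finished, not a user pause.
            print("Playback appears to have finished")
        }
    }
}
```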
0 replies · 0 boosts · 535 views · May ’24
Screenshot with ScreenCaptureKit much larger than with Command-Shift-3
I am capturing a screenshot with SCScreenshotManager's captureImageWithFilter. The resulting PNG has the same resolution as a PNG taken with Command-Shift-3 (4112x2658) but is 10x larger (14.4 MB vs 1.35 MB). My SCStreamConfiguration uses the SCDisplay's width and height and sets the color space to kCGColorSpaceSRGB. I currently save to file by initializing an NSBitmapImageRep using initWithCGImage, then representing it as PNG with representationUsingType NSBitmapImageFileTypePNG, then writeToFile:atomically. Is there some configuration or compression I can use to bring the PNG size down, more closely in line with a Command-Shift-3 screenshot? Thanks!
1 reply · 0 boosts · 686 views · May ’24
AudioMidi.app / Music.app drift sync
When I connect my MacBook to my living room AirPort (older-gen wall wart) via the Music app, the music output in both rooms is synced. When I try to set up a Multi-Output Device in Audio MIDI Setup, I'm not able to get them synced. I'm outputting to the same devices, they're all on the same sample rate, and I've played with the various settings (Primary Clock Source and Drift Sync). What gives? How are these connections different? Intel MacBook Pro 2018 running Sonoma 14.5.
1 reply · 0 boosts · 627 views · May ’24
Filtering background process "windows" during capture
I am trying to take a screenshot of each running window on the screen independently (even if not in the foreground). I am using SCScreenshotManager's captureImageWithFilter and SCContentFilter's initWithDesktopIndependentWindow. When I do this for every SCWindow in SCShareableContent's windows, I do get all the actual windows, but I also get many mostly blank windows, or windows that no ordinary user would consider to be an actual window. These include dozens of windows with no title, some with the title "Focus Proxy", some for menu bar icons, the wallpaper, the desktop icons, etc. I've implemented a naive solution that filters out all windows that have no title, or whose owningApplication has no title or has a bundle ID in a hardcoded blocklist (e.g. "com.apple.controlcenter", "com.apple.wallpaper.agent"), and that helps, but it is far from robust and naturally fragile. Is there a recommended way to distinguish actual application windows, or an overall better approach here? Thanks!
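A heuristic sketch, not an official classification: filter on window layer, visibility, size, and title rather than a bundle-ID blocklist. Layer 0 is where ordinary document windows typically live; the 100-point size threshold is an arbitrary assumption.

```swift
import ScreenCaptureKit

// Keeps windows that look like normal app windows: layer 0, on screen,
// non-trivial size, and carrying a title.
func likelyAppWindows(from content: SCShareableContent) -> [SCWindow] {
    content.windows.filter { window in
        window.windowLayer == 0 &&
        window.isOnScreen &&
        window.frame.width > 100 && window.frame.height > 100 &&
        !(window.title ?? "").isEmpty
    }
}
```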
1 reply · 0 boosts · 565 views · May ’24