Application crashed: EXC_BAD_ACCESS KERN_INVALID_ADDRESS 0x000000000000001e
Crashed: com.apple.main-thread
0 libobjc.A.dylib 0x2df58 object_isClass + 16
1 Foundation 0x1c9bc KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED + 76
2 Foundation 0x1bd60 NSKeyValueWillChangeWithPerThreadPendingNotifications + 300
3 AVFoundation 0x1380 -[AVPlayerAccessibility willChangeValueForKey:] + 72
4 AVFCore 0x13954 -[AVPlayer _noteNewPresentationSizeForPlayerItem:] + 48
5 AVFCore 0x1fbb0 __avplayeritem_fpItemNotificationCallback_block_invoke + 4336
6 libdispatch.dylib 0x26a8 _dispatch_call_block_and_release + 32
7 libdispatch.dylib 0x4300 _dispatch_client_callout + 20
8 libdispatch.dylib 0x12998 _dispatch_main_queue_drain + 984
9 libdispatch.dylib 0x125b0 _dispatch_main_queue_callback_4CF + 44
10 CoreFoundation 0x3701c CFRUNLOOP_IS_SERVICING_THE_MAIN_DISPATCH_QUEUE + 16
11 CoreFoundation 0x33d28 __CFRunLoopRun + 1996
12 CoreFoundation 0x33478 CFRunLoopRunSpecific + 608
13 GraphicsServices 0x34f8 GSEventRunModal + 164
14 UIKitCore 0x22c62c -[UIApplication _run] + 888
15 UIKitCore 0x22bc68 UIApplicationMain + 340
16 UIKitCore 0x4563d0 __swift_destroy_boxed_opaque_existential_1Tm + 12220
17 AajTak 0x84c4 main + 4333552836 (QuizLeaderboardViewModel.swift:4333552836)
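The frame names in the trace point at KVO observer lifetime: KVO_IS_RETAINING_ALL_OBSERVERS_OF_THIS_OBJECT_IF_IT_CRASHES_AN_OBSERVER_WAS_OVERRELEASED_OR_SMASHED says outright that an observer of this AVPlayer (here, being notified of a presentation-size change) was deallocated or over-released while still registered. A minimal sketch of the block-based KVO pattern that avoids this, using an illustrative observer class (not taken from the crashing app):

import AVFoundation

final class PlayerItemObserver {
    private var sizeObservation: NSKeyValueObservation?

    func observe(_ item: AVPlayerItem) {
        // Block-based KVO ties the registration to this token's lifetime.
        sizeObservation = item.observe(\.presentationSize, options: [.new]) { _, change in
            print("presentation size:", change.newValue ?? .zero)
        }
    }

    deinit {
        // Invalidate before deallocation so KVO never messages a freed observer.
        sizeObservation?.invalidate()
    }
}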
Hi Team,
We're seeing an issue with this version of CoreMedia requesting multiple qualities at all times for a stream; in the access logs below, the same client fetches video segments from the 720p variant while pulling audio segments from the 1080p variant. We don't see this issue on 1.0.0.21C62. We are unsure what is causing it.
[2024-01-05 16:53:51] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1145 2529 1090199 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=0 HTTP/1.0" 200 1146 2396 1013356 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_0.fmp4 HTTP/1.0" 200 1139 24975 1013385 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1145 2603 998670 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:52] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_1.fmp4 HTTP/1.0" 200 1138 40534 998739 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1145 2677 835327 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_2.fmp4 HTTP/1.0" 200 1138 57656 835207 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=1 HTTP/1.0" 200 1146 2458 986038 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:53] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_1.fmp4 HTTP/1.0" 200 1139 24700 986032 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1145 2751 1013257 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_3.fmp4 HTTP/1.0" 200 1138 55900 1013324 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=2 HTTP/1.0" 200 1146 2520 1016693 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:54] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_2.fmp4 HTTP/1.0" 200 1139 25014 1016717 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1145 2825 917753 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_4.fmp4 HTTP/1.0" 200 1138 103745 917903 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=3 HTTP/1.0" 200 1146 2582 958102 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:55] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_3.fmp4 HTTP/1.0" 200 1139 24782 958195 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1145 2899 931101 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_720p/v_1058_2452140000_630_5.fmp4 HTTP/1.0" 200 1138 112113 931228 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=4 HTTP/1.0" 200 1146 2644 935550 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:56] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_4.fmp4 HTTP/1.0" 200 1139 24824 937720 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/audio.m3u8?_HLS_msn=630&_HLS_part=5 HTTP/1.0" 200 1146 2706 895680 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_1080p/a_1459_2452140264_630_5.fmp4 HTTP/1.0" 200 1139 24843 895734 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
[2024-01-05 16:53:57] "GET /live/cinecanal/live/cinecanal_720p/video.m3u8?_HLS_msn=631&_HLS_part=0 HTTP/1.0" 200 1145 2529 907045 "-" "AppleCoreMedia/1.0.0.21B101 (iPhone; U; CPU OS 17_1_2 like Mac OS X; en_us)"
I'm using the new ApplicationMusicPlayer support on macOS 14 and playing items from my Apple Music library. I wanted to play this music from my app to an AirPlay destination, so I added an AVRoutePickerView. However, selecting any destination via this view doesn't make a difference to the playback: it continues to play on my Mac's speakers no matter which AirPlay destination I choose.
Also submitted as FB13521393.
I have downloaded the official Apple MusicKit SDK for Android and integrated the two AARs it contains (musickitauth-release-1.1.2.aar and mediaplayback-release-1.1.1.aar) into my app. When I try to build the app I get this error:
Manifest merger failed : android:exported needs to be explicitly specified for element <activity#com.apple.android.sdk.authentication.SDKUriHandlerActivity>. Apps targeting Android 12 and higher are required to specify an explicit value for android:exported when the corresponding component has an intent filter defined. See https://developer.android.com/guide/topics/manifest/activity-element#exported for details.
Which makes sense: when I look into the AAR's AndroidManifest.xml, I see that android:exported is missing from SDKUriHandlerActivity. Can this be fixed?
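Until the SDK's manifest is updated, a common manifest-merger workaround (a suggestion, not an official fix) is to override the SDK's activity entry from the app's own AndroidManifest.xml:

<manifest xmlns:android="http://schemas.android.com/apk/res/android"
    xmlns:tools="http://schemas.android.com/tools">
    <application>
        <!-- Merge android:exported into the SDK's declaration of this activity -->
        <activity
            android:name="com.apple.android.sdk.authentication.SDKUriHandlerActivity"
            android:exported="true"
            tools:node="merge" />
    </application>
</manifest>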
I'm writing an Android app that uses the Apple MusicKit SDK for Android. I am trying to understand how to handle the Apple Music user token once I've obtained it from the authentication flow. I don't know when the token will expire; it is not a regular JWT, so I cannot check its expiration date. And I don't want to run the auth flow on every app launch, which would be annoying for users. Any guidance on how to handle and invalidate Apple Music user tokens?
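Since the token is opaque, one workable pattern (a sketch under my own assumptions, not official SDK guidance) is to persist the token and treat an HTTP 403 from the Apple Music API as the signal to re-run the auth flow:

import java.net.HttpURLConnection
import java.net.URL

// Call off the main thread. /v1/me/storefront is a lightweight personalized
// endpoint that requires a valid Music-User-Token.
fun isUserTokenStillValid(developerToken: String, userToken: String): Boolean {
    val conn = URL("https://api.music.apple.com/v1/me/storefront")
        .openConnection() as HttpURLConnection
    conn.setRequestProperty("Authorization", "Bearer $developerToken")
    conn.setRequestProperty("Music-User-Token", userToken)
    return conn.responseCode != 403 // 403 => token revoked/expired; re-authenticate
}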
There is a method setPreferredInput in AVAudioSession that can be used to select a different input device. But is there a similar function, like "setPreferredOutput", so that in my app I can select a specific audio output device to play audio?
I do not want the user to change it through system interfaces (such as the Control Center), but through logic inside the app.
Thanks!
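As far as I know, AVAudioSession has no setPreferredOutput counterpart; the only in-app output override it exposes is the built-in speaker, and any other route has to be chosen by the user (e.g. via AVRoutePickerView). A sketch of what is available:

import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    // The sole programmatic output override: built-in speaker vs. the default route.
    try session.overrideOutputAudioPort(.speaker)
} catch {
    print("route override failed: \(error)")
}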
Hello all,
I am completely new to iOS development, and I need to make an app that plays content encrypted with FairPlay. Right now I am using the sample client in the latest FPS Server SDK (HLSCatalog), and from what I understand the default sources in Streams.plist are not protected. There's an entry with is_protected=YES where I can replace the playlist_url, but I was wondering: could I use the .m3u8 files from the FPS Test Content found at https://developer.apple.com/streaming/fps/ ? If so, I don't really know how to.
In one of the .m3u8 files of the Test Content, I found the "content_key_id_list" example value that's also mentioned in the client's README (skd://twelve), which is why I'm asking.
Thanks in advance!
I am creating a camera app in which I would like music from another app (Apple Music, Spotify, etc.) to continue playing once my app is opened. Currently I am using .mixWithOthers to do this in my viewDidLoad:
let audioSession = AVAudioSession.sharedInstance()
do {
    // Playback (not recording) with mixing, so other apps' audio keeps playing.
    try audioSession.setCategory(.playback, options: [.mixWithOthers])
    try audioSession.setActive(true)
} catch {
    print("failed to configure the audio session: \(error)")
}
However, I am running into an issue where the music only plays if you resume playback after you start recording a video. Otherwise, when you open the app, the music stops as soon as the preview appears. Interestingly, if you start playing music while recording, the music keeps playing in the preview view once you stop. If you close the app (not force close) and reopen it, music playback continues as expected; once you force close the app, it returns to the original behavior. I've tried to research this but have not found anything. Any help is appreciated, and let me know if more details are needed.
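One thing that may be worth checking, since the capture setup isn't shown above (so this is an assumption): whether the AVCaptureSession is reconfiguring the shared audio session when the preview starts, which would stop other apps' audio despite .mixWithOthers. Here captureSession stands for the session the app actually runs:

// Configure before calling startRunning():
captureSession.automaticallyConfiguresApplicationAudioSession = false
// Share the app's own AVAudioSession (with its .mixWithOthers option) instead of
// letting the capture session install its own configuration.
captureSession.usesApplicationAudioSession = true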
Hi! I've been working on a project in Python that pulls in a bunch of my personal Apple Music playback history, library, and so on.
I can't find a single good, functional example of how to pull the Music User Token via the Android method or MusicKit JS (web). I've spent many hours on this today, and no permutation of the existing examples and documentation has worked.
Any guidance would be much appreciated!! If you have a web app that pulls the music user token, I just need help understanding how to get to the token itself.
Thank you!
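For the web path, a minimal MusicKit JS sketch of how I understand token retrieval works (v1-style API; the developer token and app metadata are placeholders you supply):

document.addEventListener('musickitloaded', async () => {
  MusicKit.configure({
    developerToken: 'YOUR_DEVELOPER_JWT',
    app: { name: 'MyApp', build: '1.0' },
  });
  const music = MusicKit.getInstance();
  // authorize() runs the Apple sign-in flow and resolves to the Music User Token.
  const userToken = await music.authorize();
  console.log('Music-User-Token:', userToken); // also available as music.musicUserToken
});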
I have applied some filters (like applyingGaussianBlur) to a CIImage that was converted from a UIImage. The resulting image data gets corrupted, but only on lower-end devices. What could be the reason?
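Hard to diagnose without the rendering code, but a frequent culprit is displaying the filtered CIImage without an explicit CIContext render and without handling the blur's expanded extent. A defensive sketch (the sigma value is illustrative):

import CoreImage
import UIKit

func blurred(_ source: UIImage, sigma: Double = 10) -> UIImage? {
    guard let input = CIImage(image: source) else { return nil }
    let output = input
        .clampedToExtent()                  // avoids transparent-edge artifacts
        .applyingGaussianBlur(sigma: sigma)
        .cropped(to: input.extent)          // the blur expands the extent; crop back
    // Render through an explicit CIContext rather than wrapping in UIImage(ciImage:).
    let context = CIContext()
    guard let cgImage = context.createCGImage(output, from: output.extent) else { return nil }
    return UIImage(cgImage: cgImage, scale: source.scale, orientation: source.imageOrientation)
}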
This can be reproduced easily with Xcode's generated AUv3-Extension projects.
For MIDI Processor type AUv3 extensions, the contextName property is only set once, during initialization, when the extension is added as a MIDI FX in Logic Pro, but it is not updated after the track's name is changed manually.
For Music Effect type AUv3 extensions, contextName is set initially when the extension is added as an Audio FX in Logic Pro, and it is updated as expected after the track's name is changed manually.
Am I missing something or is this a Logic Pro bug?
Thanks,
Tobias
If I attach a FairPlay content key to my CMSampleBuffer using AVSampleBufferAttachContentKey, can I then render it using AVSampleBufferDisplayLayer?
I am creating an app where you can record a video while listening to music in the background. At the top of my viewDidLoad I set the AVAudioSession category to .playAndRecord:
let audioSession = AVAudioSession.sharedInstance()
// Note: this sets the property on a brand-new, unused session; to have any effect
// it must be set on the capture session the app actually runs.
AVCaptureSession().automaticallyConfiguresApplicationAudioSession = false
do {
    try audioSession.setCategory(.playAndRecord, options: [.mixWithOthers, .allowAirPlay, .allowBluetoothA2DP])
    try audioSession.setActive(true)
} catch {
    print("failed to configure the audio session: \(error)")
}
However, when I do this, the audio cuts out for a second or less at app open and app close. I would like the audio to continue playing without cutting out. Is there anything I can do to ensure the audio keeps playing?
I found that didOutputSampleBuffer is not called for a long time when the screen content doesn't change, which sometimes leaves me wondering whether something has gone wrong.
In my design, the app falls back to another screenshot method, such as CGWindowListCreateImage, when no data arrives for a long time. But this is not what I expected.
I set the minimumFrameInterval to 1/30 second, but it doesn't seem to have any effect:
[config setMinimumFrameInterval:CMTimeMake(1, 30)];
Is there any setting that would get didOutputSampleBuffer called at least once per second, even if it only delivers a CMSampleBufferRef with status SCFrameStatusIdle? That would let me trust that everything is working without any exception.
I'm working on a custom spatial video player that uses AVSampleBufferDisplayLayer as its render layer. When I feed it CMSampleBuffers output from a VTCompressionSession using the new encoding API, it displays normally, but I don't know whether it will also work on Vision Pro. Does anyone have any ideas?
Hi,
I am looking at displaying some spatial video content captured on iPhone 15 Pros in a side-by-side format. I've read the HEVC Stereo Video Profile provided by Apple, but I am confused about how to access the left-eye and right-eye video. Looking at the AVAsset track information, there is one video track, one sound track, and three metadata tracks.
Apple's document references the eyes as layers, but I am unsure how to access them. Could anyone provide some guidance on accessing them?
Thanks,
Will
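A sketch of the decode-side approach as I understand it from the MV-HEVC additions in iOS 17 (videoTrack is assumed to be the asset's single video track; layer IDs 0 and 1 are illustrative, the real IDs come from the track's format description): the two eyes are coded as layers of that one track, and you request them when reading:

import AVFoundation
import VideoToolbox

let outputSettings: [String: Any] = [
    AVVideoDecompressionPropertiesKey: [
        kVTDecompressionPropertyKey_RequestedMVHEVCVideoLayerIDs as String: [0, 1]
    ]
]
// Each delivered sample buffer then carries the requested layers as tagged
// buffers, one pixel buffer per eye.
let trackOutput = AVAssetReaderTrackOutput(track: videoTrack, outputSettings: outputSettings)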
I'm currently working on a live screen broadcasting app that lets users record their screen and save an MP4 video. I write the video file with AVAssetWriter, and it works fine. But when there is only 1-2 GB of storage space remaining on the device, errors such as "Attempted to start an invalid broadcast session" frequently occur, and the video files cannot be played because assetWriter.finishWriting() is never called.
This occurs on these devices:
iPhone SE 3
iPhone 12 Pro Max
iPhone 13
iPad 19
iPad Air 5
I have tried setting AVAssetWriter's movieFragmentInterval to write movie fragments, and setting shouldOptimizeForNetworkUse to true/false, but neither works: the video cannot be played.
I want to know how to observe or catch this error. Thanks!
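Disk-full failures surface through AVAssetWriter's status and error properties rather than anything catchable, so two explicit checks may help (a sketch; the 500 MB threshold is arbitrary):

import AVFoundation

// 1. When an append fails, inspect the writer instead of continuing:
if assetWriter.status == .failed {
    // For exhausted storage this is typically a disk-full AVFoundation error.
    print("asset writer failed: \(String(describing: assetWriter.error))")
}

// 2. Monitor free space while recording and stop cleanly before it runs out:
let home = URL(fileURLWithPath: NSHomeDirectory())
if let free = try? home.resourceValues(forKeys: [.volumeAvailableCapacityForImportantUsageKey]).volumeAvailableCapacityForImportantUsage,
   free < 500_000_000 {
    // End the broadcast and call finishWriting(completionHandler:) while it can still succeed.
}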
Hello, I am experiencing an unexpected issue related to PhotoKit, where an occasional crash occurs in an area where it shouldn't. I've provided the crash log below for reference:
Last Exception Backtrace:
0 CoreFoundation 0x1aa050860 __exceptionPreprocess + 164
1 libobjc.A.dylib 0x1a2363c80 objc_exception_throw + 60
2 CoreFoundation 0x1a9f68218 -[__NSDictionaryM setObject:forKeyedSubscript:] + 1184
3 Photos 0x1c11cb178 -[PHCloudIdentifierLookup _lookupCodeSpecificCloudIdentifierStrings:forIdentifierCode:] + 572
4 Photos 0x1c11cb278 -[PHCloudIdentifierLookup _lookupLocalIdentifiersForIdentifierCode:codeSpecificCloudIdentifierStrings:] + 92
5 Photos 0x1c11cbbec __69-[PHCloudIdentifierLookup lookupLocalIdentifiersForCloudIdentifiers:]_block_invoke + 88
6 CoreFoundation 0x1a9f67d70 NSDICTIONARY_IS_CALLING_OUT_TO_A_BLOCK + 24
7 CoreFoundation 0x1a9f67bec -[__NSDictionaryM enumerateKeysAndObjectsWithOptions:usingBlock:] + 288
8 Photos 0x1c11cbb40 -[PHCloudIdentifierLookup lookupLocalIdentifiersForCloudIdentifiers:] + 488
9 Photos 0x1c125e6e4 -[PHPhotoLibrary(CloudIdentifiers) localIdentifierMappingsForCloudIdentifiers:] + 96
10 Photos 0x1c1104f18 0x1c1101000 + 16152
"exceptionReason" : {"arguments":["-[__NSDictionaryM setObject:forKeyedSubscript:]"],"format_string":"*** %s: key cannot be nil","name":"NSInvalidArgumentException","type":"objc-exception","composed_message":"*** -[__NSDictionaryM setObject:forKeyedSubscript:]: key cannot be nil","class":"NSException"},
The crash can be recreated by calling PHPhotoLibrary.localIdentifierMappings(for:) with random cloud identifiers, as demonstrated below:
let cloudIdentifiers = (0...10).map { _ in PHCloudIdentifier(stringValue: UUID().uuidString) }
let localIdsMap = PHPhotoLibrary.shared().localIdentifierMappings(for: cloudIdentifiers)
I've noticed that the method crashes consistently when an invalid cloud identifier is passed in. This surprises me, since the return type is [PHCloudIdentifier : Result<String, Error>], so I was anticipating an explicit error rather than an exception.
The issue can be reproduced on both Xcode 15.0 & iOS 17.2 and Xcode 14.3.1 & iOS 16.0.
While I'm fairly certain that only valid identifiers are used in my app, I would still like to know: is there a way to validate a cloud identifier before calling this method?
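I'm not aware of a documented validator for PHCloudIdentifier, so one defensive option is to trap the NSException around the lookup with an Objective-C shim (Swift can't catch it). This is a sketch of a hypothetical helper; the selector matches frame 9 of the backtrace above:

#import <Photos/Photos.h>

// Returns nil instead of crashing when the framework throws on a bad identifier.
static NSDictionary * _Nullable SafeLocalIdentifierMappings(PHPhotoLibrary *library,
        NSArray<PHCloudIdentifier *> *cloudIdentifiers) {
    @try {
        return [library localIdentifierMappingsForCloudIdentifiers:cloudIdentifiers];
    } @catch (NSException *exception) {
        NSLog(@"cloud identifier lookup raised %@: %@", exception.name, exception.reason);
        return nil;
    }
}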
I have a question about the Apple Music Preview app for Windows 11.
It has a setting called Sound Check.
Is that feature available on the Apple Music web player and the Apple Music Android app?
If not, is that a planned feature for those?