I still can't find any documentation on this, but have figured out it's caused by adding an AVPlayerViewController to an AirPlay screen.
if ([[UIScreen screens] count] > 1) {
    // Use the second screen (the AirPlay display) for the external window.
    externalScreen = [[UIScreen screens] objectAtIndex:1];
    externalWindow = [[UIWindow alloc] initWithFrame:[externalScreen bounds]];
    externalWindow.screen = externalScreen;

    externalView = [[UIView alloc] initWithFrame:[externalScreen bounds]];

    // The crash happens when the AVPlayerViewController's view is added below.
    movieAVPlayer = [[AVPlayerViewController alloc] init];
    [externalView addSubview:movieAVPlayer.view];
    [externalView bringSubviewToFront:movieAVPlayer.view];
    [externalWindow addSubview:externalView];
}
I've tried setting all of the following before adding the player view, but I always get the crash described above when it's added:
movieAVPlayer.showsPlaybackControls = YES;
movieAVPlayer.player.allowsExternalPlayback = YES;
movieAVPlayer.player.usesExternalPlaybackWhileExternalScreenIsActive = YES;
Anyone have any ideas of what else to try?
Here's how I'm doing it. So far so good.
- (void)MeTalkTPretty
{
    if (@available(iOS 17.0, *)) {
        [AVSpeechSynthesizer requestPersonalVoiceAuthorizationWithCompletionHandler:^(AVSpeechSynthesisPersonalVoiceAuthorizationStatus status) {
            NSLog(@"AVSpeechSynthesis requestPersonalVoiceWCH returned %lu", (unsigned long)status);
            if (status == AVSpeechSynthesisPersonalVoiceAuthorizationStatusAuthorized)
            {
                NSLog(@"Yippee! I can use personal voice");
                // Set a default in case none of the installed voices is a Personal Voice.
                voiceToUse = [AVSpeechSynthesisVoice voiceWithLanguage:AVSpeechSynthesisVoice.currentLanguageCode];
                for (AVSpeechSynthesisVoice *thisVoice in [AVSpeechSynthesisVoice speechVoices])
                {
                    // voiceTraits is an option set, so test the bit rather than comparing for equality.
                    if (thisVoice.voiceTraits & AVSpeechSynthesisVoiceTraitIsPersonalVoice)
                    {
                        voiceToUse = [AVSpeechSynthesisVoice voiceWithIdentifier:thisVoice.identifier];
                        NSLog(@"WINNER is personal voice %@", voiceToUse);
                    }
                }
            }
            else
            {
                NSLog(@"Darn, I can't use personal voice even though we're on iOS 17");
                voiceToUse = [AVSpeechSynthesisVoice voiceWithLanguage:AVSpeechSynthesisVoice.currentLanguageCode];
            }
        }];
    }
    else {
        NSLog(@"Darn, I can't use personal voice because this device isn't running iOS 17");
        voiceToUse = [AVSpeechSynthesisVoice voiceWithLanguage:AVSpeechSynthesisVoice.currentLanguageCode];
    }
}
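And once voiceToUse is set, it gets handed to an utterance like this (just a sketch, not my production code; the string and the speechSynthesizer ivar are placeholders):
// speechSynthesizer is an AVSpeechSynthesizer ivar created elsewhere.
AVSpeechUtterance *utterance = [AVSpeechUtterance speechUtteranceWithString:@"Testing my personal voice"];
utterance.voice = voiceToUse;
[speechSynthesizer speakUtterance:utterance];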
For anyone watching this thread, I'm interested in trying the nostr protocol as a better solution. No matter what I tried, I wasn't able to get the GameKit implementation to be consistent for more than 8 peers. At times, up to 32 worked great, but other times anything beyond 8 caused dropped connections or wildly delayed packets. I couldn't identify any correlated factors like peer broadband quality or location (or how much caffeine they'd had that morning). Success seemed totally random.
That's led me to want to try the nostr protocol, but I don't see any iOS examples using nostr for peer-to-peer messaging.
Can anyone offer a link or advice?
I'm having this issue as well. I've observed that a TestFlight install to a Series 3 watch will crash, while a TestFlight install of the same app from the same iPhone to an Apple Watch SE installs as expected.
BTW - if MPMusicPlayer is loaded with an array of Apple Music track store IDs via setQueueWithStoreIDs:, playback seems to progress as expected. Since I have several users with oldish iTunes libraries, I can't use setQueueWithStoreIDs:.
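For anyone curious, this is the pattern that does progress correctly for me (a minimal sketch; the store IDs are placeholders):
MPMusicPlayerController *player = [MPMusicPlayerController systemMusicPlayer];
[player setQueueWithStoreIDs:@[@"1440857781", @"1440857786"]]; // placeholder Apple Music store IDs
[player play];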
I've found the MPMusicPlayerControllerNowPlayingItemDidChange notification unreliable in iOS more often than not. My code keeps track of the playing item, watches for the end of play, reloads the queue with the next track, and plays it. All manual, all the time. I'm using the same code in the Mac M1 version of the app, so I don't run into this particular problem there.
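A minimal sketch of that manual bookkeeping, assuming a one-second timer, a musicPlayer ivar (systemMusicPlayer), and a hypothetical upNext mutable array of MPMediaItems:
- (void)checkForEndOfPlay
{
    // Called from the one-second timer; advance to the next track ourselves
    // instead of relying on the now-playing notification.
    NSTimeInterval remaining = musicPlayer.nowPlayingItem.playbackDuration - musicPlayer.currentPlaybackTime;
    if ([musicPlayer playbackState] == MPMusicPlaybackStatePlaying && remaining < 1.0 && upNext.count > 0)
    {
        MPMediaItemCollection *next = [MPMediaItemCollection collectionWithItems:@[upNext.firstObject]];
        [upNext removeObjectAtIndex:0];
        [musicPlayer setQueueWithItemCollection:next];
        [musicPlayer play];
    }
}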
Wish I had better news or a better overall experience to report. Sorry.
I added this code to my existing one second timer code and it works pretty well. As you mentioned, if running on an M1 Mac, pausing in the Music app will be overridden by this, but it functions as expected in iOS.
if ((previousPlaybackTime == musicPlayer.currentPlaybackTime) &&
    ([musicPlayer playbackState] == MPMusicPlaybackStatePlaying))
{
    NSLog(@">>> playback time is the same as one second ago and state is playing, so call play to trigger an update");
    [musicPlayer play];
}
// (previousPlaybackTime gets updated elsewhere in the timer callback after this check.)
Yikes. I'm really disappointed not only in this particular issue - I too have an existing app with existing users who are now mad at ME for something that Apple changed without notice or intent - but in Apple's support for small developers in general.
Have others noticed a significant uptick in problems and downtick in response/help from Apple?
I want to be compassionate regarding possible workplace challenges for those at Apple who support developers, but right now, I'm left with a sinking feeling that we've been abandoned.
I'm still getting user reports of missing data and feel pretty certain this is the issue - though I still can't reproduce it reliably.
Some longtime users are having their data completely disappear with no new usage patterns, which causes them to get pretty upset with me - the only other entity they imagine could be at fault.
Can anyone relay reliable guidance on whether this is a bug or a feature? I'm at a standstill with several agitated users.
Giving this a bump. I'm still unable to get currentPlaybackTime from a track playing in MPMusicPlayerController while it's playing. Once it's paused, the correct time is returned, but once playback starts again, the time continues to be reported as the time of the previous pause. Here's the code I use in an iOS app running on an M1 Mac. Does anyone have any advice, or is anyone else having the same trouble?
NSLog(@"current playtime is %f", [MPMusicPlayerController systemMusicPlayer].currentPlaybackTime);
Maybe related to "prewarm" mentioned here:
https://developer.apple.com/forums/thread/667959
I have a mature app - more than ten years on the App Store - that reads a file containing user data, using NSFileManager during application:didFinishLaunchingWithOptions:. In the last few months, I've had several users report that their data just disappears. I've made no changes to any code that touches the data, so I was perplexed.
This seems a likely culprit. Is anyone else experiencing lost data from a file that is read during didFinishLaunchingWithOptions:? I imagine that if prewarming calls didFinish... while the file system is not yet available, my code assumes it's a first run and creates an empty user data file. Yikes.
Any idea of how to get this fixed?
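In the meantime, here's the kind of guard I'm considering for the launch path (a minimal sketch; the file name is a placeholder, and the assumption that the file is unreadable while the device is locked is mine):
- (BOOL)application:(UIApplication *)application didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
{
    NSURL *docs = [[[NSFileManager defaultManager] URLsForDirectory:NSDocumentDirectory
                                                          inDomains:NSUserDomainMask] firstObject];
    NSURL *dataURL = [docs URLByAppendingPathComponent:@"UserData.plist"]; // placeholder file name

    BOOL fileExists = [[NSFileManager defaultManager] fileExistsAtPath:dataURL.path];

    if (!fileExists && ![application isProtectedDataAvailable])
    {
        // The file may simply be unreadable because the device is still locked
        // (e.g. a prewarmed launch), so don't treat this as a first run.
        NSLog(@"Data file not visible and protected data unavailable - deferring first-run setup");
    }
    else if (!fileExists)
    {
        // Genuine first run: create the empty user data file.
        [@{} writeToURL:dataURL atomically:YES];
    }
    return YES;
}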
I'm having the same trouble. One more clue for the Media Engineer (thanks in advance for your help!):
The "'App Name' would like to access Apple Music, your music and video activity, and your media library" dialog is presented each time the app starts on the Mac M1. In iOS, the Settings app allows access to this authorization (Settings App Name Media & Apple Music = ON) which persists the answer after the first run of the app on the device.
I suspect the authorization given when the dialog is presented does not persist on the Mac M1.
Hope this helps with an answer that helps us all.
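For reference, this is roughly how I check the media library authorization state before touching the player (a sketch; nothing Mac-specific here):
if ([MPMediaLibrary authorizationStatus] != MPMediaLibraryAuthorizationStatusAuthorized)
{
    [MPMediaLibrary requestAuthorization:^(MPMediaLibraryAuthorizationStatus status) {
        NSLog(@"MPMediaLibrary requestAuthorization returned %ld", (long)status);
        // On iOS the answer sticks after the first run; on the Mac M1 the dialog
        // reappears on every launch, which is the behavior described above.
    }];
}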
Also, I'm not able to play an Apple Music track in the MPMusicPlayerController in an app on the Mac M1. And currentPlaybackTime returns a zero even when the music player is playing a local track.
I mention these in case they are all related.
iOS 14.4 RC is out. This bug is still present. I'm going to remove the feature from my apps that used this API. Not sure what's going on at the Bug Repair Shop, but my reputation as a developer can't wait any longer.
For those in the same boat, here's what I'm doing to replace the basic function, inspired by pochimen's post. It's messy but at least it works. (Something I'm a bit embarrassed to be a part of.)
Using the productID that would have been passed to the black hole formerly known as
[[MPMediaLibrary defaultMediaLibrary] addItemWithProductID:
instead use
[musicPlayer setQueueWithStoreIDs:@[productID]];
[musicPlayer play];
Then wait for the track to start playing by polling
musicPlayer.nowPlayingItem
to be sure it's set,
musicPlayer.nowPlayingItem.playbackDuration
to be sure it has a value, and
musicPlayer.currentPlaybackTime > 0.1
to be sure the item has actually started playing.
(I know this seems like overkill, but I've seen cases where one or two of these conditions are met while the other(s) still don't return a value.)
Once all of the above are met, find or create a playlist for my app and add the track with
[playlist addMediaItems:@[nowPlayingMediaItem] completionHandler:
Not too pretty, but gets the job done. Kinda like me ;-)
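Put together, the whole thing looks roughly like this (a sketch, not my production code; the one-second poll and the playlist ivar, assumed to be the MPMediaPlaylist found or created earlier, are my assumptions):
- (void)addProductIDToPlaylist:(NSString *)productID
{
    MPMusicPlayerController *player = [MPMusicPlayerController applicationMusicPlayer];
    [player setQueueWithStoreIDs:@[productID]];
    [player play];

    // Poll once per second until all three "really playing" conditions are met.
    [NSTimer scheduledTimerWithTimeInterval:1.0 repeats:YES block:^(NSTimer *timer) {
        MPMediaItem *item = player.nowPlayingItem;
        if (item != nil && item.playbackDuration > 0 && player.currentPlaybackTime > 0.1)
        {
            [timer invalidate];
            // playlist is the MPMediaPlaylist found or created earlier for my app.
            [playlist addMediaItems:@[item] completionHandler:^(NSError *error) {
                NSLog(@"addMediaItems finished, error = %@", error);
            }];
        }
    }];
}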
I noticed no one has replied so I thought I'd offer my thoughts. I suspect this is related to this bug:
https://developer.apple.com/forums/thread/658557
that causes a similar freeze.
If it's related, good news: you're not alone || bad news: it's been an outstanding bug for over 4 months with no guidance or help from Apple on when / if it will be fixed.
Sorry to be the bearer of bad news, but I know how tough it is to ask a question and get no response.