Since iOS 12 it has become difficult to detect the end of playback using the system music player.
In earlier iOS versions, the now-playing item would be set to nil and you would receive a notification that the player had stopped.
In iOS 12 and later, nowPlayingItem still contains the current song, and the only notification you get is MPMusicPlayerControllerPlaybackStateDidChangeNotification with the playbackState set to MPMusicPlaybackStatePaused.
Pressing pause in my car (or from any other remote control) produces exactly the same conditions, making it difficult to tell the two cases apart.
It would be nice if Apple added a notification that playback finished (similar to the other players).
Any suggestions?
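One workaround is a heuristic, shown in the minimal sketch below: when the state changes to paused, compare currentPlaybackTime against the item's duration, since a pause triggered by the queue running out usually lands at (or very near) the end of the track, while a manual or remote pause usually does not. The one-second threshold is an arbitrary assumption, not a documented contract.

import MediaPlayer

let player = MPMusicPlayerController.systemMusicPlayer
player.beginGeneratingPlaybackNotifications()

NotificationCenter.default.addObserver(
    forName: .MPMusicPlayerControllerPlaybackStateDidChange,
    object: player,
    queue: .main
) { _ in
    guard player.playbackState == .paused,
          let item = player.nowPlayingItem else { return }
    // Heuristic: a pause within ~1 second of the end of the current
    // item is treated as "playback finished" rather than a user pause.
    let remaining = item.playbackDuration - player.currentPlaybackTime
    if remaining < 1.0 {
        print("Playback most likely finished")
    }
}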
I'm using the systemMusicPlayer to play music and want to update the playback time using KVO (addObserver:forKeyPath:).
[self setMusicPlayer: [MPMusicPlayerController systemMusicPlayer]];
I've tried these two methods:
[self addObserver:self forKeyPath:@"musicPlayer.currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
[self.musicPlayer addObserver:self forKeyPath:@"currentPlaybackTime" options:NSKeyValueObservingOptionNew | NSKeyValueObservingOptionInitial context:&musicPlayer];
I do get the initial values for currentPlaybackTime in:
-(void)observeValueForKeyPath:(NSString *)keyPath ofObject:(id)object change:(NSDictionary *)change context:(void *)context
but I never get any calls when the player is playing the song (the whole point).
If I set currentPlaybackTime to a specific value (seeking manually using a slider), I get calls with the values I set (useless, since I already know what I'm setting them to).
How are we supposed to track the playback time without just polling it constantly?
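For what it's worth, currentPlaybackTime is not documented as KVO-compliant, so observing it is not guaranteed to fire during playback. A common workaround is simply a timer that polls the property; a minimal sketch in Swift, with an arbitrary half-second interval:

import MediaPlayer

final class PlaybackTimeMonitor {
    private let player = MPMusicPlayerController.systemMusicPlayer
    private var timer: Timer?

    func start() {
        // Poll every half second; only report while actually playing.
        timer = Timer.scheduledTimer(withTimeInterval: 0.5, repeats: true) { [weak self] _ in
            guard let self, self.player.playbackState == .playing else { return }
            print("Playback time: \(self.player.currentPlaybackTime)")
        }
    }

    func stop() {
        timer?.invalidate()
        timer = nil
    }
}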
Hi everyone, I am having a problem with AVPlayer when I try to play some videos.
The video starts for a few seconds, but immediately after I see a black screen and in the console there is the following error:
<__NSArrayM 0x14dbf9f30>(
{
StreamPlaylistError = "-12314";
comment = "have audio audio-aacl-54 in STREAMINF without EXT-X-MEDIA audio group";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
},
{
"c-conn-type" = 1;
"c-severity" = 2;
comment = "Playlist parse error";
"cs-guid" = "871C1871-D566-4A3A-8465-2C58FDC18A19";
date = "2024-05-13 20:46:19 +0000";
domain = CoreMediaErrorDomain;
status = "-12642";
uri = "http://127.0.0.1:8080/master.m3u8";
}
)
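The first entry points at the master playlist: a variant's EXT-X-STREAM-INF references an audio group (audio-aacl-54) that has no matching EXT-X-MEDIA tag. Assuming that diagnosis is right, the playlist would need a declaration along these lines (the URIs, bandwidth, and codec strings below are placeholders, not values from the failing stream):

#EXTM3U
#EXT-X-MEDIA:TYPE=AUDIO,GROUP-ID="audio-aacl-54",NAME="English",DEFAULT=YES,AUTOSELECT=YES,URI="audio/prog_index.m3u8"
#EXT-X-STREAM-INF:BANDWIDTH=2000000,CODECS="avc1.64001f,mp4a.40.2",AUDIO="audio-aacl-54"
video/prog_index.m3u8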
Hi,
I just generated an HDR10 MV-HEVC file; the mediainfo output is below:
Color range : Limited
Color primaries : BT.2020
Transfer characteristics : PQ
Matrix coefficients : BT.2020 non-constant
Codec configuration box : hvcC+lhvC
Then I generated the segment files with the command below:
mediafilesegmenter --iso-fragmented -t 4 -f av_1 av_new_1.mov
Then I uploaded the segment files and prog_index.m3u8 to a web server.
I found that Safari cannot play the HLS stream...
The URL is http://ip/vod/prog_index.m3u8.
I also checked that if I remove the Transfer characteristics: PQ tag when generating the MV-HEVC file, run the same mediafilesegmenter command, and upload the files to the web server, the new version of the HLS stream does play in Safari...
Is there any way to play HLS PQ video in Safari? Thanks.
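One thing worth trying, offered as a guess rather than a confirmed fix: serve the stream through a master playlist that explicitly declares the dynamic range via the VIDEO-RANGE attribute, which the HLS specification defines for EXT-X-STREAM-INF (values SDR, HLG, PQ). The bandwidth and codec strings here are placeholders:

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=20000000,CODECS="hvc1.2.4.L153.B0",VIDEO-RANGE=PQ
prog_index.m3u8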
Hi,
I just found that I cannot play the 3D movie stream (MV-HEVC/UHD/Dolby Vision/Dolby Digital/Dolby Atmos) from https://developer.apple.com/streaming/examples.
When I click View 3D example (fMP4), I get the issue below. My macOS is 14.4.1 on an M2 chip.
I am using MusicKit ApplicationMusicPlayer to play music in my app. Everything works fine as long as I'm not playing large playlists that contain hundreds of songs. When I try to play a collection of more than roughly 300 songs, I always get an error message saying:
"Prepare to play failed" UserInfo={NSDebugDescription=Prepare to play failed, NSUnderlyingError=0x121d42dc0 {Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Remote call timed out" UserInfo={NSDebugDescription=Remote call timed out}}}))
It doesn't matter if songs are downloaded to the device or not.
I am aware that there is another initializer for the player's queue that accepts Playlist instances, but in my app users can choose to sort playlist tracks in a different order than the default, and that makes that initializer infeasible for me.
I tried everything I could think of, including falling back on MPMusicPlayerController and passing an array of MPMusicPlayerPlayParameters to it, but the result was the same.
typealias QueueEntry = ApplicationMusicPlayer.Queue.Entry

let player = ApplicationMusicPlayer.shared

let entries: [QueueEntry] = tracks
    .compactMap {
        guard let song = $0 as? Song else { return nil }
        return QueueEntry(song)
    }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(entries, startingAt: nil)
        try await player.play() // prepareToPlay failed
    } catch {
        print(error)
    }
}
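One workaround that may be worth trying, sketched here under the assumption that the timeout comes from handing the whole queue to the player process at once: start playback with a small first chunk, then append the rest with the queue's insert(_:position:) API. The chunk size of 100 is an arbitrary guess.

let songs: [Song] = tracks.compactMap { $0 as? Song }
let firstChunk = songs.prefix(100).map { QueueEntry($0) }

Task(priority: .high) { [player] in
    do {
        player.queue = .init(firstChunk, startingAt: nil)
        try await player.play()
        // Append the remaining songs in batches once playback has started.
        let remainder = Array(songs.dropFirst(100))
        for start in stride(from: 0, to: remainder.count, by: 100) {
            let batch = Array(remainder[start..<min(start + 100, remainder.count)])
            try await player.queue.insert(batch, position: .tail)
        }
    } catch {
        print(error)
    }
}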
I am using the command line below to generate AES-128 HLS:
mediafilesegmenter -iso-fragmented --encrypt-key-file=my.key -S -f /Volumes/Samsung/pattern/vision_pro/hls/*** /Volumes/Samsung/pattern/vision_pro/***.mov
but it always generates SAMPLE-AES, even after I remove -S:
#EXT-X-KEY:METHOD=SAMPLE-AES,URI="enc.key",IV=0x7316166d6a85f56f3d4606eaebc3aa44
How can I generate AES-128 HLS? Thanks.
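One possible explanation, offered as an assumption rather than confirmed tool behavior: full-segment AES-128 encryption is defined for MPEG-TS segments, while fragmented MP4 output is generally encrypted with SAMPLE-AES, so the --iso-fragmented flag may be what forces SAMPLE-AES here. If TS segments are acceptable, a command along these lines (paths shortened, flags reused from the commands above) might produce METHOD=AES-128:

mediafilesegmenter -t 4 --encrypt-key-file=my.key -f output_dir input.mov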
Some users of our app get a black screen when playing MP4 files on their phones. The affected users are all running iOS 17.3.1, and we cannot capture any error message when the black screen occurs.
The users can play the videos normally after restarting the phone, but we want to solve this problem in code.
We use two AVPlayers to play multiple videos back to back.
override init(frame: CGRect) {
    super.init(frame: frame)
    layer.addSublayer(videoPlayerA)
    layer.addSublayer(videoPlayerB)
}

lazy var layers = [videoPlayerA, videoPlayerB]

lazy var videoPlayerA: AVPlayerLayer = {
    let avPlayerLayer = AVPlayerLayer(player: AVPlayer())
    avPlayerLayer.contentsScale = UIScreen.main.scale
    avPlayerLayer.videoGravity = .resizeAspectFill
    return avPlayerLayer
}()

lazy var videoPlayerB: AVPlayerLayer = {
    let avPlayerLayer = AVPlayerLayer(player: AVPlayer())
    avPlayerLayer.contentsScale = UIScreen.main.scale
    avPlayerLayer.videoGravity = .resizeAspectFill
    return avPlayerLayer
}()

func play() {
    if let path = paths[safe: currentIndex] {
        currentVideoPath = path
        currentPlayer = AVPlayer(playerItem: AVPlayerItem(url: URL(fileURLWithPath: path)))
        // Pause the layer currently in front, move it behind the other,
        // then hand the new player to the layer now in front.
        layers.first?.player?.pause()
        layer.insertSublayer(layers.first ?? AVPlayerLayer(), below: layers.last)
        layers.swapAt(0, 1)
        layers.first?.player = currentPlayer
        layers.first?.player?.play()
        SHPlayerGlobalStatus.sharedInstance.isLocalPlayerPlaying = true
        try? AVAudioSession.sharedInstance().setCategory(.playback)
        try? AVAudioSession.sharedInstance().setActive(true)
    }
}
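Since a device restart clears the symptom, one hypothesis worth testing is that the media services daemon got into a bad state. AVAudioSession posts mediaServicesWereResetNotification for exactly that situation, and Apple's guidance is to discard and rebuild player objects when it fires. A minimal sketch, assuming a hypothetical rebuildPlayers() helper that recreates both AVPlayerLayer instances:

import AVFoundation

NotificationCenter.default.addObserver(
    forName: AVAudioSession.mediaServicesWereResetNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { _ in
    // AVPlayer/AVPlayerItem instances are invalid after a reset;
    // recreate them rather than reusing the old objects.
    rebuildPlayers() // hypothetical helper that rebuilds videoPlayerA/B
}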
In SwiftUI there is a built-in component for displaying album artwork called ArtworkImage, but there is no equivalent for UIKit.
My current approach is to use the artwork's url(width:height:) method to get the image's URL and then download the image or read it from disk, but the performance is much worse than it was previously with MPMediaItem's artwork.
let artworkQueue = DispatchQueue(
    label: "MusicKit-ArtworkQueue",
    qos: .default,
    attributes: .concurrent
)
let artworkSemaphore = DispatchSemaphore(value: 5)

extension Song {
    func artworkImage(for size: CGSize, completion: @escaping (UIImage?) -> Void) {
        artworkQueue.async {
            artworkSemaphore.wait()
            defer {
                artworkSemaphore.signal()
            }
            let imageURL = artwork?.url(
                width: Int(size.width),
                height: Int(size.height)
            )
            // I hate doing this as it might very well break in the future
            guard let imageURL, imageURL.scheme == "musicKit"
            else {
                return completion(nil)
            }
            guard let imageData = try? Data(contentsOf: imageURL),
                  let image = UIImage(data: imageData) else {
                return completion(nil)
            }
            completion(image)
        }
    }
}
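For completeness, here is how the helper above might be called from a table view cell. The cell and its songID property are hypothetical names for this example; the main-queue hop and the staleness check matter because the completion runs on a background queue and cells get reused:

// Hypothetical call site in tableView(_:cellForRowAt:):
song.artworkImage(for: CGSize(width: 60, height: 60)) { image in
    DispatchQueue.main.async {
        // The cell may have been reused for another row; re-check first.
        guard cell.songID == song.id else { return }
        cell.artworkView.image = image
    }
}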
I really dislike this approach because it feels hacky, but it somewhat works. You might ask what the semaphore is for. Well, without it I noticed that MusicKit was choking after reading too many artworks at once.
Can someone from Apple please provide us with an example on how to use MusicKit with UIKit properly?
Ideally (IMO) we would have a method defined on Song and the other MusicKit structures that returns the image for us, just like MPMediaItem's artwork did. It would make our lives so much easier.
Hello,
Can we use the keys mentioned below in our code?
_MRMediaRemotePlayerPlaybackStateDidChangeNotification
kMRMediaRemotePlaybackStateUserInfoKey
Thanks
I am trying to create an app for a custom Now Playing UI. How can I grab the following:
Song Name
Album Name
Artist Name
Album Art
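If the app has media library authorization, the system player's nowPlayingItem exposes all four of these. A minimal sketch; the 200×200 artwork size is an arbitrary choice:

import MediaPlayer
import UIKit

let player = MPMusicPlayerController.systemMusicPlayer
if let item = player.nowPlayingItem {
    let songName = item.title
    let albumName = item.albumTitle
    let artistName = item.artist
    // Render the artwork at whatever size the UI needs.
    let albumArt = item.artwork?.image(at: CGSize(width: 200, height: 200))
    print(songName ?? "", albumName ?? "", artistName ?? "", albumArt as Any)
}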
I want to create a CPListImageRowItem/CPGridTemplate for an audio app that supports iOS 13 and later. For iOS 14 and above I am able to do that with CPGridTemplate/CPListImageRowItem, but how can I do it on iOS 13? To support the audio app on iOS 13 I am using MPPlayableContentManager. Can this be achieved on iOS 13? See the sketch after this post's details.
The grid should look like the image below.
Please suggest any way to achieve this kind of design.
Thanks in advance.
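As far as I know, on iOS 13 the CarPlay audio UI is driven by MPPlayableContentManager and the system decides how items are rendered, so there is no public way to force a grid layout. For reference, a minimal data source sketch for the tab/list hierarchy it does support; the identifiers, titles, and counts are made up:

import MediaPlayer

final class CarPlayDataSource: NSObject, MPPlayableContentDataSource {
    func numberOfChildItems(at indexPath: IndexPath) -> Int {
        // 4 top-level tabs, 10 items in each; arbitrary example values.
        indexPath.isEmpty ? 4 : 10
    }

    func contentItem(at indexPath: IndexPath) -> MPContentItem? {
        let item = MPContentItem(identifier: "item-\(indexPath)")
        item.title = "Example title"
        item.isContainer = indexPath.count < 2
        item.isPlayable = !item.isContainer
        return item
    }
}

// Somewhere during launch:
// MPPlayableContentManager.shared().dataSource = CarPlayDataSource()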
I'm trying to create a new playlist on the device that appears in Apple Music, and to add a selection of MP3s to that playlist, from within a small iOS app.
Now, the MP3s are either a stream of bytes or a flat file already stored on the device in the app's local storage space (the app itself generates these; they aren't downloaded, they are created in the app and then stored on the local device).
The idea is that the created tracks can show up in a specific playlist on the device.
Now, there appears to be some confusion as to which framework I need to use.
I've found MPMediaLibrary, which appears to allow me to create a playlist using the getPlaylist call, although the documentation on this seems pretty sparse and there aren't a lot of examples I can find on how to use it.
It looks like a UUID is passed in, but there is no documentation on what this UUID is or where it comes from. If I want to create a new playlist, I presume I need to generate a UUID and then store that locally in order to be able to access the playlist again later, yes?
There's an addItem call which looks like it's how you add a track to a playlist, but there's no documentation on how you generate an entry. The documentation for this function talks about a product ID, without describing what the product ID is or where it needs to come from. Is it a GUID? Is it a name or description? Does it have to be unique? I'm assuming this product ID refers to whatever is being added to the playlist, but the documentation is sadly lacking in terms of explaining what it refers to. Is it a media item, or is that what is created when whatever entity the product ID refers to is added to the playlist?
I'm assuming I can create an NSURL for the stored file that is actually the MP3 sample, but what I do with that in order to actually add it as a playlist entry is unknown. I'm sure there is a mechanism to do this; it's just not clear what that is.
There's a lack of explanation of what the process is here, and some illumination would be helpful.
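For the parts that are documented, here is a minimal sketch of creating (or re-fetching) a playlist and adding a catalog item to it. The UUID is one you generate and persist yourself, and the product ID in addItem(withProductID:) is an Apple Music store identifier (the ID below is a placeholder), so whether locally generated MP3 files can be added this way at all remains an open question:

import MediaPlayer

// Generate once, then persist (e.g. in UserDefaults) so the same
// playlist can be retrieved on later launches.
let playlistUUID = UUID()
let metadata = MPMediaPlaylistCreationMetadata(name: "My Generated Tracks")

MPMediaLibrary.default().getPlaylist(with: playlistUUID, creationMetadata: metadata) { playlist, error in
    guard let playlist else {
        print("Playlist error: \(String(describing: error))")
        return
    }
    // "1440857781" is a placeholder Apple Music store/product ID.
    playlist.addItem(withProductID: "1440857781") { error in
        print("Add item result: \(String(describing: error))")
    }
}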
Hello everyone, my name is Joshua Osagie. For two months now I have been trying to build my own music application, but I can't, because I was assuming I would get an API from Apple Music or Spotify that would grant me access to over 100 million songs. Even if the API is paid, I have been doing my research, but it seems kind of impossible. So please, if anyone has an idea of what I can do to bring this application to life, I will really appreciate it. Or if anyone could share an idea of how to get access to millions of songs in my app, I would be really grateful.
When I play a livestream I get the error -12888, "Playlist File unchanged for longer than 1.5 * target duration". I also read about error -12888 on page 170 of the documentation (https://docs.huihoo.com/apple/wwdc/2018/502_measuring_and_optimizing_hls_performance.pdf), but I still don't understand the reason. Can you explain the reason for this error?
Hello everyone,
I was playing a livestream when I received the error -16831, "START-TIME is too close to live", from an AVPlayerItemNewErrorLogEntry notification. I don't know why this error is returned. Can you explain the reason for it?
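Not an answer to the why, but for anyone trying to inspect these entries: the notification itself carries no payload, and the details live in the player item's error log. A minimal sketch, assuming item is the AVPlayerItem being played:

import AVFoundation

NotificationCenter.default.addObserver(
    forName: .AVPlayerItemNewErrorLogEntry,
    object: item,
    queue: .main
) { notification in
    guard let item = notification.object as? AVPlayerItem,
          let event = item.errorLog()?.events.last else { return }
    print("HLS error \(event.errorStatusCode): \(event.errorComment ?? "no comment")")
}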
Hello, I don't run a podcast, so I am not referring to the Apple Podcasts Connect platform, and I have been trying to get in contact with someone at Apple Podcasts. I would like to talk to developer support, or to someone who could consult on how to best approach something I'd like to build as an open-source tool. I listen to a lot of podcasts and would like an analytics dashboard and a toolset for taking notes on the podcasts I listen to in Apple Podcasts. Even just having analytics and access to all of the info would be a good start. I need to be able to plug into an API and pull all of that data from my account. Is there any way I can access this, or someone I can talk to about it? I assume I have a lot of historical data from all of the shows I'm subscribed to, and I would like to visualize all of it. Is this possible? From my research, it seems there is no way to access this information from the Podcasts app. Is there any infrastructure for this?
Running in a Mac (Catalyst) target or on Apple silicon (Designed for iPad), just accessing the playbackStoreID of an MPMediaItem logs this error in the console:
-[ITMediaItem valueForMPMediaEntityProperty:]: Unhandled MPMediaEntityProperty subscriptionStoreItemAdamID.
The value returned is always “”.
This works as expected on iOS and iPadOS, returning a valid playbackStoreID.
import SwiftUI
import MediaPlayer

@main
struct PSIDDemoApp: App {
    var body: some Scene {
        WindowGroup {
            Text("playbackStoreID demo")
                .task {
                    let authResult = await MPMediaLibrary.requestAuthorization()
                    if authResult == .authorized {
                        if let item = MPMediaQuery.songs().items?.first {
                            let persistentID = item.persistentID
                            let playbackStoreID = item.playbackStoreID // <--- Here
                            print("Item \(persistentID), \(playbackStoreID)")
                        }
                    }
                }
        }
    }
}
Xcode 15.1, also tested with Xcode 15.3 beta 2.
macOS Sonoma 14.3.1
FB13607631
How can I submit my idea for improving the phone?