I'm getting a variety of errors when I call prepareToPlay on the MPMusicPlayerController. Sometimes they happen, sometimes they don't. I'm trying to play songs from the Apple Music service. When I don't get the errors, it plays just fine. I have iOS v13.5.1 on my iPhone Xs and I'm using Xcode 11.5. This is my code:
let applicationMusicPlayer = MPMusicPlayerController.applicationMusicPlayer
applicationMusicPlayer.setQueue(with: [trackID])
applicationMusicPlayer.prepareToPlay(completionHandler: { error in
    if let error = error {
        print(error.localizedDescription)
        return
    }
    DispatchQueue.main.async {
        applicationMusicPlayer.play()
    }
})
These are the various errors I'm getting:
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=2 "Queue was interrupted by another queue" UserInfo={NSDebugDescription=Queue was interrupted by another queue}
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=9 "Preparing queue timed out" UserInfo={NSDebugDescription=Preparing queue timed out}
[SDKPlayback] Failed to prepareToPlay error: Error Domain=MPMusicPlayerControllerErrorDomain Code=6 "Failed to prepare to play" UserInfo={NSDebugDescription=Failed to prepare to play}
[SDKPlayback] applicationQueuePlayer _establishConnectionIfNeeded timeout [ping did not pong]
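Since the failures are intermittent, I've been experimenting with a simple retry wrapper; this is a sketch of my own workaround, not anything documented:
import MediaPlayer
// Hypothetical retry helper (my own experiment): retry prepareToPlay a few
// times before giving up, then start playback on success.
func prepareWithRetry(_ player: MPMusicPlayerController, attempts: Int = 3) {
    player.prepareToPlay { error in
        guard let error = error else {
            DispatchQueue.main.async { player.play() }
            return
        }
        print("prepareToPlay failed: \(error.localizedDescription)")
        if attempts > 1 {
            DispatchQueue.main.asyncAfter(deadline: .now() + 1) {
                prepareWithRetry(player, attempts: attempts - 1)
            }
        }
    }
}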
Hello there!
I am trying to use PHPickerViewController to load videos, but I've hit a problem: I can load some videos, but not all.
I referred to the existing thread - https://developer.apple.com/forums/thread/652695, but it doesn't work.
This is the code I use to present the PHPickerViewController:
var config = PHPickerConfiguration()
config.selectionLimit = 1
config.filter = .videos
config.preferredAssetRepresentationMode = .current
let picker = PHPickerViewController(configuration: config)
picker.delegate = self
present(picker, animated: true, completion: nil)
Below is the relevant implementation of the method: func picker(_ picker: PHPickerViewController, didFinishPicking results: [PHPickerResult]):
picker.dismiss(animated: true, completion: nil)
for result in results {
    result.itemProvider.loadFileRepresentation(forTypeIdentifier: UTType.movie.identifier) { (url, error) in
        if let error = error {
            print(error)
            return
        }
        guard let url = url else { return }
        let fileName = "\(Date().timeIntervalSince1970).\(url.pathExtension)"
        let newUrl = URL(fileURLWithPath: NSTemporaryDirectory() + fileName)
        try? FileManager.default.copyItem(at: url, to: newUrl)
        DispatchQueue.main.async {
            self.playVideo(newUrl)
        }
    }
}
Before the print(error) call above runs, Xcode printed 3 lines of errors:
[AXRuntimeCommon] Unknown client: TestPHPicker
[default] [ERROR] Could not create a bookmark: NSError: Cocoa 257 "The file couldn’t be opened because you don’t have permission to view it." }
Error copying file type public.movie. Error: Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}
And this is the error I print:
Error Domain=NSItemProviderErrorDomain Code=-1000 "Cannot load representation of type public.movie" UserInfo={NSLocalizedDescription=Cannot load representation of type public.movie, NSUnderlyingError=0x283a4a610 {Error Domain=NSCocoaErrorDomain Code=4101 "Couldn’t communicate with a helper application." UserInfo={NSUnderlyingError=0x283a48b10 {Error Domain=PHAssetExportRequestErrorDomain Code=2 "(null)" UserInfo={NSUnderlyingError=0x283a4a550 {Error Domain=CloudPhotoLibraryErrorDomain Code=82 "Failed to download CPLResourceTypeOriginal" UserInfo=0x28219b300 (not displayed)}}}}}}
Some videos load successfully, but for others I get the error. I don't know why this happens.
I am testing on an iPhone X running iOS 14.0 (18A373), with Xcode 12.0 (12A7209).
Thanks for your help!
Hi,
for the implementation of an audio player with signed URLs, I need to be able to set an authorization header on the request for an AVURLAsset.
This works, but not over AirPlay when trying to stream multiple songs in a queue.
For each item I do:
let headerFields: [String: String] = ["Authorization": getIdToken()!]
super.init(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headerFields])
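For fuller context, the snippet above lives in a custom AVURLAsset subclass, roughly like this (a sketch; the class name is mine, and getIdToken() is the same helper used above):
import AVFoundation
// Sketch of the subclass each queue item's asset comes from.
// "AVURLAssetHTTPHeaderFieldsKey" is the same undocumented option key as above.
final class AuthorizedAsset: AVURLAsset {
    init(url: URL, token: String) {
        let headerFields: [String: String] = ["Authorization": token]
        super.init(url: url, options: ["AVURLAssetHTTPHeaderFieldsKey": headerFields])
    }
}
// Each queue entry is then built as:
// let item = AVPlayerItem(asset: AuthorizedAsset(url: songURL, token: getIdToken()!))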
But only the first 2 songs in the queue actually get this authorization header sent along; somehow it is removed for subsequent songs.
Any ideas on how I can fix this?
thanks,
Thomas
I have a USB audio interface that is causing kernel traps and the audio output to "skip" or drop out every few seconds. This behavior occurs with a completely fresh install of Catalina as well as Big Sur with the stock Music app on a 2019 MacBook Pro 16 (full specs below).
The Console logs show coreaudiod got an error from a kernel trap, a "USB Sound assertion" in AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644, and the Music app "skipping cycle due to overload."
I've added a short snippet from Console logs around the time of the audio skip/drop out. The more complete logs are at this gist:
https://gist.github.com/djflux/08d9007e2146884e6df1741770de5105
I've also opened a Feedback Assistant ticket (FB9037528):
https://feedbackassistant.apple.com/feedback/9037528
Does anyone know what could be causing this issue?
Thanks for any help.
Cheers,
Flux aka Andy.
Hardware Overview:
Model Name: MacBook Pro
Model Identifier: MacBookPro16,1
Processor Name: 8-Core Intel Core i9
Processor Speed: 2.4 GHz
Number of Processors: 1
Total Number of Cores: 8
L2 Cache (per Core): 256 KB
L3 Cache: 16 MB
Hyper-Threading Technology: Enabled
Memory: 64 GB
System Firmware Version: 1554.80.3.0.0 (iBridge: 18.16.14347.0.0,0)
System Software Overview:
System Version: macOS 11.2.3 (20D91)
Kernel Version: Darwin 20.3.0
Boot Volume: Macintosh HD
Boot Mode: Normal
Computer Name: mycomputername
User Name: myusername
Secure Virtual Memory: Enabled
System Integrity Protection: Enabled
USB interface: Denon DJ DS1
Snippet of Console logs
error 21:07:04.848721-0500 coreaudiod HALS_IOA1Engine::EndWriting: got an error from the kernel trap, Error: 0xE00002D7
default 21:07:04.848855-0500 Music HALC_ProxyIOContext::IOWorkLoop: skipping cycle due to overload
default 21:07:04.857903-0500 kernel USB Sound assertion (Resetting engine due to error returned in Read Handler) in /AppleInternal/BuildRoot/Library/Caches/com.apple.xbs/Sources/AppleUSBAudio/AppleUSBAudio-401.4/KEXT/AppleUSBAudioDevice.cpp at line 6644
...
default 21:07:05.102746-0500 coreaudiod Audio IO Overload inputs: 'private' outputs: 'private' cause: 'Unknown' prewarming: no recovering: no
default 21:07:05.102926-0500 coreaudiod CAReportingClient.mm:508 message {
HostApplicationDisplayID = "com.apple.Music";
cause = Unknown;
deadline = 2615019;
"input_device_source_list" = Unknown;
"input_device_transport_list" = USB;
"input_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
"io_buffer_size" = 512;
"io_cycle" = 1;
"is_prewarming" = 0;
"is_recovering" = 0;
"issue_type" = overload;
lateness = "-535";
"output_device_source_list" = Unknown;
"output_device_transport_list" = USB;
"output_device_uid_list" = "AppleUSBAudioEngine:Denon DJ:DS1:000:1,2";
}: (null)
The MusicKit video states that you just enable "MusicKit" in your application identifier and "you're done!"
Ok, so I did that, and I'm seeing the following error when trying to run a song query:
[DataRequesting] Failed retrieving MusicKit tokens: Error Domain=ICErrorDomain Code=-8200 "Media API Token Service's response was invalid (status code: Unauthorized (401))." UserInfo={NSDebugDescription=Media API Token Service's response was invalid (status code: Unauthorized (401))., NSUnderlyingError=0x6000023a0c60 {Error Domain=AMSErrorDomain Code=301 "Invalid Status Code" UserInfo={NSLocalizedDescription=Invalid Status Code, AMSURL=https://sf-api-token-service.itunes.apple.com/apiToken?REDACTED, AMSStatusCode=401, AMSServerPayload={
status = verificationFailure;
}, NSLocalizedFailureReason=The response has an invalid status code}}}. Throwing .developerTokenRequestFailed.
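For reference, the query itself is nothing exotic; this is a sketch of roughly what I'm running (the search term is just an example):
import MusicKit
// Minimal catalog search that triggers the token error above.
let request = MusicCatalogSearchRequest(term: "example", types: [Song.self])
let response = try await request.response()
print(response.songs)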
Is this just broken on Apple's side? Is there some other magic string that needs to be added to the plist other than NSAppleMusicUsageDescription?
Hey there!
I'm trying to use MusicDataRequest to fetch the contents of a user's library.
Most of the documented endpoints I've tried seem to be working as expected, but the /me/library/artists and /me/library/albums endpoints are consistently giving me a 500 Upstream Service Error.
Here's an example of my code, and the resulting error:
let url = URL(string: "https://api.music.apple.com/v1/me/library/albums")!
let request = MusicDataRequest(urlRequest: URLRequest(url: url))
do {
    let response = try await request.response()
    let string = String(data: response.data, encoding: .utf8)!
    print("success: \(string)")
} catch {
    print("error: \(error)")
}
MusicDataRequest.Error(
status: 500,
code: 50001,
title: "Upstream Service Error",
detailText: "Error fetching library content",
id: "5OFXMJAGNU2WCTDKNAYYP4BJXI",
originalResponse: MusicDataResponse(
data: 153 bytes,
urlResponse: <NSHTTPURLResponse: 0x0000000280f04dc0>
)
)
If I replace /albums with /songs or /playlists in the above code everything works as expected. Is there something I'm missing from the albums and artists requests? Or is this a bug with the API?
I'm slowly learning the new MusicKit beta for Swift.
I've learned to successfully retrieve tracks of type Song using MusicDataRequest, using the following:
...
let countryCode = try await MusicDataRequest.currentCountryCode
if let url = URL(string: "https://api.music.apple.com/v1/catalog/\(countryCode)/songs?filter[isrc]=\(isrc)") {
    let dataRequest = MusicDataRequest(urlRequest: URLRequest(url: url))
    let dataResponse = try await dataRequest.response()
    ...
However, when I decode the data, there does not seem to be any album information that I can see.
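For reference, I'm decoding roughly like this (a sketch; MusicItemCollection is Decodable and handles the response's data wrapper):
import Foundation
import MusicKit
// Decode the raw MusicDataResponse payload into Song items.
let songs = try JSONDecoder().decode(MusicItemCollection<Song>.self, from: dataResponse.data)
print(songs.first as Any)   // no album information visible here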
I've tried adding includes=albums to the URL, but I don't think that's the right approach, because when I view the Song struct in MusicKit, I don't see a reference to an Album type anywhere.
Any advice on how to retrieve the album information would be most appreciated.
Thanks.
For PHPickerViewController, we know we can perform simple filtering by
var config = PHPickerConfiguration()
config.filter = PHPickerFilter.images
But what if we only want to show images in JPG and PNG format, excluding GIF, because our app doesn't support GIF?
Is it possible to do so?
I really love Quartz Composer from Apple, which is a quite old app that hasn't been updated for years. It works well on my mid-2015 MacBook Pro, but not on a new M1 iMac. Does anyone know how to run this great app on my new machine? Thank you!
Our app hits a weird problem in the released version: more and more users get error 561145187 when we call this code:
AudioQueueNewInput(&self->_recordFormat, inputBufferHandler, (__bridge void *)(self), NULL, NULL, 0, &self->_audioQueue)
I've searched for several weeks, but nothing has helped.
Summing up all the affected devices, we found some similarities:
it only happens on iPadOS 14.0+
it occurs when the app starts or wakes from the background (we call the code when the app receives "UIApplicationDidBecomeActiveNotification")
Any idea why this happens?
I’m using AVAudioEngine to get a stream of AVAudioPCMBuffers from the device’s microphone using the usual installTap(onBus:) setup.
To distribute the audio stream to other parts of the program, I’m sending the buffers to a Combine publisher similar to the following:
private let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>()
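The tap feeding that subject looks roughly like this (a sketch; bus 0, the buffer size, and the engine setup are assumptions, not necessarily what the real code uses):
import AVFoundation
import Combine
final class MicStream {
    let publisher = PassthroughSubject<AVAudioPCMBuffer, Never>()
    private let engine = AVAudioEngine()

    func start() throws {
        let input = engine.inputNode
        let format = input.outputFormat(forBus: 0)
        input.installTap(onBus: 0, bufferSize: 1024, format: format) { [weak self] buffer, _ in
            self?.publisher.send(buffer)   // buffers are only read downstream, never mutated
        }
        try engine.start()
    }
}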
I’m starting to suspect I have some kind of concurrency or memory management issue with the buffers, because when consuming the buffers elsewhere I’m getting a range of crashes that suggest some internal pointer in a buffer is NULL (specifically, I’m seeing crashes in vDSP.convertElements(of:to:) when I try to read samples from the buffer).
These crashes are in production and fairly rare — I can’t reproduce them locally.
I never modify the audio buffers, only read them for analysis.
My question is: should it be possible to put AVAudioPCMBuffers into a Combine pipeline? Does the AVAudioPCMBuffer class not retain/release the underlying AudioBufferList’s memory the way I’m assuming? Is this a fundamentally flawed approach?
Is there a way to get the current queue items from an MPMusicPlayerController?
I need to know when the items I've set to the queue finish playing completely but cannot find any way to do this.
I am not using MusicKit but setting the queue via play parameters.
From what I can tell so far, after the queue finishes playing, it pauses and resets to the first item in the queue. So even after playback is done, there is no way to know that it finished on its own.
Hello,
I am wondering how one can play music videos (with the actual video playing) with the ApplicationMusicPlayer using MusicKit for Swift?
There is not much documentation on this, so any help would be appreciated.
I know that if you want background audio from AVPlayer, you need to detach your AVPlayer from either your AVPlayerViewController or your AVPlayerLayer, in addition to having your AVAudioSession configured correctly.
I have that all squared away and background audio is fine until we introduce AVPictureInPictureController or use the PiP behavior baked into AVPlayerViewController.
If you want PiP to behave as expected when you put your app into the background by switching to another app or going to the home screen, you can't perform the detachment operation; otherwise the PiP display fails.
On an iPad if PiP is active and you lock your device you continue to get background audio playback. However on an iPhone if PiP is active and you lock the device the audio pauses.
However if PiP is inactive and you lock the device the audio will pause and you have to manually tap play on the lockscreen controls. This is the same between iPad and iPhone devices.
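For context, the audio session setup I'm treating as "squared away" is the standard playback configuration, roughly (a sketch; the mode is an assumption):
import AVFoundation
// .playback keeps audio running in the background, paired with the "audio"
// background mode in the target's capabilities.
func configureAudioSession() throws {
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playback, mode: .moviePlayback)
    try session.setActive(true)
}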
My questions are:
Is there a way to keep background-audio playback going when PiP is inactive and the device is locked? (iPhone and iPad)
Is there a way to keep background-audio playback going when PiP is active and the device is locked? (iPhone)
CVPixelBuffer.h defines
kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange = '420v', /* Bi-Planar Component Y'CbCr 8-bit 4:2:0, video-range (luma=[16,235] chroma=[16,240]). baseAddr points to a big-endian CVPlanarPixelBufferInfo_YCbCrBiPlanar struct */
kCVPixelFormatType_420YpCbCr10BiPlanarVideoRange = 'x420', /* 2 plane YCbCr10 4:2:0, each 10 bits in the MSBs of 16bits, video-range (luma=[64,940] chroma=[64,960]) */
But when I set either of the above formats for the camera output, I find the output pixel buffer's values exceed the stated range: I see [0, 255] for 420YpCbCr8BiPlanarVideoRange and [0, 1023] for 420YpCbCr10BiPlanarVideoRange.
Is this a bug, or is something wrong with my output? If not, how can I choose the correct matrix to convert the YUV data to RGB?
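For what it's worth, this is how I'm checking what color metadata the buffers actually carry (a sketch):
import CoreVideo
// Inspect the color metadata attached to a CVPixelBuffer to help pick a
// YUV-to-RGB matrix. Note that video-range vs. full-range is encoded in the
// pixel format itself ('420v' vs. '420f'), not in an attachment.
func logColorMetadata(_ pixelBuffer: CVPixelBuffer) {
    if let matrix = CVBufferGetAttachment(pixelBuffer, kCVImageBufferYCbCrMatrixKey, nil) {
        print("YCbCr matrix:", matrix.takeUnretainedValue())   // e.g. ITU_R_709_2
    }
    print("pixel format:", CVPixelBufferGetPixelFormatType(pixelBuffer))
}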
I receive a buffer from [AVSpeechSynthesizer convertToBuffer:fromBuffer:] and want to schedule it on an AVAudioPlayerNode.
The player node's output format needs to be something that the next node can handle, and as far as I understand most nodes can handle a canonical format.
The format provided by AVSpeechSynthesizer is not something that AVAudioMixerNode supports.
So the following:
AVAudioEngine *engine = [[AVAudioEngine alloc] init];
self.playerNode = [[AVAudioPlayerNode alloc] init];
AVAudioFormat *format = [[AVAudioFormat alloc] initWithSettings:utterance.voice.audioFileSettings];

[engine attachNode:self.playerNode];
[engine connect:self.playerNode to:engine.mainMixerNode format:format];
Throws an exception:
Thread 1: "[[busArray objectAtIndexedSubscript:(NSUInteger)element] setFormat:format error:&nsErr]: returned false, error Error Domain=NSOSStatusErrorDomain Code=-10868 \"(null)\""
I am looking for a way to obtain the canonical format for the platform so that I can use AVAudioConverter to convert the buffer.
Since different platforms have different canonical formats, I imagine there should be some library way of doing this. Otherwise each developer will have to redefine it for each platform the code runs on (macOS, iOS, etc.) and keep it updated when it changes.
I could not find any constant or function which can make such format, ASDB or settings.
The smartest way I could think of, which does not work:
AudioStreamBasicDescription toDesc;
FillOutASBDForLPCM(toDesc, [AVAudioSession sharedInstance].sampleRate,
2, 16, 16, kAudioFormatFlagIsFloat, kAudioFormatFlagsNativeEndian);
AVAudioFormat *toFormat = [[AVAudioFormat alloc] initWithStreamDescription:&toDesc];
Even the provided example for iPhone, in the documentation linked above, uses kAudioFormatFlagsAudioUnitCanonical and AudioUnitSampleType, which are deprecated.
So what is the correct way to do this?
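The closest thing I've found so far is AVAudioFormat's standard-format initializer (deinterleaved Float32), though whether "standard" is formally the platform's canonical format is my assumption; a Swift sketch:
import AVFoundation
// Build a converter from the synthesizer's output format to the "standard"
// (deinterleaved Float32) format, which AVAudioEngine's mixer accepts.
func makeConverter(from sourceFormat: AVAudioFormat) -> AVAudioConverter? {
    let sampleRate = AVAudioSession.sharedInstance().sampleRate
    guard let target = AVAudioFormat(standardFormatWithSampleRate: sampleRate,
                                     channels: 2) else { return nil }
    return AVAudioConverter(from: sourceFormat, to: target)
}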
The sample code in the Apple documentation for PHCloudIdentifier does not compile in Xcode 13.2.1.
Could the interface for identifier conversion be clarified so that the returned values are more accessible/readable? The values are 'hidden' inside a Result enum.
It was difficult (for me) to rewrite the sample code because I made the mistake of interpreting the Result type as a tuple; Result is really an enum.
Using the Result type returned from library.cloudIdentifierMappings(forLocalIdentifiers:) and localIdentifierMappings(for:) puts the actual mapped identifiers inside the enum, where they need additional access via a .stringValue message or by evaluating an element of the Result.
For others finding the same compile issue, here is a working version of the sample code, which compiles in Xcode 13.2.1.
func localId2CloudId(localIdentifiers: [String]) -> [String] {
    var mappedIdentifiers = [String]()
    let library = PHPhotoLibrary.shared()
    let iCloudIDs = library.cloudIdentifierMappings(forLocalIdentifiers: localIdentifiers)
    for aCloudID in iCloudIDs {
        // Result is an enum .. not a tuple
        let cloudResult: Result<PHCloudIdentifier, Error> = aCloudID.value
        switch cloudResult {
        case .success(let success):
            let newValue = success.stringValue
            mappedIdentifiers.append(newValue)
        case .failure(let failure):
            // Notify the user of the error; note that a case body can't be empty.
            print("Failed to map local identifier: \(failure)")
        }
    }
    return mappedIdentifiers
}
func cloudId2LocalId(assetCloudIdentifiers: [PHCloudIdentifier]) -> [String] {
    // Patterned error handling per the documentation.
    var localIDs = [String]()
    let localIdentifiers: [PHCloudIdentifier: Result<String, Error>] =
        PHPhotoLibrary.shared().localIdentifierMappings(for: assetCloudIdentifiers)
    for cloudIdentifier in assetCloudIdentifiers {
        guard let identifierMapping = localIdentifiers[cloudIdentifier] else {
            print("Failed to find a mapping for \(cloudIdentifier).")
            continue
        }
        switch identifierMapping {
        case .success(let success):
            localIDs.append(success)
        case .failure(let failure):
            let thisError = failure as? PHPhotosError
            switch thisError?.code {
            case .identifierNotFound:
                // Skip the missing or deleted assets.
                print("Failed to find the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription)))")
            case .multipleIdentifiersFound:
                // Prompt the user to resolve the cloud identifier that matched multiple assets.
                print("Found multiple local identifiers for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
                // if let selectedLocalIdentifier = promptUserForPotentialReplacement(with: thisError.userInfo[PHLocalIdentifiersErrorKey]) {
                //     localIDs.append(selectedLocalIdentifier)
                // }
            default:
                print("Encountered an unexpected error looking up the local identifier for \(cloudIdentifier). \(String(describing: thisError?.localizedDescription))")
            }
        }
    }
    return localIDs
}
I'm using MusicKit on iOS 15 in my app. I want to detect when the user changes the player's playback state while the app is in the background.
I am using AVFoundation for live camera view. I can get my device from the current video input (of type AVCaptureDeviceInput) like:
let device = videoInput.device
The device's active format has an isPortraitEffectSupported property. How can I turn the Portrait Effect on and off in the live camera view?
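For reference, this is how far I've gotten with the check (a sketch; these are the iOS 15 properties as I read the headers):
import AVFoundation
// Check whether the active format supports the Portrait effect. The
// isPortraitEffectActive property appears to be read-only, reflecting the
// user's Control Center setting; I can't find a programmatic setter.
func logPortraitSupport(for device: AVCaptureDevice) {
    if device.activeFormat.isPortraitEffectSupported {
        print("Portrait active:", device.isPortraitEffectActive)
    }
}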
I set up the camera like this:
private var videoInput: AVCaptureDeviceInput!
private let session = AVCaptureSession()
private(set) var isSessionRunning = false
private var renderingEnabled = true
private let videoDataOutput = AVCaptureVideoDataOutput()
private let photoOutput = AVCapturePhotoOutput()
private(set) var cameraPosition: AVCaptureDevice.Position = .front
func configureSession() {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        if strongSelf.setupResult != .success {
            return
        }
        let defaultVideoDevice: AVCaptureDevice? = strongSelf.videoDeviceDiscoverySession.devices.first(where: { $0.position == strongSelf.cameraPosition })
        guard let videoDevice = defaultVideoDevice else {
            print("Could not find any video device")
            strongSelf.setupResult = .configurationFailed
            return
        }
        do {
            strongSelf.videoInput = try AVCaptureDeviceInput(device: videoDevice)
        } catch {
            print("Could not create video device input: \(error)")
            strongSelf.setupResult = .configurationFailed
            return
        }
        strongSelf.session.beginConfiguration()
        strongSelf.session.sessionPreset = AVCaptureSession.Preset.photo
        // Add a video input.
        guard strongSelf.session.canAddInput(strongSelf.videoInput) else {
            print("Could not add video device input to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        strongSelf.session.addInput(strongSelf.videoInput)
        // Add a video data output.
        if strongSelf.session.canAddOutput(strongSelf.videoDataOutput) {
            strongSelf.session.addOutput(strongSelf.videoDataOutput)
            strongSelf.videoDataOutput.videoSettings = [kCVPixelBufferPixelFormatTypeKey as String: Int(kCVPixelFormatType_32BGRA)]
            strongSelf.videoDataOutput.setSampleBufferDelegate(self, queue: strongSelf.dataOutputQueue)
        } else {
            print("Could not add video data output to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        // Add a photo output.
        if strongSelf.session.canAddOutput(strongSelf.photoOutput) {
            strongSelf.session.addOutput(strongSelf.photoOutput)
            strongSelf.photoOutput.isHighResolutionCaptureEnabled = true
        } else {
            print("Could not add photo output to the session")
            strongSelf.setupResult = .configurationFailed
            strongSelf.session.commitConfiguration()
            return
        }
        strongSelf.session.commitConfiguration()
    }
}
func prepareSession(completion: @escaping (SessionSetupResult) -> Void) {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        switch strongSelf.setupResult {
        case .success:
            strongSelf.addObservers()
            if strongSelf.photoOutput.isDepthDataDeliverySupported {
                strongSelf.photoOutput.isDepthDataDeliveryEnabled = true
            }
            if let photoOrientation = AVCaptureVideoOrientation(interfaceOrientation: interfaceOrientation) {
                if let unwrappedPhotoOutputConnection = strongSelf.photoOutput.connection(with: .video) {
                    unwrappedPhotoOutputConnection.videoOrientation = photoOrientation
                }
            }
            strongSelf.dataOutputQueue.async {
                strongSelf.renderingEnabled = true
            }
            strongSelf.session.startRunning()
            strongSelf.isSessionRunning = strongSelf.session.isRunning
            strongSelf.mainQueue.async {
                strongSelf.previewView.videoPreviewLayer.session = strongSelf.session
            }
            completion(strongSelf.setupResult)
        default:
            completion(strongSelf.setupResult)
        }
    }
}
Then I set isPortraitEffectsMatteDeliveryEnabled like this:
func setPortraitAffectActive(_ state: Bool) {
    sessionQueue.async { [weak self] in
        guard let strongSelf = self else { return }
        if strongSelf.photoOutput.isPortraitEffectsMatteDeliverySupported {
            strongSelf.photoOutput.isPortraitEffectsMatteDeliveryEnabled = state
        }
    }
}
However, I don't see any Portrait Effect in the live camera view! Any ideas why?
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item's title. MusicLibraryRequests let you filter on the item's ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.
With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.
In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf) function.
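Concretely, that two-step dance looks something like this (a sketch; I'm writing the filter calls from memory, so treat the exact signatures as assumptions):
import MusicKit
// Fetch library albums by an artist's name via the two-step approach
// described above (Artist lookup first, then membership filter).
func libraryAlbums(byArtistNamed name: String) async throws -> MusicItemCollection<Album> {
    // Step 1: there's no name filter on Artist, so fall back to filter(text:).
    var artistRequest = MusicLibraryRequest<Artist>()
    artistRequest.filter(text: name)
    let artists = try await artistRequest.response().items

    // Step 2: albums that are members of those artists.
    var albumRequest = MusicLibraryRequest<Album>()
    albumRequest.filter(matching: \.artists, memberOf: Array(artists))
    return try await albumRequest.response().items
}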
I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.
I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:
filter(matching: \.title, contains: "Abbey Road")
filter(matching: \.artistName, equalTo: "Between The Buried And Me")
I noticed that filter(text:) is case insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:
filter(matching: \.artistName, equalTo: "Between The Buried And Me", caseSensitive: false)
I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.
My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
I've submitted that last bit as FB10185789.
Thanks for bearing with my walls of text today. Keep up the great work!