We're a tracking company with our own tracking platform, looking for a solution that uses tag devices for animals, similar to AirTags, but that runs on our platform.
Is there a way to allow our platform to interface with the Find My network to get the location data of our tags?
Hey there, I'm trying to display all of a user's albums using the MediaPlayer library. Many albums return nil artwork, but I know artwork exists because it shows up in the default Music app. There doesn't seem to be much rhyme or reason to what shows up and what doesn't: all downloaded albums display artwork, and some cloud album artwork displays as well. Here's the code I'm using to debug this.
import MediaPlayer

// `albums` is assumed to be a property on the enclosing type:
// var albums: [MPMediaItemCollection] = []

let query = MPMediaQuery.albums()
if let albumCollections = query.collections {
    albums = albumCollections
}
for album in albums {
    let artwork = album.representativeItem?.artwork
    print(artwork, artwork?.image(at: CGSize(width: 100, height: 100)))
}
Any help would be greatly appreciated. Thanks!
We develop a video playback app on Apple TV which has the following two features:
Its content browsing screen installs a gesture recognizer for presses on the PlayPause Siri remote button in order to directly launch playback. The gesture recognizer is attached to the content browsing UIViewController's view.
It presents its own custom playback UI with an AVPlayerLayer for the video and supports MPNowPlayingSession in order to publish current playback information and respond to remote commands. It also supports switching between fullscreen and Picture in Picture playback.
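For reference, the press recognizer from the first point is installed along these lines (a minimal sketch; handlePlayPause is an assumed selector name):

// Sketch of the PlayPause press recognizer described above (tvOS).
let playPauseRecognizer = UITapGestureRecognizer(target: self, action: #selector(handlePlayPause))
playPauseRecognizer.allowedPressTypes = [NSNumber(value: UIPress.PressType.playPause.rawValue)]
view.addGestureRecognizer(playPauseRecognizer)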
Both features work fine, i.e. playback is launched when pressing the PlayPause Siri remote button and, during playback, the playback info is properly advertised on other devices and remote commands are triggered as expected.
However, when pressing the PlayPause Siri remote button while the video is playing in PiP, the "pause" remote command is sometimes triggered instead of the .playPause gesture recognizer. The issue may not occur the first time, but it does for subsequent PlayPause presses. Navigating a bit in the app UI seems to help prevent the issue from occurring.
Finally, the issue only occurs if the video is playing. If the video is paused, the PlayPause Siri remote button gesture is always recognized instead of the remote command.
Please note that, before using MPNowPlayingSession (and the corresponding MPRemoteCommandCenter), the app was using the default MPRemoteCommandCenter to support remote commands, and the issue did not occur.
We can't reproduce this issue with the Apple TV app, so there's probably something we're not doing right. Does anyone have a clue?
My app reports a lot of crashes from iOS 18.2 users.
I have been able to narrow down the issue to this line of code:
CGImageDestinationFinalize(imageDestination)
The error is Thread 93: EXC_BAD_ACCESS (code=1, address=0x146318000)
But I have no idea why this suddenly started to crash.
Here is the code of the function:
import Foundation
import ImageIO
import UniformTypeIdentifiers

private func estimateSizeUsingThumbnailMethod(fromImageURL url: URL, imageSettings: ImageSettings) -> (Int, Int) {
    let sourceOptions = [kCGImageSourceShouldCache: false] as CFDictionary
    guard let source = CGImageSourceCreateWithURL(url as CFURL, sourceOptions),
          let imageProperties = CGImageSourceCopyPropertiesAtIndex(source, 0, nil) as? [CFString: Any],
          let imageWidth = imageProperties[kCGImagePropertyPixelWidth] as? CGFloat,
          let imageHeight = imageProperties[kCGImagePropertyPixelHeight] as? CGFloat else {
        return (0, 0)
    }
    let maxImageSize = max(imageWidth, imageHeight)
    let thumbMaxSize = min(2400, maxImageSize) // Use the original size if possible, but cap at 2400; past that we extrapolate from the thumbnail
    let downsampleOptions = [
        kCGImageSourceCreateThumbnailFromImageAlways: true,
        kCGImageSourceCreateThumbnailWithTransform: true,
        kCGImageSourceThumbnailMaxPixelSize: thumbMaxSize as CFNumber,
    ] as CFDictionary
    guard let cgImage = CGImageSourceCreateThumbnailAtIndex(source, 0, downsampleOptions) else {
        DLog("CGImage thumb creation error")
        return (0, 0)
    }
    let data = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(data, UTType.jpeg.identifier as CFString, 1, nil) else {
        DLog("CGImage destination creation error")
        return (0, 0)
    }
    let destinationProperties = [
        kCGImageDestinationLossyCompressionQuality: imageSettings.quality.compressionRatio() // Set the JPEG compression ratio
    ] as CFDictionary
    CGImageDestinationAddImage(imageDestination, cgImage, destinationProperties)
    CGImageDestinationFinalize(imageDestination) // <----- CRASHES HERE with EXC_BAD_ACCESS
    ...
}
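One experiment that might help isolate the problem (purely a guess, not a known fix) is to take the in-memory NSMutableData destination out of the picture and write to a temporary file instead, checking the result of the finalize call rather than ignoring it:

// Hypothetical debugging variant of the snippet above: a file-based
// destination instead of NSMutableData, with the finalize result checked.
let tmpURL = FileManager.default.temporaryDirectory.appendingPathComponent(UUID().uuidString + ".jpg")
guard let fileDestination = CGImageDestinationCreateWithURL(tmpURL as CFURL, UTType.jpeg.identifier as CFString, 1, nil) else {
    return (0, 0)
}
CGImageDestinationAddImage(fileDestination, cgImage, destinationProperties)
let finalized = CGImageDestinationFinalize(fileDestination) // false signals failure instead of silently proceeding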
So far, I'm stuck. Any idea that could help would be greatly appreciated, as I'm afraid this crash will propagate to the official release of iOS 18.2.
I have an app that gets data from Music.app with both the iTunesLibrary and MusicKit.
iTunesLibrary has ITLibArtist.sortName and ITLibAlbum.sortTitle and ITLibAlbum.sortAlbumArtist.
I can’t seem to find an equivalent in MusicKit. How are those properties obtained using MusicKit? Thanks.
FYI, I have filed FB15554956 on this. You may also see my code at https://github.com/bolsinga/itunes_json
I use the iTunes Library framework in one of my apps. Starting with macOS Sequoia 15.1, I can't create the ITLibrary object anymore; it fails with the following error:
Connection to amplibraryd was interrupted. clientName:iTunesLibrary(ITLibraryLoader)
Error connecting to the server via proxy object Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.amp.library.framework" UserInfo={NSDebugDescription=connection to service named com.apple.amp.library.framework}
configure failed: Error Domain=NSCocoaErrorDomain Code=4097 "connection to service named com.apple.amp.library.framework" UserInfo={NSDebugDescription=connection to service named com.apple.amp.library.framework}
I created a new sandboxed macOS app, added the Music folder read permission, and it reproduced the error:
import SwiftUI
import iTunesLibrary

@main
struct ITLibraryLoaderApp: App {
    var body: some Scene {
        WindowGroup {
            ContentView()
        }
    }
}

struct ContentView: View {
    var body: some View {
        Button("Load ITLibrary") {
            loadLibrary()
        }
    }

    func loadLibrary() {
        do {
            let _ = try ITLibrary(apiVersion: "1.0", options: .none)
        } catch {
            print(error)
        }
    }
}
I restarted my developer machine and the Music app, with no luck.
https://api.media.apple.com/v1/feed/exports/song_2024-11-02T16-02/parts?limit=200&offset=400
This is the API used to get parquet file URLs. I need all the URLs in one API call; right now, if I don't provide the limit, it defaults to 100, and the maximum is 200.
How do I get all the records in one call, or at least the count of parquet records in one call?
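As far as I can tell there is no single-call option, so the usual workaround is to page until a request returns fewer items than the limit. A minimal sketch, assuming a response shape of { "data": [ { "url": ... }, ... ] } (the exact shape and field names are assumptions):

import Foundation

struct PartsPage: Decodable {
    struct Part: Decodable { let url: String }
    let data: [Part]
}

// Pages through a feed parts endpoint at the maximum limit of 200,
// collecting every URL until a short page signals the end.
func fetchAllPartURLs(baseURL: URL, developerToken: String) async throws -> [String] {
    var urls: [String] = []
    var offset = 0
    let limit = 200
    while true {
        var components = URLComponents(url: baseURL, resolvingAgainstBaseURL: false)!
        components.queryItems = [
            URLQueryItem(name: "limit", value: String(limit)),
            URLQueryItem(name: "offset", value: String(offset)),
        ]
        var request = URLRequest(url: components.url!)
        request.setValue("Bearer \(developerToken)", forHTTPHeaderField: "Authorization")
        let (data, _) = try await URLSession.shared.data(for: request)
        let page = try JSONDecoder().decode(PartsPage.self, from: data)
        urls.append(contentsOf: page.data.map(\.url))
        if page.data.count < limit { break } // a short page is the last page
        offset += limit
    }
    return urls
}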
Hi,
I need to get the total number of parquet files present in the Apple Music Feed API for songs and artists. There are limit and offset options, but the limit is capped at 200 records and the offset is open-ended.
How do I get the total number of parquet files without querying the Apple Music Feed API multiple times?
Any help regarding this would be appreciated. Thanks!
I'm using iCloud Music Library. I’m using macOS 14.1 (23B74) and iOS 17.1.
I'm using MusicKit to find songs that do not have artwork. On iOS, Song.artwork will be nil for items I know do not have artwork. On macOS, Song.artwork is not nil; however, when the songs are shown in Music.app, they do not have artwork. Is this expected? Alternatively, is there a more correct way to determine that a Song has no artwork?
I have also filed FB13315721.
Thank you for any tips!
I am attempting to do batch transcription of audio files exported from Voice Memos, and I am running into an interesting issue. If I only transcribe a single file, it works every time, but if I try to batch them, only the last one works and the others fail with "No speech detected". I assumed it must be something about concurrency, so I implemented what I think should remove any chance of transcriptions running in parallel. With a mocked-up unit of work, everything looked good. So I added the transcription back in, and:
1: It still fails on all but the last file. This happens whether I am processing 10 files or just 2.
2: It no longer processes in order; any file can be the last one that succeeds. And it seems unrelated to file size: I have had paragraph-sized notes finish last, but also a single short sentence.
I left the mocked processFile() in for reference.
Any insights would be greatly appreciated.
import Speech
import SwiftUI

struct ContentView: View {
    @State private var processing: Bool = false
    @State private var fileNumber: String?
    @State private var fileName: String?
    @State private var files: [URL] = []

    let locale = Locale(identifier: "en-US")
    let recognizer: SFSpeechRecognizer?

    init() {
        self.recognizer = SFSpeechRecognizer(locale: self.locale)
    }

    var body: some View {
        VStack {
            if files.count > 0 {
                ZStack {
                    ProgressView()
                    Text(fileNumber ?? "-")
                        .bold()
                }
                Text(fileName ?? "-")
            } else {
                Image(systemName: "folder.badge.minus")
                Text("No audio files found")
            }
        }
        .onAppear {
            files = getFiles()
            Task {
                await processFiles()
            }
        }
    }

    private func getFiles() -> [URL] {
        do {
            let documentsURL = FileManager.default.urls(for: .documentDirectory, in: .userDomainMask).first!
            let path = documentsURL.appendingPathComponent("Voice Memos").absoluteURL
            let contents = try FileManager.default.contentsOfDirectory(at: path, includingPropertiesForKeys: nil, options: [])
            let files = (contents.filter { $0.pathExtension == "m4a" }).sorted { url1, url2 in
                url1.path < url2.path
            }
            return files
        } catch {
            print(error.localizedDescription)
            return []
        }
    }

    private func processFiles() async {
        var fileCount = files.count
        for file in files {
            fileNumber = String(fileCount)
            fileName = file.lastPathComponent
            await processFile(file)
            fileCount -= 1
        }
    }

//    private func processFile(_ url: URL) async {
//        let seconds = Double.random(in: 2.0...10.0)
//        await withCheckedContinuation { continuation in
//            DispatchQueue.main.asyncAfter(deadline: .now() + seconds) {
//                continuation.resume()
//                print("\(url.lastPathComponent) \(seconds)")
//            }
//        }
//    }

    private func processFile(_ url: URL) async {
        let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
        recognitionRequest.requiresOnDeviceRecognition = false
        recognitionRequest.shouldReportPartialResults = false
        await withCheckedContinuation { continuation in
            recognizer?.recognitionTask(with: recognitionRequest) { (transcriptionResult, error) in
                guard transcriptionResult != nil else {
                    print("\(url.lastPathComponent.uppercased())")
                    print(error?.localizedDescription ?? "")
                    return
                }
                if ((transcriptionResult?.isFinal) == true) {
                    if let finalText: String = transcriptionResult?.bestTranscription.formattedString {
                        print("\(url.lastPathComponent.uppercased())")
                        print(finalText)
                    }
                }
            }
            continuation.resume()
        }
    }
}
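For what it's worth, in the code above the continuation is resumed immediately after recognitionTask(with:) is started rather than from inside its completion handler, so every await returns at once and the requests effectively still run concurrently. A sketch of the serialized variant (same names as above; it assumes the handler fires exactly once because partial results are disabled):

private func processFile(_ url: URL) async {
    let recognitionRequest = SFSpeechURLRecognitionRequest(url: url)
    recognitionRequest.requiresOnDeviceRecognition = false
    recognitionRequest.shouldReportPartialResults = false
    await withCheckedContinuation { continuation in
        recognizer?.recognitionTask(with: recognitionRequest) { (transcriptionResult, error) in
            // Resume from inside the handler, once a final result or error arrives.
            if let result = transcriptionResult, result.isFinal {
                print(url.lastPathComponent.uppercased())
                print(result.bestTranscription.formattedString)
                continuation.resume()
            } else if error != nil {
                print(url.lastPathComponent.uppercased())
                print(error?.localizedDescription ?? "")
                continuation.resume()
            }
        }
    }
}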
I'm very excited about the new MusicLibrary API, but after a couple of days of playing around with it, I have to say that I find the implementation of filtering MusicLibraryRequests a little confusing. MPMediaQuery has a fairly extensive list of predicates that can be applied, including string and persistentID comparisons for artist, album artist, genre, and more. It also lets you filter on an item's title. MusicLibraryRequests let you filter on the item's ID, or on its MusicKit Artist and Genre relationships. To me, this seems like it adds an extra step.
With an MPMediaQuery, if I wanted to fetch every album by a given artist, I’d apply an MPMediaPropertyPredicate looking at MPMediaItemPropertyAlbumArtist and compare the string. It was also easy to change the MPMediaPredicateComparison to .contains to match more widely. If I wanted to surface albums by “Aesop Rock” or “Aesop Rock & Blockhead,” I could use that.
In the MusicLibraryRequest implementation, it looks like I need to perform a MusicLibraryRequest<Artist> first in order to get the Artist objects. There’s no filter for the name property, so if I don’t have their IDs, I’ve got to use filter(text:). From there, I can take the results of that request and apply them to my MusicLibraryRequest<Album> using the filter(matching:memberOf) function.
I could use filter(text:) on the MusicLibraryRequest<Album>, but that filters across multiple properties (title and artistName?) and is less precise than defining the actual property I want to match against.
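Spelled out, that two-step flow looks roughly like this (a sketch using the filter(text:) and filter(matching:memberOf:) calls named above; treat the exact key path and signatures as assumptions):

import MusicKit

// Fetch library albums by artist name via the intermediate Artist request.
func libraryAlbums(byArtistNamed name: String) async throws -> MusicItemCollection<Album> {
    var artistRequest = MusicLibraryRequest<Artist>()
    artistRequest.filter(text: name) // no dedicated name filter, so a broad text match
    let artists = try await artistRequest.response().items

    var albumRequest = MusicLibraryRequest<Album>()
    albumRequest.filter(matching: \.artists, memberOf: Array(artists)) // key path assumed
    return try await albumRequest.response().items
}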
I think my ideal version of the MusicLibraryRequest API would offer something like filter(matching:equalTo:) or filter(matching:contains:) that worked off of KeyPaths rather than relationships. That seems more intuitive to me. I’m not saying we need every property from every filterable MPMediaItemProperty key, but I’d love to be able to do it on title, artistName, and other common metadata. That might look something like:
filter(matching: \.title, contains: "Abbey Road")
filter(matching: \.artistName, equalTo: "Between The Buried And Me")
I noticed that filter(text:) is case insensitive, which is awesome, and something I’ve wanted for a long time in MPMediaPropertyPredicate. As a bonus, it would be great if a KeyPath based filter API supported a case sensitivity flag. This is less of a problem when dealing with Apple Music catalog content, but users’ libraries are a harsh environment, and you might have an artist “Between The Buried And Me” and one called “Between the Buried and Me.” It would be great to get albums from both with something like:
filter(matching: \.artistName, equalTo: "Between The Buried And Me", caseSensitive: false)
I've submitted the above as FB10185685. I also submitted another feedback this morning regarding filter(text:) and repeating text as FB10184823.
My last wishlist item for this API (for the time being!) is exposing the MPMediaItemPropertyAlbumPersistentID as an available filter attribute. I know, I know… hear me out. If you take a look at the other thread I made today, you’ll see that due to missing metadata in MusicKit, I still have some use cases where I need to be able to reference an MPMediaItem and might need to fetch its containing MPMediaItemCollection to get at other tracks on the album. It would be nice to seamlessly be able to fetch the MPMediaItemCollection or the library Album using a shared identifier, especially when it comes to being able to play the album in MusicKit’s player rather than Media Player’s.
I've submitted that last bit as FB10185789.
Thanks for bearing with my walls of text today. Keep up the great work!
I'm creating an app that listens to another app's sound; in this use case, screen data is not needed.
But if I don't call SCStream's addStreamOutput(_:type:sampleHandlerQueue:) with the .screen type, the console shows this error:
[ERROR] _SCStream_RemoteVideoQueueOperationHandlerWithError:701 stream output NOT found. Dropping frame
Currently I'm setting SCStreamConfiguration's minimumFrameInterval to a large value (e.g. 0.1 fps) as a workaround, but it would be good if I could completely disable screen capture for best performance.
Is there any way to disable screen capture and capture only an app's audio?
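For context, the throttling workaround currently looks something like this (a sketch; the content filter and stream output setup are omitted):

import ScreenCaptureKit
import CoreMedia

// Capture audio, but starve the (apparently mandatory) video output
// down to roughly 0.1 fps.
let config = SCStreamConfiguration()
config.capturesAudio = true
config.excludesCurrentProcessAudio = true
config.minimumFrameInterval = CMTime(value: 10, timescale: 1) // one frame every 10 s ≈ 0.1 fps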
So I have been using the iOS 18 beta, and first and foremost I would like to say it's a really great update; I love the timed messages and the car sickness stuff, that's awesome. My only problem with the update is Screen Time. I have Screen Time on my phone, and when I updated to the iOS 18 beta I couldn't claim more screen time anymore. I have my Screen Time set so that I can ignore the limit once I run out, but in the new update I can press the button all I want and it won't give me more screen time. I imagine this is a minor fix, and I would really appreciate it being fixed soon. Other than that small detail: amazing update, love what you're doing over at Apple!
I am calling AVSampleBufferDisplayLayer.flush from a background queue, but this seems to occasionally crash the app. I am calling it from the same queue I pass to - (void)requestMediaDataWhenReadyOnQueue:(dispatch_queue_t)queue usingBlock:(void (^)(void))block;.
My question is: is this API thread-safe, or do I need to call flush from the main thread? Or is there another issue that I am not considering? It seems strange to me that this API would trigger an autolayout pass.
0 CoreFoundation 0x00000001bb384e38 __exceptionPreprocess + 164
1 libobjc.A.dylib 0x00000001b451b8d8 objc_exception_throw + 59
2 CoreAutoLayout 0x00000001d7e09e84 _AssertAutoLayoutOnAllowedThreadsOnly + 327
3 CoreAutoLayout 0x00000001d7e00e60 -[NSISEngine withBehaviors:performModifications:] + 35
4 UIKitCore 0x00000001be58fd40 -[UIView _postMovedFromSuperview:] + 671
5 UIKitCore 0x00000001bd56dfec -[UIView(Internal) _addSubview:positioned:relativeTo:] + 1903
6 UIKitCore 0x00000001bda57ccc -[_UITextLayoutCanvasView textViewportLayoutController:configureRenderingSurfaceForTextLayoutFragment:] + 455
7 UIFoundation 0x00000001c588bc9c __48-[NSTextViewportLayoutController layoutViewport]_block_invoke_4 + 151
8 UIFoundation 0x00000001c5836b50 __80-[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:]_block_invoke + 43
9 UIFoundation 0x00000001c580e158 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke_2 + 535
10 CoreFoundation 0x00000001bb385350 __NSARRAY_IS_CALLING_OUT_TO_A_BLOCK__ + 23
11 CoreFoundation 0x00000001bb3b24dc -[__NSSingleObjectArrayI enumerateObjectsWithOptions:usingBlock:] + 91
12 UIFoundation 0x00000001c580de28 __83-[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:]_block_invoke + 775
13 UIFoundation 0x00000001c57f7504 -[NSTextLayoutManager enumerateTextLayoutFragmentsFromLocation:options:usingBlock:] + 659
14 UIFoundation 0x00000001c57f7264 -[NSTextLayoutManager enumerateViewportElementsFromLocation:options:usingBlock:] + 99
15 UIFoundation 0x00000001c57f6d7c -[NSTextViewportLayoutController layoutViewport] + 1299
16 UIKitCore 0x00000001bd580a3c +[UIView(Animation) performWithoutAnimation:] + 75
17 UIKitCore 0x00000001bd5582d0 -[_UITextLayoutCanvasView layoutSubviews] + 139
18 UIKitCore 0x00000001bd5544c8 -[UIView(CALayerDelegate) layoutSublayersOfLayer:] + 1979
19 QuartzCore 0x00000001bca277fc CA::Layer::layout_if_needed(CA::Transaction*) + 499
20 QuartzCore 0x00000001bca3aeb0 CA::Layer::layout_and_display_if_needed(CA::Transaction*) + 147
21 QuartzCore 0x00000001bca4c234 CA::Context::commit_transaction(CA::Transaction*, double, double*) + 443
22 QuartzCore 0x00000001bca81630 CA::Transaction::commit() + 651
23 MediaToolbox 0x00000001ca8d0da0 videoQueueRemote_SetProperty + 367
24 AVFCore 0x00000001cad191b4 __63-[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:]_block_invoke + 179
25 libdispatch.dylib 0x00000001c299cf88 _dispatch_client_callout + 19
26 libdispatch.dylib 0x00000001c29ac574 _dispatch_lane_barrier_sync_invoke_and_complete + 55
27 AVFCore 0x00000001cad190d0 -[AVSampleBufferVideoRenderer _setContentLayerOnFigVideoQueue:] + 167
28 AVFCore 0x00000001cad14674 -[AVSampleBufferVideoRenderer _createVideoQueue:errorStep:] + 195
29 AVFCore 0x00000001cad14ac8 -[AVSampleBufferVideoRenderer createVideoQueue:] + 55
30 AVFCore 0x00000001cad179cc -[AVSampleBufferVideoRenderer flushWithRemovalOfDisplayedImage:completionHandler:] + 439
31 App 0x0000000102370214 -[AVSampleBufferDisplayLayer flush] + 51
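Given that the backtrace dies in an autolayout assertion, one defensive workaround (not a confirmed fix; whether flush is thread-safe is exactly the open question) is to hop to the main thread first:

// Workaround sketch: flush on the main thread, since the backtrace shows
// CoreAutoLayout asserting on a background thread. `displayLayer` is an
// assumed property name for the AVSampleBufferDisplayLayer.
DispatchQueue.main.async {
    displayLayer.flush()
}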
Hello,
I have a command line application that uses iTunesLibrary to "save" the state of what I have listened to. I have it run every night via a LaunchAgent. You can see the source here: https://github.com/bolsinga/itunes_json
Prior to Sequoia it would run nightly; I'd just have to grant it access to the Music library once, and it would be fine thereafter. However, with Sequoia, it requires UI interaction to grant it access every time. This makes it no longer run unattended overnight, defeating its purpose.
I have the console logs of when this happens. You can see it in my issue tracking it here: https://github.com/bolsinga/itunes_json/issues/410
One thing that makes me wonder is that it is a command line application, not a bundle. How do I make a command line application get access to MusicKit / iTunesLibrary, and keep it thereafter? I'd like to get my pre-Sequoia behavior back. I've filed FB15592660 too.
I've granted it access to run in the background, as well as access to my Music library (please see attached screenshots).
AMPLibraryAgent 10:48:29.489944-0700 xpc Connection from framework client invalidated pid:57606 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:48:29.492763-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1229 previous open:15 new open:1
itunes_json 10:48:59.980864-0700 connection [0x157f05800] activating connection: mach=true listener=false peer=false name=com.apple.amp.library.framework
tccd 10:48:59.982568-0700 access AUTHREQ_ATTRIBUTION: msgID=1795.214, attribution={accessing={TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}, requesting={TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent}, },
tccd 10:48:59.982651-0700 access requestor: TCCDProcess: identifier=com.apple.AMPLibraryAgent, pid=1795, auid=501, euid=501, binary_path=/System/Library/PrivateFrameworks/AMPLibrary.framework/Versions/A/Support/AMPLibraryAgent is checking access for accessor TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:48:59.995636-0700 access AUTHREQ_SUBJECT: msgID=1795.214, subject=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json,
tccd 10:48:59.996283-0700 access -[TCCDAccessIdentity staticCode]: static code for: identifier /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json, type: 1: 0xc00341b00 at /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json
tccd 10:49:00.018205-0700 access Failed to match existing code requirement for subject /Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json and service kTCCServiceMediaLibrary
cdhash H"6bc380972f4df49b337a2a05308fb7b98fbe6473" or cdhash H"0708bcaabbfbab8770522050f7e2642d4d864f31"
tccd 10:49:00.018997-0700 access AUTHREQ_PROMPTING: msgID=1795.214, service=kTCCServiceMediaLibrary, subject=Sub:{/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json}Resp:{TCCDProcess: identifier=itunes_json, pid=57652, auid=501, euid=501, binary_path=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json},
AMPLibraryAgent 10:49:02.489170-0700 xpc ampld> register framework ClientName:iTunesLibrary(itunes_json)
tccd 10:49:02.488189-0700 events Publishing <TCCDEvent: type=Create, service=kTCCServiceMediaLibrary, identifier_type=Path, identifier=/Users/bolsinga/Applications/itunes_json/Products/usr/local/bin/itunes_json> to 4 subscribers: {
633 = "<TCCDEventSubscriber: token=633, state=Initial, csid=(null)>";
628 = "<TCCDEventSubscriber: token=628, state=Passed, csid=com.apple.chronod>";
464 = "<TCCDEventSubscriber: token=464, state=Passed, csid=com.apple.cloudd>";
513 = "<TCCDEventSubscriber: token=513, state=Passed, csid=com.apple.photolibraryd>";
}
AMPLibraryAgent 10:49:02.490391-0700 xpc ampld> registered framework ClientName:iTunesLibrary(itunes_json) with clientID:1230
itunes_json 10:49:02.792084-0700 connection [0x147e04340] activating connection: mach=true listener=false peer=false name=com.apple.amp.artworkd
itunes_json 10:49:02.801482-0700 <Missing Description> openDatabase 0xe4af30f4493e5ef5 artwork folder Y '<private>'
itunes_json 10:49:02.805087-0700 <Missing Description> openDatabase 0xf2db6e8d7672edc9 artwork folder Y '<private>'
itunes_json 10:49:02.806736-0700 <Missing Description> openDatabase 0xfb2acd898c951851 artwork folder Y '<private>'
itunes_json 10:49:02.813286-0700 <Missing Description> openDatabase 0xf0f4919c5ff0e88 artwork folder Y '<private>'
itunes_json 10:49:09.634928-0700 connection [0x600002b6a0d0] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.daemon
itunes_json 10:49:09.635019-0700 connection [0x600002b78000] activating connection: mach=true listener=false peer=false name=com.apple.cfprefsd.agent
AMPLibraryAgent 10:49:12.382878-0700 xpc Connection from framework client invalidated pid:57652 clientname:iTunesLibrary(itunes_json)
AMPLibraryAgent 10:49:12.383474-0700 service Unloading domains(14) for ClientID:iTunesLibrary(itunes_json)-1230 previous open:15 new open:1
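One observation from the log above, offered as a guess: tccd reports "Failed to match existing code requirement" for the binary and then falls back to AUTHREQ_PROMPTING, which is what happens when a binary's code signature (e.g. an ad-hoc cdhash) changes between builds, invalidating the previously granted kTCCServiceMediaLibrary entry. Signing the tool with a stable identity gives TCC a designated requirement that survives rebuilds, e.g. (identifier and path are placeholders):

codesign --force --identifier com.example.itunes_json --sign "Developer ID Application: (your identity)" /path/to/itunes_json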
I am building an app for macOS and I am trying to implement the code to add songs to a library playlist (included below). The issue I am having is that if I use MusicKit to load a user's library playlists, the ID for the playlist (which is just a string of numbers) does not work with the Add Tracks to a Library Playlist endpoint of the Apple Music API. If I retrieve the playlists from the Apple Music API and use that playlist ID (which is different from the ID I get from MusicKit), my code works fine and adds the song to the playlist. The problem with getting a user's library playlists from the Apple Music API is that it does not give me all of the library playlists that I get when using MusicKit, and it also does not give me artwork for playlists that have the collage of album covers, so I would prefer to use MusicKit to get the playlists.
I have also tested retrieving a single playlist using the Apple Music API with the playlist ID from MusicKit, and it does not work; I get an error that the resource cannot be found. Since this is a macOS app, I cannot use MusicKit to add songs to library playlists.
Does anyone know a way to resolve this, or a possible workaround? Ideally I want to use MusicKit to get the library playlists and have some way to use the playlist ID to add songs to that playlist. Below is my code for adding a song to a playlist using the Apple Music API, which works correctly only if I originally get the library playlist's ID value from a playlist retrieved from the Apple Music API.
Also, does anyone know why the playlist IDs are not universal and are different between MusicKit and the Apple Music API? For songs and tracks it does not seem to matter whether I use MusicKit or the Apple Music API; the IDs are in the correct format for the Apple Music API to use and work with my code. Thanks everyone for any and all help!
import MusicKit
import SwiftUI

// AppleMusicPlaylistPostRequestBody/Item, AlertItem, and the user-token helper are app-defined types.
func addToPlaylist(songs: [Track], playlist: Playlist, alert: Binding<AlertItem?>) async {
    let tracks = AppleMusicPlaylistPostRequestBody(data: songs.compactMap {
        AppleMusicPlaylistPostRequestItem(id: $0.id.rawValue, type: "songs") // or "library-songs"
    })
    let playlistID = playlist.id

    // Build the request URL for adding a song to a playlist
    guard let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks") else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Invalid URL for the playlist.")
        return
    }

    // Authorization Header
    guard let musicUserToken = try? await MusicUserTokenProvider().getUserMusicToken() else {
        alert.wrappedValue = AlertItem(title: "Error", message: "Unable to retrieve Music User Token.")
        return
    }

    do {
        var request = URLRequest(url: url)
        request.httpMethod = "POST"
        request.setValue("Bearer \(musicUserToken)", forHTTPHeaderField: "Authorization")
        request.setValue("application/json", forHTTPHeaderField: "Content-Type")

        let encoder = JSONEncoder()
        let data = try encoder.encode(tracks)
        request.httpBody = data

        let musicRequest = MusicDataRequest(urlRequest: request)
        let musicRequestResponse = try await musicRequest.response()

        // Check if the request was successful (status 201)
        if musicRequestResponse.urlResponse.statusCode == 201 {
            alert.wrappedValue = AlertItem(title: "Success", message: "Song successfully added to the playlist.")
        } else {
            print("Status Code: \(musicRequestResponse.urlResponse.statusCode)")
            print("Response Data: \(String(data: musicRequestResponse.data, encoding: .utf8) ?? "No Data")")

            // Attempt to decode the error response into the AppleMusicErrorResponse model
            if let appleMusicError = try? JSONDecoder().decode(AppleMusicErrorResponse.self, from: musicRequestResponse.data) {
                let errorMessage = appleMusicError.errors.first?.detail ?? "Unknown error occurred."
                alert.wrappedValue = AlertItem(title: "Error", message: errorMessage)
            } else {
                alert.wrappedValue = AlertItem(title: "Error", message: "Failed to add song to the playlist.")
            }
        }
    } catch {
        alert.wrappedValue = AlertItem(title: "Error", message: "Network error: \(error.localizedDescription)")
    }
}
I'm having an issue playing songs fetched with MusicKit, but only on release builds: ApplicationMusicPlayer.shared.prepareToPlay() throws
"The operation couldn’t be completed. (MPMusicPlayerControllerErrorDomain error 6.)"
The authorization status is .authorized
The debug version of the app works perfectly.
Here is my project for reference.
Thank you.
macOS 15.0.1, Xcode 16.0
ColorSync throughout the system interprets the Rec. ITU-R BT.709-5 color profile differently on macOS 15 than it did on macOS 14, leading to severe color differences. This is a breaking, problem-causing color change.
Here's a comparison of one way the results can appear different.
Steps to Reproduce
Open this image above (named here as sRGB_Bars.png) in ColorSync Utility. (This should be an sRGB profiled image if the forums don't mess with that.)
With Digital Color Meter open and set to "Display in sRGB" [0-255], you can see the gray bars progress left-to-right as 0, 23, 46, 69, 92, 115, etc. Those are the correct reference values.
At the bottom of the window in ColorSync use [Match to Profile] [Display -> Rec. ITU-R BT.709-5], then click the Apply Button. You can verify the image now uses the Rec 709 profile with the (i) Get Info button in the toolbar.
Use "Save As…" to save the image with a different name.
Do the steps above on macOS 14 and macOS 15 to create two images: Rec709_CreatedOnMacOS14 and Rec709_CreatedOnMacOS15.
Compare the two images on BOTH operating system versions, and you'll see they are significantly different from each other.
Rec709_CreatedOnMacOS14.png when viewed on macOS 15 has the gray values of:
[0, 1, 18, 43, 67, ...]
Rec709_CreatedOnMacOS15.png when viewed on macOS 15 has the gray values of:
[0, 51, 72, 94, 115, ...]
These are MASSIVE differences.
Significance
This is not just a problem that affects ColorSync Utility or Preview, etc. The same gamma interpretation difference is affecting the Core Video, Core Image, Video Toolbox, etc. pipelines as well. That gets complicated to talk about, and this example is the simplest I can boil it down to.
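As one way to check this outside of ColorSync Utility, the same sRGB-to-Rec. 709 match can be driven from Core Image and the resulting pixel values compared across macOS 14 and 15 (a sketch; the file path is a placeholder):

import CoreImage

// Render the sRGB test image into Rec. 709 via Core Image.
let input = CIImage(contentsOf: URL(fileURLWithPath: "/path/to/sRGB_Bars.png"))!
let rec709 = CGColorSpace(name: CGColorSpace.itur_709)!
let context = CIContext()
let converted = context.createCGImage(input, from: input.extent, format: .RGBA8, colorSpace: rec709)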
Conclusion
Significant bug? What's going on?
My iPhone 15 Plus suddenly turns black and a loading icon keeps spinning. Then it turns back on and I can use it again; each episode only lasts a few seconds.
I have updated to the iOS 18.1 beta; could this be the issue? Is my phone broken?
I have tried restarting my phone.
Hi,
I'm working on an integration with the Apple Music Feed, but over the last day, I've been getting a 500 Upstream Service Error on 99% of the API calls I make.
Remarkably, retrying the same endpoint 20-30 times sometimes gives the correct response, but mostly it's a 500 error.
Just to give an example, this:
GET https://api.media.apple.com/v1/feed/album/latest
Returns this generic error:
{
  "errors": [
    {
      "id": "U4ARRA2QDCGYKYRI2IPEJVBTHY",
      "title": "Upstream Service Error",
      "detail": "Call to get metadata for album feed failed",
      "status": "500",
      "code": "50001"
    }
  ]
}
The same goes for other feeds like song, artist, and so on.
Before today, I did get the same error message sometimes, but a few retries would solve the issue.
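In the meantime, a retry with exponential backoff seems to be the only client-side mitigation (a sketch; names are mine):

import Foundation

// Stopgap: retry a feed request on HTTP 500 with exponential backoff.
func fetchWithRetry(_ request: URLRequest, maxAttempts: Int = 5) async throws -> Data {
    var delay: UInt64 = 500_000_000 // 0.5 s, doubled after each failed attempt
    for _ in 1..<maxAttempts {
        let (data, response) = try await URLSession.shared.data(for: request)
        if (response as? HTTPURLResponse)?.statusCode != 500 {
            return data
        }
        try await Task.sleep(nanoseconds: delay)
        delay *= 2
    }
    return try await URLSession.shared.data(for: request).0 // final attempt
}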
Any insight on what's happening and/or an ETA on fixing it?
Thank you