MPNowPlayingInfoPropertyInternationalStandardRecordingCode is not working in iOS 18 beta 6 with Xcode 16.0 beta.
Steps to reproduce, starting from the "Becoming a now playable app" demo:
1. Set MusicHapticsSupported to YES in Info.plist;
2. Use a song with a correct ISRC:
nowPlayingInfo[MPNowPlayingInfoPropertyInternationalStandardRecordingCode] = metadata.isrc
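One thing worth ruling out before filing a bug is the ISRC string itself. Below is a minimal sanity-check sketch, assuming the standard 12-character ISRC layout (2-letter country code, 3 alphanumeric registrant code, 2-digit year, 5-digit designation code, no dashes); the helper name and the example codes are mine, not from any Apple API:

```swift
import Foundation

// A minimal sketch of an ISRC sanity check, assuming the standard
// 12-character layout: 2-letter country code, 3 alphanumeric registrant
// code, then 7 digits (2-digit year + 5-digit designation), no dashes.
func isLikelyValidISRC(_ isrc: String) -> Bool {
    let pattern = "^[A-Z]{2}[A-Z0-9]{3}[0-9]{7}$"
    return isrc.range(of: pattern, options: .regularExpression) != nil
}

print(isLikelyValidISRC("USRC17607839"))     // true: the ISRC handbook's example code
print(isLikelyValidISRC("US-RC1-76-07839"))  // false: dashed display form, not the stored form
```

If the dashed display form is what ends up in the now-playing dictionary, normalizing it first would be a cheap thing to try.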
Hello,
I am wondering how one can play music videos (with the actual video playing) with the ApplicationMusicPlayer using MusicKit for Swift?
There is not much documentation on this, so any help would be appreciated.
Hello,
I am trying to make use of SCContentSharingPicker for my app and I wonder how I can detect a close event of SCContentSharingPicker.
I could open the picker screen with the following simple code:
SCContentSharingPicker.shared.isActive = true
SCContentSharingPicker.shared.add(self)
SCContentSharingPicker.shared.present()
And I closed it with the "Cancel" button located at the top right corner.
Initially I was expecting to get an event through an observer like the one below, but realised that it's only called when a stream is canceled.
extension ContentPickerButton: SCContentSharingPickerObserver {
    func contentSharingPicker(_ picker: SCContentSharingPicker, didCancelFor stream: SCStream?) {
        logger.info("Picker canceled for stream \(String(describing: stream))")
    }
}
I would like to get a picker close event so that I can deactivate the picker. (Otherwise, the camera icon stays in the menu bar.)
How do we get a close event?
Hello, I noticed that CADisplayLink seems to emit incorrect targetTimestamp and timestamp values in the iOS 18 simulator. If you compute the actual duration of a frame, the result is always negative.
This only occurs in the iOS 18 simulator.
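For reference, the check that goes negative reduces to plain arithmetic on the two timestamps. The values below are made up purely to illustrate the computation, not taken from the simulator:

```swift
// CADisplayLink reports a timestamp (when the last frame was displayed)
// and a targetTimestamp (when the next frame is due), so the derived
// frame duration should always be positive. These values are hypothetical,
// standing in for one ~60 Hz frame.
let timestamp: Double = 1000.0000
let targetTimestamp: Double = 1000.0167

let frameDuration = targetTimestamp - timestamp
print(frameDuration > 0)  // true for a sane timestamp pair
```

On the iOS 18 simulator, the reported pair apparently yields a negative difference, which would mean targetTimestamp precedes timestamp.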
Can an enterprise app communicate with the eSIM of an iPhone through the LPA module or an OMA channel? How do we connect to the LPA or OMA, and is it a paid service?
Having a focused sample Xcode project which demonstrates the issue you are facing is critical to Developer Technical Support engineers being able to assist you. Based on your answers to the questions above, you’ll need to collect additional information before submitting a code-level support request. If you’re unable to provide a sample Xcode project, or are unsure how to proceed, please ask your question in the Apple Developer Forums.
I've uninstalled and reinstalled VLC media player multiple times. However, I receive the following crash report when opening the app after each installation:
Translated Report (Full Report Below)
Process: VLC [71778]
Path: /Applications/VLC2.app/Contents/MacOS/VLC
Identifier: org.videolan.vlc
Version: 3.0.21 (3.0.21)
Code Type: ARM-64 (Native)
Parent Process: launchd [1]
User ID: 501
Date/Time: 2024-08-14 12:27:24.3563 -0400
OS Version: macOS 14.6.1 (23G93)
Report Version: 12
Anonymous UUID: 9DDE1CE7-A635-1165-0FE9-04EA599A542F
Sleep/Wake UUID: E22A843E-7A51-414F-BA7F-AB35B1674915
Time Awake Since Boot: 300000 seconds
Time Since Wake: 267959 seconds
System Integrity Protection: enabled
Crashed Thread: 0 Dispatch queue: com.apple.main-thread
Exception Type: EXC_BAD_ACCESS (SIGSEGV)
Exception Codes: KERN_INVALID_ADDRESS at 0x0000000102af4ae8
Exception Codes: 0x0000000000000001, 0x0000000102af4ae8
Termination Reason: Namespace SIGNAL, Code 11 Segmentation fault: 11
Terminating Process: exc handler [71778]
VM Region Info: 0x102af4ae8 is not in any region. Bytes after previous region: 2793 Bytes before following region: 783640
REGION TYPE START - END [ VSIZE] PRT/MAX SHRMOD REGION DETAIL
__LINKEDIT 102ae8000-102af4000 [ 48K] r--/rwx SM=COW /Applications/VLC2.app/Contents/Frameworks/Breakpad.framework/Versions/A/Resources/breakpadUtilities.dylib
---> GAP OF 0xc0000 BYTES
__TEXT 102bb4000-102c78000 [ 784K] r-x/rwx SM=COW /Applications/VLC2.app/Contents/MacOS/lib/libvlccore.9.dylib
Application Specific Information:
*** multi-threaded process forked ***
crashed on child side of fork pre-exec
Thread 0 Crashed:: Dispatch queue: com.apple.main-thread
0 libsystem_trace.dylib 0x193d8b0c0 _os_log_preferences_refresh + 68
1 libsystem_trace.dylib 0x193d8bb20 os_log_type_enabled + 712
2 CoreFoundation 0x1940da800 _CFBundleCopyPreferredLanguagesInList + 516
3 CoreFoundation 0x1940e75a4 _CFBundleCopyLanguageSearchListInBundle + 124
4 CoreFoundation 0x1940e738c _copyQueryTable + 64
5 CoreFoundation 0x1940e6d5c _copyResourceURLsFromBundle + 376
6 CoreFoundation 0x1940e6118 _CFBundleCopyFindResources + 1400
7 CoreFoundation 0x1940e5b90 CFBundleCopyResourceURL + 56
8 CoreAudio 0x1966c3b58 HALSystem::InitializeShell() + 1412
9 CoreAudio 0x1966c3274 HALSystem::CheckOutInstance() + 192
10 CoreAudio 0x19693360c AudioObjectSetPropertyData_mac_imp + 116
11 libauhal_plugin.dylib 0x10290915c 0x102904000 + 20828
12 VLC 0x1025df4dc 0x1025d8000 + 29916
13 dyld 0x193cb3154 start + 2476
Thread 0 crashed with ARM Thread State (64-bit):
x0: 0x00000001fbe8cfec x1: 0x0000000193da0985 x2: 0x0000000001000104 x3: 0x0000000000000000
x4: 0x0000000193da0937 x5: 0x000000016d826500 x6: 0x0000000000000074 x7: 0x0000000000000000
x8: 0x0000000102af4ae6 x9: 0x00000001fbe97610 x10: 0x0000000000000001 x11: 0x0000000143909730
x12: 0x0000000000000001 x13: 0x000000016d8266f0 x14: 0xaaaaaaaaaaaaaaaa x15: 0x0000000193da01db
x16: 0x0000000193ffd7d4 x17: 0x000000020658e3e0 x18: 0x0000000000000000 x19: 0x0000000143909700
x20: 0x0000000143909700 x21: 0x0000000102af4aea x22: 0x0000000102af4aea x23: 0x0000000143d069f0
x24: 0x0000000143d075a0 x25: 0x0000000000000016 x26: 0x0000000000000000 x27: 0x0000000143d07c60
x28: 0x0000000143d06af0 fp: 0x000000016d826a30 lr: 0x0000000193d8b0a4
sp: 0x000000016d8269e0 pc: 0x0000000193d8b0c0 cpsr: 0x20001000
far: 0x0000000102af4ae8 esr: 0x92000007 (Data Abort) byte read Translation fault
The scenario is quite simple:
run an application which uses [SCShareableContent getShareableContentExcludingDesktopWindows] and invoke captureImageWithFilter in the completionHandler.
delay invoking captureImageWithFilter for several seconds and switch the user session before calling it.
WindowServer crashes if the app runs in an inactive session.
How can we manage this issue correctly? Is there any way to avoid this crash?
ApplicationMusicPlayer is available on the Mac! 🎉🎉🎉 Enormous thanks to @JoeKun and the team. I've already gotten my app up and running through Catalyst, and I've successfully played music! I also got some timeouts, but that was happening on my phone a lot that day too, so maybe my local CDN was just having a bad day.
I wanted to ask this question in a lab this week, but the timing didn't work out: Do you expect the experience to be the same using ApplicationMusicPlayer on a Catalyst vs a macOS target? I'm hoping to reuse much of my iPad app and go the Catalyst route, but I wanted to double check that the new support wasn't just for macOS.
The FxRemoteWindow API in FxPlug 4.3 does not provide window.frame.origin.
How can we create a window whose position and size can be set, layered above Motion's own window?
Hello, I have a few apps that I use for screen recording/streaming like OBS as well as capturing screens to project into a VR headset (Immersed), and they use ScreenCaptureKit to record the full displays/all content.
But when capturing that display, some application windows or UI elements are not captured. For example, in Microsoft Teams, when you begin a screen sharing session you get a control-bar overlay to manage sharing options (stop, start, etc.); that particular element does not get captured by the recording app (though other MS Teams windows and all the other applications on screen do). Another app with this problem is the CleanShot X screen capturing app, whose overlay UI elements don't get captured but are still on the physical screen. When using Mac displays in VR, this causes an issue where you can't see these CleanShot controls, yet they are there and intercepting mouse clicks/input.
What would cause only certain elements to not be captured when the recording app tells ScreenCaptureKit not to exclude anything, and is there a property on these UI elements that a developer can "opt in" to so that SCK picks them up? I am trying to figure out what feedback to give to developers of programs that have this issue, and whether it's possible for them to modify their apps to change this behavior.
Thanks!
Hello,
We have a music app that reads MPMediaItem.
We get items using MPMediaQuery, but we realized that some tracks downloaded from Apple Music were fetched too. Not all downloaded tracks, only those that were played recently.
Of course, since these tracks are protected with DRM, we can't play them in our player.
It's odd to get them in our query, because we added predicates in order not to fetch protected assets and iCloud items:
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyHasProtectedAsset)
MPMediaPropertyPredicate(value: false, forProperty: MPMediaItemPropertyIsCloudItem)
To be sure, we perform a second check on each item we've fetched:
extension MPMediaItem {
    public func isValid() -> Bool {
        return self.assetURL != nil && !self.isCloudItem && !self.hasProtectedAsset
    }
}
But we still get these items. Their hasProtectedAsset attribute always returns false.
I don't know if it's a bug, but since we can't detect these items as Apple Music downloaded tracks, we can't either:
filter them to not add them in our application library
OR
switch on a MPMusicPlayerController.applicationMusicPlayer to allow the user to play them
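For reference, the filtering logic itself can be exercised over a plain value type. `TrackInfo` below is a hypothetical stand-in for the MPMediaItem fields involved (the real items only exist on-device); it shows how a track whose hasProtectedAsset is misreported as false slips through the check described above:

```swift
import Foundation

// Hypothetical stand-in for the MPMediaItem fields the check relies on.
struct TrackInfo {
    let assetURL: URL?
    let isCloudItem: Bool
    let hasProtectedAsset: Bool

    // Mirrors the isValid() extension from the post.
    var isPlayable: Bool {
        assetURL != nil && !isCloudItem && !hasProtectedAsset
    }
}

// A DRM-protected Apple Music download whose hasProtectedAsset is
// incorrectly reported as false passes the filter, which is exactly
// the problem described: the item looks playable but isn't.
let misreported = TrackInfo(assetURL: URL(string: "ipod-library://item/item.m4a"),
                            isCloudItem: false,
                            hasProtectedAsset: false)
print(misreported.isPlayable)  // true, despite the track being unplayable
```

In other words, as long as the framework reports hasProtectedAsset as false for these tracks, no predicate or post-filter built on that property can exclude them.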
I downloaded iOS 18, but in Switzerland we have TWINT payments and they don't work. Thank you very much.
During testing, I encountered an issue with SharePlay. Since SharePlay necessitates multi-device testing, I intend to use my Mac and Vision Pro for testing. However, since these two devices are also my primary devices, I am reluctant to switch Apple IDs for testing purposes. Instead, I would like to test with my original Apple ID. However, since both devices belong to the same Apple ID and rely on the same phone number, they are unable to FaceTime each other. I am at a loss as to how to proceed.
- (BOOL)renderDestinationImage:(FxImageTile *)destinationImage
                  sourceImages:(NSArray<FxImageTile *> *)sourceImages
                   pluginState:(NSData *)pluginState
                        atTime:(CMTime)renderTime
                         error:(NSError * _Nullable *)outError
{
    // ... other code ...
    id sourceTexture = [sourceImages[0] metalTextureForDevice:[deviceCache deviceWithRegistryID:deviceRegistryID]];
    // ... other code ...

    // Clean up
    [commandEncoder endEncoding];
    [commandBuffer commit];
    [commandBuffer waitUntilScheduled];
    [colorAttachmentDescriptor release];
    [deviceCache returnCommandQueueToCache:commandQueue];

    self.texture = [sourceImages[0] metalTextureForDevice:[deviceCache deviceWithRegistryID:deviceRegistryID]];
    return YES;
}
When I render self.texture into an MTKView, the result displayed in Motion is very blurry.
Since updating to iOS 18, my Screen Time passcode does not work and I am unable to give myself more time on apps. I am also unable to modify any of the Screen Time settings; if I go to Settings and tap Screen Time, the Settings app freezes and I need to close the window.
I have checked for updates and I have reset my phone to default settings. Still not working, help!
It seems that there’s still no way to get all TIFF tags from a TIFF image, is that right? I've got these GeoTIFF images that have a handful of specialized TIFF tags in them. Calling CGImageSourceCopyPropertiesAtIndex(), I can see basic properties common to all TIFF images, like dimensions and color/pixel information, but no others.
Short of including libtiff, is there another way to get at the metadata? I've tried all of the options in CGImageSourceCopyAuxiliaryDataInfoAtIndex.
I've written a few bugs about this since 2020, all ignored.
Under Sonoma 14.4 the compression option doesn't work with PNG images. It works for JPG/HEIF. Preview can export a PNG file to HEIC with a compression option. What am I missing? This worked previously. I am trying with 0.01 and 0.9 as the compression quality, and the file size is the same for PNG.
Is Preview using some trick to convert the image, such as ciContext.createCGImage?
PS: A compression quality of 1.0 was broken under 14.4 RC, and Preview created an empty file.
func heifImageDataUsingDestination(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let imageSource = CGImageSourceCreateWithURL(url as CFURL, nil) else { return nil }
    guard let cgImage = CGImageSourceCreateImageAtIndex(imageSource, 0, nil) else { return nil }
    let mutableData = NSMutableData()
    guard let imageDestination = CGImageDestinationCreateWithData(mutableData as CFMutableData, "public.heic" as CFString, 1, nil) else { return nil }
    let options = [kCGImageDestinationLossyCompressionQuality: compressionQuality] as CFDictionary
    CGImageDestinationAddImage(imageDestination, cgImage, options)
    let success = CGImageDestinationFinalize(imageDestination)
    if success {
        return mutableData as Data
    }
    return nil
}
func heifImageDataUsingCIContext(at url: URL, compressionQuality: CGFloat) -> Data? {
    guard let ciImage = CIImage(contentsOf: url) else { return nil }
    let context = CIContext()
    let colorspace = ciImage.colorSpace ?? CGColorSpaceCreateDeviceRGB()
    let options = [CIImageRepresentationOption(rawValue: kCGImageDestinationLossyCompressionQuality as String): compressionQuality]
    return context.heifRepresentation(of: ciImage, format: .RGBA8, colorSpace: colorspace, options: options)
}
I'm trying to set a specific start time for the song using ApplicationMusicPlayer.shared.playbackTime, but it is not working:
musicPlayer.playbackTime = 10
try await musicPlayer.prepareToPlay()
try await musicPlayer.play()
Good morning,
I'm trying to use MusicKit functionality to get the last played songs and put them into a local DB, to be played later. Following the guide on developer.apple.com, I created the required App Services integration:
Below is a minimal working version of what I'm doing:
func requestMusicAuthorization() async {
    let status = await MusicAuthorization.request()
    switch status {
    case .authorized:
        isAuthorizedForMusicKit = true
        error = nil
    case .restricted:
        error = NSError(domain: "Music access is restricted", code: -10)
    case .notDetermined:
        break
    case .denied:
        error = NSError(domain: "Music access is denied", code: -10)
    @unknown default:
        break
    }
}
On the SwiftUI ContentView there's something like this:
.onAppear {
    Task {
        await requestMusicAuthorization()
        if MusicManager.shared.isAuthorizedForMusicKit {
            do {
                let request = MusicRecentlyPlayedRequest<Song>()
                let response = try await request.response()
                let songs: [Song] = response.items.map { $0 }
                // do some CloudKit handling with songs...
                print("Recent songs: \(songs)")
            } catch {
                NSLog(error.localizedDescription)
            }
        }
    }
}
Everything seems to work fine, but my console log is full of garbage like this:
MSVEntitlementUtilities - Process MyMusicApp PID[33633] - Group: (null) - Entitlement: com.apple.accounts.appleaccount.fullaccess - Entitled: NO - Error: (null)
Attempted to register account monitor for types client is not authorized to access: {(
"com.apple.account.iTunesStore"
)}
Is there something I'm missing? Should I ignore that and go forward with my implementation? Any help is really appreciated.