Something seems to be wrong with the simulator and the Notification Service Extension target. Xcode 12 beta 6 builds and runs fine on my device, but when I try to run on the simulator the build fails with a framework search path issue: "Framework not found FirebaseABTesting".
This is because the deployment targets of your app and its extensions don't match. Make sure they all inherit from the project's build settings: highlight "iOS Deployment Target" and hit "delete" so the value goes from bold to non-bold, which means the target now inherits it from the project's build settings.
Not a clear explanation from Apple.
Same problem here. Xcode 14 Beta 6, iOS 16 Beta 8, iPhone 12 mini. captureSession.isMultitaskingCameraAccessSupported is always false. :'(
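For reference, here is a minimal sketch of the check being discussed (iOS 16+, AVFoundation). The camera choice is just an example; support for multitasking camera access depends on the hardware and, in some cases, on the multitasking camera access entitlement, so seeing `false` on some devices may be expected rather than a bug.

```swift
import AVFoundation

// Sketch: configure a capture session, then check whether multitasking
// camera access is supported and opt in if it is (iOS 16+).
let session = AVCaptureSession()
session.beginConfiguration()
if let camera = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back),
   let input = try? AVCaptureDeviceInput(device: camera),
   session.canAddInput(input) {
    session.addInput(input)
}
if session.isMultitaskingCameraAccessSupported {
    // Allow capture to continue while the app is multitasking.
    session.isMultitaskingCameraAccessEnabled = true
}
session.commitConfiguration()
```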
Setting currentPlaybackRate on the system music player (MPSystemMusicPlayerController) is still broken and causes a hang when starting playback. iOS 16.4.1 (a). :(
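A minimal sketch of the kind of call being reported, assuming playback of a catalog item via the system music player; the store ID below is a placeholder, not a real track.

```swift
import MediaPlayer

// Sketch: start playback on the system music player and adjust the rate.
let player = MPMusicPlayerController.systemMusicPlayer
player.setQueue(with: ["0000000000"]) // hypothetical catalog store ID
player.play()
player.currentPlaybackRate = 1.5 // reportedly hangs here around playback start on iOS 16.4.1 (a)
```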
I had this issue too. I had two interrelated dependencies, one as a CocoaPod and one as a Swift Package; only one of them was available via SPM (don't ask me why lol). I moved both over to CocoaPods and the issue went away. It's out of my wheelhouse to know exactly why. Sorry.
I'm getting a similar issue when I try to add a track to one of my playlists.
Failed to perform MusicDataRequest.Context(
url: "https://api.music.apple.com/v1/me/library/playlists/p.8Wx63rZTVm1RAxY/tracks",
currentRetryCounts: [.other: 1]
) with MusicDataRequest.Error(
status: 500,
code: 50001,
title: "Upstream Service Error",
detailText: "Unable to update tracks",
id: "LE36LDFW4BHXYHU3GGQPSF563Y",
originalResponse: MusicDataResponse(
data: 146 bytes,
urlResponse: <NSHTTPURLResponse: 0x0000000282a63640>
)
).
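For context, this is roughly the kind of request that produces the error above: a POST to the library-playlist tracks endpoint via MusicDataRequest. The playlist and song IDs are placeholders, and the 500 "Upstream Service Error" comes back from Apple's server, not from anything in the client code.

```swift
import Foundation
import MusicKit

// Sketch: add a catalog song to a library playlist via the Apple Music API.
func addTrack(songID: String, toPlaylistID playlistID: String) async throws {
    let url = URL(string: "https://api.music.apple.com/v1/me/library/playlists/\(playlistID)/tracks")!
    var urlRequest = URLRequest(url: url)
    urlRequest.httpMethod = "POST"
    let body: [String: Any] = ["data": [["id": songID, "type": "songs"]]]
    urlRequest.httpBody = try JSONSerialization.data(withJSONObject: body)

    let request = MusicDataRequest(urlRequest: urlRequest)
    _ = try await request.response() // this is where the 500 / 50001 error surfaces
}
```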
How do we know whether the device supports Lossless Audio, i.e. the setting under Settings -> Music -> Audio Quality -> Lossless? Is there a flag in an Apple SDK that tells us, so we can decide whether to show the Lossless glyph?
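As far as I know the user's Settings toggle itself isn't exposed by a public API, but MusicKit (iOS 16+) does report which audio variants a catalog song offers, which is usually what the Lossless badge is keyed off. A minimal sketch, assuming the song was fetched from the catalog with its audioVariants populated (depending on how you fetch it, you may need to request that property explicitly):

```swift
import MusicKit

// Sketch: show the Lossless glyph if the catalog offers a lossless variant
// of the song. This reflects catalog availability, not the user's setting.
func shouldShowLosslessBadge(for song: Song) -> Bool {
    guard let variants = song.audioVariants else { return false }
    return variants.contains(.lossless) || variants.contains(.highResolutionLossless)
}
```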