Integrate photo, audio, and video content into your apps.

Posts under Media tag

73 Posts

Has anyone had issues with Voice Control and high-quality visuals on the Vision Pro?
We are working on an app for the Vision Pro that has a high polygon count and lots of high-resolution textures. Everything looks smooth and, in general, very good. The issue is that the moment we turn on Voice Control, even if it is not being used, the visuals at the center start to stutter left to right. Has anyone seen this? It must be a bug. Any workaround? Thanks, Guillermo
Replies: 1 · Boosts: 0 · Views: 267 · Activity: Oct ’24
iPad native camera vs AVFoundation device
We are trying to build a video recording app using AVFoundation and AVCaptureDevice. No custom settings such as ISO or exposure duration are used; all settings are left on auto. But when video is captured using the front camera at 1080x1920, the video captured from the app and the video from the native front camera do not match. In Settings I have set video recording to 30 fps at 1080x1920. The video captured from the app covers a wider area than the native app, and some values such as ISO and exposure duration do not match either. Is there any way to capture video exactly like the native camera using AVFoundation and AVCaptureDevice? I have attached screenshots from the videos for reference. Native AVCapture
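One thing worth comparing (a sketch under assumptions, not a confirmed fix): the native Camera app may pick a different AVCaptureDevice.Format than the session default, and the chosen format determines the field of view. Something like the following makes the format selection explicit; choosing by field of view is an illustrative criterion, not Apple guidance:

```swift
import AVFoundation

// Sketch under assumptions: enumerate the device's 1920x1080 formats and pick
// one explicitly, since the session's default format may differ from the one
// the native Camera app uses (and therefore show a different field of view).
func selectWidestHDFormat(on device: AVCaptureDevice) throws {
    let hdFormats = device.formats.filter { format in
        let dims = CMVideoFormatDescriptionGetDimensions(format.formatDescription)
        return dims.width == 1920 && dims.height == 1080
    }
    guard let widest = hdFormats.max(by: { $0.videoFieldOfView < $1.videoFieldOfView }) else { return }
    try device.lockForConfiguration()
    device.activeFormat = widest
    // Pin the frame rate to the 30 fps configured in Settings.
    device.activeVideoMinFrameDuration = CMTime(value: 1, timescale: 30)
    device.activeVideoMaxFrameDuration = CMTime(value: 1, timescale: 30)
    device.unlockForConfiguration()
}
```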
Replies: 0 · Boosts: 0 · Views: 243 · Activity: Sep ’24
Alternative to the crashing MPMediaItemArtwork API
When setting the now-playing info for playing media in MPNowPlayingInfoCenter, we can set artwork. But it seems the Apple API for creating the artwork crashes on iOS 18 (FB15145734). On iOS 17 this gave the warning that the completion handler was not run on the main thread. I've tried to seek help here: https://stackoverflow.com/questions/78989543/swift-data-race-with-appkit-mpmediaitemartwork-function/78990231?noredirect=1#comment139277425_78990231 but it seems that it's not possible to override the completion handler, and therefore it's up to Apple to fix this issue.

```swift
.task {
    await MainActor.run {
        let nowPlayingInfoCenter = MPNowPlayingInfoCenter.default()
        var nowPlayingInfo = [String: Any]()

        let image = NSImage(named: "image")!

        // warning: data race detected: @MainActor function at
        // MPMediaItemArtwork/ContentView.swift:22 was not called on the main thread
        nowPlayingInfo[MPMediaItemPropertyArtwork] = MPMediaItemArtwork(boundsSize: image.size, requestHandler: { _ in
            // Not on main thread here!
            return image
        })
        nowPlayingInfoCenter.nowPlayingInfo = nowPlayingInfo
    }
}
```

I'm wondering if there is an alternative method to set the now-playing artwork?
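A possible direction, offered only as an untested sketch: construct the artwork outside any @MainActor closure, so the request handler is not inferred to be main-actor-isolated. Whether this avoids the iOS 18 crash is an assumption to verify:

```swift
import AppKit
import MediaPlayer

// Untested sketch: a plain function is not @MainActor-isolated, so the request
// handler closure formed here should not trip the main-actor check when
// MPMediaItemArtwork invokes it on a background thread.
func makeArtwork(for image: NSImage) -> MPMediaItemArtwork {
    MPMediaItemArtwork(boundsSize: image.size) { _ in image }
}
```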
Replies: 3 · Boosts: 0 · Views: 354 · Activity: 3w
Why is isReadyForMoreMediaData sometimes false?
Hi, when recording videos with AVAssetWriter, the capture fps (the camera's output fps) is fine, but the final video's fps is lower. The reason is that AVAssetWriterInput.isReadyForMoreMediaData is sometimes false. Yes, I have read the documentation many times; it says to set expectsMediaDataInRealTime to true, and so on. I have been tortured by this problem for a long time. How can I debug it? Any advice?
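For reference, a minimal sketch of the pattern the documentation describes, assuming a real-time capture pipeline: set expectsMediaDataInRealTime before writing starts, and drop (never wait on) frames when the input isn't ready:

```swift
import AVFoundation

// Minimal sketch, assuming a configured AVAssetWriter pipeline. Set
// input.expectsMediaDataInRealTime = true before assetWriter.startWriting().
func append(_ sampleBuffer: CMSampleBuffer, to input: AVAssetWriterInput) {
    guard input.isReadyForMoreMediaData else {
        // Counting these drops helps confirm whether dropped appends actually
        // explain the lower output fps; blocking here would stall capture.
        return
    }
    input.append(sampleBuffer)
}
```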
Replies: 2 · Boosts: 0 · Views: 317 · Activity: Sep ’24
Detect Dolby Atmos programmatically
Hi, I am trying to detect whether an audio stream is Dolby Atmos. I have existing code that determines a stream is Dolby Atmos based on the following:

- Channel count is greater than or equal to 8
- Binaural is true
- Immersive is true
- Downmix is false

I am trying to determine whether these rules are correct, and to find documentation specifying them that I can reference in the future. Any help you can provide is greatly appreciated. Regards, John
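For concreteness, here is the heuristic from the post expressed as code; the four flags are assumed to come from the poster's own stream metadata, since they are not standard AVFoundation properties:

```swift
// The post's heuristic as code. StreamTraits is a hypothetical container for
// metadata the poster already has; it is not an AVFoundation type.
struct StreamTraits {
    var channelCount: Int
    var isBinaural: Bool
    var isImmersive: Bool
    var isDownmix: Bool
}

func looksLikeDolbyAtmos(_ traits: StreamTraits) -> Bool {
    traits.channelCount >= 8
        && traits.isBinaural
        && traits.isImmersive
        && !traits.isDownmix
}
```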
Replies: 1 · Boosts: 0 · Views: 472 · Activity: Sep ’24
Cannot play back downloaded encrypted HLS fMP4 on iPhone
Hi all, we want to play downloaded encrypted HLS fragmented MP4 files on iPhone, and we are using UsingAVFoundationToPlayAndPersistHTTPLiveStreams to test. In this HLSCatalog app, we can play back the encrypted HLS fragmented MP4 stream, but when we download the encrypted HLS fragmented MP4 to the device/iPhone and then try to play it back on the device, we hit an issue. The error is: Optional("The operation couldn’t be completed. (CoreMediaErrorDomain error 1718449215.)") So we want to know how we can play back a downloaded encrypted HLS fMP4 on iPhone. You can try the URL below: http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8

Steps to reproduce:
1. Create an HLS fragmented MP4 with mediafilesegmenter: mediafilesegmenter --iso-fragmented -t 4 --encrypt-key-file=BT709-2D-48FPS.key --encrypt-key-url=http://69.234.244.220:5000/download/BT709-2D-48FPS.key -f prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted HDR10_2D_LEFT_48FPS.mp4
2. Upload to the content server.
3. Download UsingAVFoundationToPlayAndPersistHTTPLiveStreams from https://developer.apple.com/documentation/avfoundation/offline_playback_and_storage/using_avfoundation_to_play_and_persist_http_live_streams
4. In the HLSCatalog app, replace the playlist_url of Item-1 of Streams with http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8
5. In the HLSCatalog app, tap the icon of the Advanced Stream, then tap download. When the download succeeds, try to play: it does NOT play back on iPhone.
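For anyone comparing setups, here is a minimal sketch of the download path the HLSCatalog sample takes (identifiers are illustrative placeholders); one detail worth verifying is that offline playback uses the local URL handed to the delegate's didFinishDownloadingTo callback, not the remote playlist URL:

```swift
import AVFoundation

// Sketch of the sample's download path; the session identifier, title, and
// delegate are placeholders, not values from the HLSCatalog project.
func startDownload(delegate: AVAssetDownloadDelegate) {
    let config = URLSessionConfiguration.background(withIdentifier: "hls-download")
    let session = AVAssetDownloadURLSession(configuration: config,
                                            assetDownloadDelegate: delegate,
                                            delegateQueue: .main)
    let url = URL(string: "http://69.234.244.220/prod/vod/HDR10_2D_LEFT_48FPS_FMP4_Encrypted/prog_index.m3u8")!
    session.makeAssetDownloadTask(asset: AVURLAsset(url: url),
                                  assetTitle: "HDR10_2D_LEFT_48FPS",
                                  assetArtworkData: nil,
                                  options: nil)?.resume()
}
```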
Replies: 1 · Boosts: 0 · Views: 320 · Activity: Sep ’24
Disabling / Detecting iOS Screen Mirroring for Apps: API or Solution?
Hello Apple, I am yet again concerned about the new iOS Screen Mirroring that is going to be available in iOS 18 stable. I have an app that is only meant to be viewed on iPhones (not Macs or computers, due to various security reasons). I have raised a Feedback Assistant report on this and Apple has ignored it. Other apps that might benefit from an API for disabling or detecting iOS Screen Mirroring may be Snapchat, DAZN, Sony, Netflix, Amazon Prime, etc. Are there still plans for an API that can disable this functionality now or in the future? I am sure that a company like Snapchat doesn't want people capturing photos via iOS Screen Mirroring without the app knowing. Thanks.
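As a partial stopgap (a sketch, not the requested disabling API): UIKit already reports when the screen is being recorded, mirrored, or AirPlayed, though whether it also covers the new iOS 18 Screen Mirroring is an assumption to verify:

```swift
import UIKit

// Sketch of the existing partial detection: isCaptured is true while the
// screen is recorded, mirrored, or AirPlayed.
func startObservingCapture() {
    NotificationCenter.default.addObserver(
        forName: UIScreen.capturedDidChangeNotification,
        object: nil,
        queue: .main
    ) { _ in
        if UIScreen.main.isCaptured {
            // Hide or blur sensitive content here.
        }
    }
}
```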
Replies: 1 · Boosts: 1 · Views: 399 · Activity: Sep ’24
CIFilter chain failing to render parts of output
I’ve built an iOS camera app that applies many CIFilters to an image captured by the camera. Some of my users have reported that on occasion the images have large parts that are blank; see below: Frustratingly, I can’t reproduce this myself! Does anyone know what could be causing it? Is it a memory issue? I haven’t posted the code, as there’s a lot to look over and I’m not sure it would help diagnose it. Thanks for any pointers.
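Without the code this is only a guess, but one common mitigation worth trying is sharing a single CIContext instead of creating one per render, since context churn under memory pressure is a known way to get partially blank output; a minimal sketch:

```swift
import CoreImage

// Sketch: one long-lived context for all renders. Creating a CIContext per
// frame is expensive and, under memory pressure, can make renders fail.
let sharedCIContext = CIContext(options: [.cacheIntermediates: false])

func render(_ image: CIImage) -> CGImage? {
    // createCGImage returns nil on failure; logging that distinguishes a
    // failed render from bad filter output.
    sharedCIContext.createCGImage(image, from: image.extent)
}
```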
Replies: 1 · Boosts: 0 · Views: 398 · Activity: Sep ’24
How can I use the F8 play/pause key to control media playback in Catalyst?
I have a Catalyst app that plays audio via AVQueuePlayer, and I'd like to use the system play/pause key (F8 on my MacBook Pro keyboard) to play and pause it. It doesn't seem to work automatically, and if I hook up a UIKeyCommand using UIKeyInputF8, it works with Fn-F8, not F8 on its own. It does seem to work in Overcast's Mac app, but I think that's an iPad app for Mac, not Catalyst, so it's probably going through whatever system pathway that the Lock Screen controls would be using on iOS. How do I make this work on Catalyst?
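It may be worth testing (a sketch, under the assumption that Catalyst routes media keys the way iOS does) registering with MPRemoteCommandCenter, which is the pathway the Lock Screen controls use:

```swift
import AVFoundation
import MediaPlayer

// Sketch: register for the system play/pause command. The hardware media key
// is delivered through MPRemoteCommandCenter once the app is the now-playing
// app, not as a key event.
func setUpRemoteCommands(for player: AVQueuePlayer) {
    let center = MPRemoteCommandCenter.shared()
    _ = center.togglePlayPauseCommand.addTarget { _ in
        if player.rate == 0 { player.play() } else { player.pause() }
        return .success
    }
}
```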
Replies: 1 · Boosts: 0 · Views: 412 · Activity: Aug ’24
INPlayMediaIntent `mediaSearch` mediaName unreliable when searching for playlists
We are working with an app that uses the INPlayMediaIntent to allow users to select and play music using Siri. In building out this feature, we have noticed that when selecting playlists to play, Siri will consistently leave out information from the intent that we use to resolve the media to play in the app. There is generally no rhyme or reason as to why some information is left out. Walking through a couple of test cases, here is the phrase and the corresponding mediaSearch that we receive when testing:

"Hey Siri, play the playlist happy songs in the app " (this is a working example)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050780> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = happy songs;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}
```

"Hey Siri, play the playlist my favorites in the app " (this fails with a null mediaName)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050600> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = <null>;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}
```

"Hey Siri, play the playlist working out playlist in the app " (this fails as the term "playlist" is excluded)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x114050ae0> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = working out;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}
```

"Hey Siri, play the playlist recently added in the app " (this fails with a null mediaName)

```
▿ Optional<INMediaSearch>
  - some : <INMediaSearch: 0x1140507e0> {
    reference = 0;
    mediaType = 5;
    sortOrder = 0;
    albumName = <null>;
    mediaName = <null>;
    genreNames = ( );
    artistName = <null>;
    moodNames = ( );
    releaseDate = <null>;
    mediaIdentifier = <null>;
}
```

Based on the above, Siri seems to ignore playlists named "Recently Added" and "My Favorites", as well as playlists that have the word "playlist" in them, such as "Working Out Playlist". To rectify this, we attempted to set the INVocabulary for the playlist titles that a user has in the app, as suggested in this WWDC session: https://developer.apple.com/videos/play/wwdc2020/10060/

```swift
let vocabulary = INVocabulary.shared()
vocabulary.setVocabularyStrings(NSOrderedSet(array: [
    "my favorites",
    "recently added",
    "working out playlist"
]), of: .mediaPlaylistTitle)
```

This seems to have no effect. We understand the note in https://developer.apple.com/documentation/sirikit/registering_custom_vocabulary_with_sirikit/ stating that "a few minutes" should be waited before testing custom vocabulary, but waiting upwards of 20 minutes and even restarting the device did not result in any of the custom vocabulary making a difference. If these playlist names are set in AppIntentVocabulary.plist, "Recently Added" and "My Favorites" can be discovered as playlists, but the other failed test cases remain failing. The obvious shortcoming here is that these are not dynamic.

```xml
<key>ParameterVocabularies</key>
<array>
    <dict>
        <key>ParameterNames</key>
        <array>
            <string>INPlayMediaIntent.playlistTitle</string>
        </array>
        <key>ParameterVocabulary</key>
        <array>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>working out playlist</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>working out playlist</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>recently added</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>recently added</string>
                    </dict>
                </array>
            </dict>
            <dict>
                <key>VocabularyItemIdentifier</key>
                <string>my favorites</string>
                <key>VocabularyItemSynonyms</key>
                <array>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favourites</string>
                    </dict>
                    <dict>
                        <key>VocabularyItemPhrase</key>
                        <string>my favorites</string>
                    </dict>
                </array>
            </dict>
        </array>
    </dict>
</array>
```

Given the above, our questions are as follows:

1. Is there documentation surrounding how Siri may pass along the mediaSearch in INPlayMediaIntent, and how/why information may be left out?
2. Why does setting custom vocabulary with INVocabulary seem to have no effect, yet the same vocabulary in AppIntentVocabulary.plist does have an effect?
3. Is the functionality we are experiencing to be expected, or should this be reported as a bug?

We've published the test app that we are using for debugging this functionality at this link: https://github.com/awojnowski/SiriTest
Replies: 3 · Boosts: 0 · Views: 434 · Activity: Aug ’24
Can't share Video to Facebook
I have Facebook SDK version 17.0.2 and Xcode 15. Sharing photos and links works fine, but when I try sharing videos, I get the following error:

```
Failed to log access with error: access=<PATCCAccess 0x301d12b20> accessor:<<PAApplication 0x301d27e30 identifierType:auditToken identifier:{pid:18440, version:47210}>> identifier:A9159DCD-76B1-4C77-A01E-DA611929B50B kind:intervalEvent timestampAdjustment:0 visibilityState:0 assetIdentifierCount:0 accessCount:0 tccService:kTCCServicePhotos, error=Error Domain=NSCocoaErrorDomain Code=4097 "connection to service with pid 15679 named com.apple.privacyaccountingd" UserInfo={NSDebugDescription=connection to service with pid 15679 named com.apple.privacyaccountingd}
```
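The kTCCServicePhotos reference in that log suggests (this is an assumption, not a confirmed diagnosis) that the failure may be Photos-library access rather than the Facebook SDK itself; a quick pre-check before invoking the share dialog:

```swift
import Photos

// Hypothetical pre-check, assuming the failure is Photos access: confirm
// authorization before handing the video asset to the share flow.
func ensurePhotoAccess(_ completion: @escaping (Bool) -> Void) {
    PHPhotoLibrary.requestAuthorization(for: .readWrite) { status in
        completion(status == .authorized || status == .limited)
    }
}
```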
Replies: 1 · Boosts: 0 · Views: 361 · Activity: Aug ’24
Camera issues on iPhone 14 Pro Max after iOS 18 beta 3
Four days after installing iOS 18 beta 3, my iPhone no longer has macro control, 0.5x zoom, and other Pro-model features. I’ve tried restarting my phone, but nothing changed. Settings keeps saying that the phone has an unknown part in the camera or that it’s not genuine. I bought it brand new from T-Mobile last year. I need help determining whether this is just a beta issue or actual physical damage.
Replies: 3 · Boosts: 2 · Views: 1.2k · Activity: Jul ’24
PhotoAsset in TagView
I'm trying to recreate the Tag People functionality from Instagram, where a carousel of media the user has selected is displayed to them, and they can then go through and tag people in the media. I'm trying to achieve this (but with food items instead of people) with a TabView using PHAssets; however, the result is some funky behaviour that I'm pulling my hair out trying to understand. The items are tagged correctly, but the scroll feature on the TabView works sporadically. It occasionally scrolls fine, but all of a sudden won't let me scroll past one image. (See attached video for example.)

```swift
import SwiftUI
import Photos

struct TagItemView: View {
    @Binding var selectedAssets: [PHAsset]
    @State private var showTagSheet = false
    @State private var currentAsset: PHAsset? {
        didSet {
            if let currentAsset = currentAsset {
                assetTags = tags[currentAsset.localIdentifier] ?? []
            }
        }
    }
    @State private var tags: [String: [String]] = [:] // Dictionary to store tags for each media item
    @State private var assetTags: [String] = [] // Tags for the current asset

    var body: some View {
        VStack {
            mediaCarousel
            tagsView
            Spacer()
        }
        .background(Color.black.ignoresSafeArea())
        .onAppear {
            if let firstAsset = selectedAssets.first {
                currentAsset = firstAsset
            }
        }
        .onChange(of: currentAsset) { newAsset in
            if let currentAsset = newAsset {
                assetTags = tags[currentAsset.localIdentifier] ?? []
                print("currentAsset changed: \(currentAsset.localIdentifier)")
                print("assetTags: \(assetTags)")
            }
        }
        .sheet(isPresented: $showTagSheet) {
            TagSheetView(selectedAsset: $currentAsset, tags: $tags, showTagSheet: $showTagSheet, assetTags: $assetTags)
        }
    }

    private var mediaCarousel: some View {
        VStack {
            TabView(selection: $currentAsset) {
                ForEach(selectedAssets, id: \.self) { asset in
                    if asset.mediaType == .image {
                        TagItemImageView(asset: asset)
                            .tag(asset.localIdentifier)
                            .onAppear {
                                currentAsset = asset
                                print("Asset in view (onAppear): \(asset.localIdentifier)")
                            }
                            .onTapGesture {
                                currentAsset = asset
                                showTagSheet = true
                            }
                    } else if asset.mediaType == .video {
                        TagItemVideoView(asset: asset)
                            .tag(asset.localIdentifier)
                            .onAppear {
                                currentAsset = asset
                                print("Asset in view (onAppear): \(asset.localIdentifier)")
                            }
                            .onTapGesture {
                                currentAsset = asset
                                showTagSheet = true
                            }
                    }
                }
            }
            .tabViewStyle(PageTabViewStyle(indexDisplayMode: .always))
            .frame(height: UIScreen.main.bounds.height * 0.4) // Fixed height for carousel
        }
    }

    private var tagsView: some View {
        ScrollView {
            if !assetTags.isEmpty {
                ItemView(assetTags: assetTags, removeTag: { tag in
                    removeTag(tag, from: currentAsset!)
                })
                .transition(.opacity)
            } else {
                InstructionsView()
                    .transition(.opacity)
            }
        }
        .background(Color.black)
        .padding(.top, 8)
        .padding(.horizontal, 15)
    }

    private func removeTag(_ tag: String, from asset: PHAsset) {
        guard var assetTags = tags[asset.localIdentifier] else { return }
        assetTags.removeAll { $0 == tag }
        tags[asset.localIdentifier] = assetTags
        if currentAsset?.localIdentifier == asset.localIdentifier {
            self.assetTags = assetTags
        }
    }
}
```
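One possible culprit in the code above (an observation, not a verified fix): the TabView's selection binding is a PHAsset?, while each page is tagged with asset.localIdentifier, a String. Because the tag type never matches the selection type, paging and selection can fall out of sync, which would look exactly like sporadic scrolling. A minimal sketch of aligning them, where pageView(for:) is a hypothetical stand-in for the image/video branches:

```swift
// Sketch: tag pages with the same type as the selection binding (PHAsset?)
// so TabView's paging and the selection stay in sync.
TabView(selection: $currentAsset) {
    ForEach(selectedAssets, id: \.self) { asset in
        pageView(for: asset)
            .tag(asset as PHAsset?) // matches the PHAsset? selection binding
    }
}
```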
Replies: 0 · Boosts: 0 · Views: 298 · Activity: Jul ’24
Multiview HLS with HDR
I have an HDR10+ encoded video that, if loaded as a .mov, plays back on the Apple Vision Pro, but when that video is encoded using the latest (1.23b) Apple HLS tools to generate an fMP4, the resulting m3u8 cannot be played back on the Apple Vision Pro; I only get a "Cannot Open" error. To generate the m3u8, I'm just calling mediafilesegmenter (with -iso-fragmented) and then variantplaylistcreator. This completes with no errors, and the m3u8 plays back on the Mac using VLC, but not on the Apple Vision Pro. The relevant part of the m3u8 is:

```
#EXT-X-STREAM-INF:AVERAGE-BANDWIDTH=40022507,BANDWIDTH=48883974,VIDEO-RANGE=PQ,CODECS="ec-3,hvc1.1.60000000.L180.B0",RESOLUTION=4096x4096,FRAME-RATE=24.000,CLOSED-CAPTIONS=NONE,AUDIO="audio1",REQ-VIDEO-LAYOUT="CH-STEREO"
{{url}}
```

Has anyone been able to use the HLS tools to generate fMP4s of MV-HEVC videos with HDR10?
Replies: 2 · Boosts: 2 · Views: 624 · Activity: Oct ’24
Location not visible in video recorded in third-party app
I recently bought an Insta360 Flow gimbal. When recording video with the Insta360 app, I cannot see the location in the Apple Photos app or any other Apple apps. However, I can see the location in the Windows Photos app once I've downloaded the videos to my Windows PC. The location is also visible in an Android app once I share it through my Google account. With an EXIF app, I can see the location metadata in the EXIF table as well, but again it is not shown as a location. exiftool on my PC can also see the metadata, including the location, as in the attached screenshot. Compared to video shot with the built-in Camera app, I cannot find any difference in terms of location metadata. What could be wrong? I contacted Insta360 app support; they do not seem to understand what's going on, just asking very simple questions again and again, like: did you enable GPS location access, are you shooting video? I also contacted Apple support; they just say it's a third-party issue and refuse to help further. If it's really a third-party issue, how come the location data is actually embedded as metadata, and a Windows PC and an Android device can see the location? BTW, I AirDropped this video to all my Apple devices, like an iPhone 15 Ultra, an iPad Air, and a very old iPhone; none of them can see the location.
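A diagnostic that may help narrow this down (it assumes Photos reads the QuickTime ISO 6709 location item, which is where the built-in Camera app writes it): dump that specific metadata identifier from both files and compare:

```swift
import AVFoundation

// Diagnostic sketch: check whether the third-party file carries the location
// under the QuickTime ISO 6709 identifier, as Camera-app videos do.
func printLocation(of url: URL) async throws {
    let asset = AVURLAsset(url: url)
    let metadata = try await asset.load(.metadata)
    let items = AVMetadataItem.metadataItems(
        from: metadata,
        filteredByIdentifier: .quickTimeMetadataLocationISO6709)
    for item in items {
        let value = try await item.load(.stringValue)
        print("ISO 6709 location:", value ?? "nil")
    }
}
```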
Replies: 1 · Boosts: 0 · Views: 588 · Activity: Jul ’24
What is the proper way to handle videos in SwiftData?
I'm creating an application with SwiftUI which gets images and videos from the Photos picker, then stores them with SwiftData for later use. I save both images and videos as Data with @Attribute(.externalStorage). But it seems wrong to me to store the videos that way; they can be several gigabytes in size. What is the correct way to handle something like this? Is it to store the URL, and then each time the user wants to see a video, save a temporary video? If that's the case, can anyone show me how this should be done? Any comments appreciated. Guillermo
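One common pattern, offered as a sketch rather than the one correct answer: copy each picked video into the app's Application Support directory and persist only its file name, so the SwiftData store stays small and no multi-gigabyte blobs live in the database; the model and helper names here are illustrative:

```swift
import Foundation
import SwiftData

// Hypothetical model: persist only the file name and rebuild the URL on
// demand, since absolute container paths can change between launches.
@Model
final class VideoItem {
    var fileName: String
    init(fileName: String) { self.fileName = fileName }

    var url: URL {
        URL.applicationSupportDirectory.appending(path: fileName)
    }
}

// Copy the picked video out of the picker's temporary location into a
// directory the app owns, then record just the file name.
func importVideo(from pickedURL: URL) throws -> VideoItem {
    let dir = URL.applicationSupportDirectory
    try FileManager.default.createDirectory(at: dir, withIntermediateDirectories: true)
    let name = UUID().uuidString + ".mov"
    try FileManager.default.copyItem(at: pickedURL, to: dir.appending(path: name))
    return VideoItem(fileName: name)
}
```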
Replies: 0 · Boosts: 0 · Views: 391 · Activity: Jun ’24