Posts

Post not yet marked as solved
0 Replies
1k Views
The Device Id has always been different for devices with different hardware specs; unfortunately, this is not the case for the latest release, Apple TV 4K (3rd generation). Is there a way to tell the two versions of this new Apple TV (with and without Ethernet) apart? The issue is that they both report the same Device Id (AppleTV14,1); the difference between them is the Model Id (A2737 vs A2843). Any idea how to get this Model Identifier? More info: https://en.wikipedia.org/wiki/Apple_TV#Technical_specifications
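For context, a minimal sketch of how the Device Id can be read via sysctl; as noted above, hw.machine reports the same value (AppleTV14,1) on both variants, so it does not answer the A2737 vs A2843 question:

```swift
import Foundation

// Reads the device identifier (e.g. "AppleTV14,1") via sysctl.
// Both 3rd-generation Apple TV 4K variants return the same value here,
// which is exactly the limitation described above.
func deviceIdentifier() -> String {
    var size = 0
    sysctlbyname("hw.machine", nil, &size, nil, 0)
    var machine = [CChar](repeating: 0, count: size)
    sysctlbyname("hw.machine", &machine, &size, nil, 0)
    return String(cString: machine)
}
```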
Posted by dcordero.
Post not yet marked as solved
2 Replies
834 Views
In order to provide our users with more information about the different audio tracks available in our streams, I am trying to identify Dolby audio tracks from the AVAssetMediaSelectionGroup that I get from the asset. To do that, I am trying to get the audio codec of each AVMediaSelectionOption; from the codec, I could then label the track as Dolby, Stereo, etc. Sadly, I did not find this information available in AVMediaSelectionOption. Is there some way to get this data? For the log, this is what I am currently able to get from each media selection:

```
(lldb) po playerItem.asset.mediaSelectionGroup(forMediaCharacteristic: .audible)
▿ Optional<AVMediaSelectionGroup>
  - some : <AVAssetMediaSelectionGroup: 0x28306fa00, options = (
    "<AVMediaSelectionKeyValueOption: 0x28306d480, language = en, mediaType = 'soun', title = English, default = YES>",
    "<AVMediaSelectionKeyValueOption: 0x28306d380, language = nar, mediaType = 'soun', title = Other (nar)>"
), allowsEmptySelection = YES>
```
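One hedged workaround sketch (not a documented mapping from a selection option to its codec): after an option is selected and playback has started, the active audio format can sometimes be read from the player item's tracks. For HLS content, assetTrack is often nil, so treat this as best-effort only:

```swift
import AVFoundation
import CoreMedia

// Best-effort: returns the four-character codec codes of the audio tracks
// currently active in the player item (e.g. "ec-3" for Dolby Digital Plus).
func activeAudioCodecs(of item: AVPlayerItem) -> [String] {
    item.tracks
        .compactMap { $0.assetTrack }
        .filter { $0.mediaType == .audio }
        .flatMap { $0.formatDescriptions as! [CMFormatDescription] }
        .map { fourCharCode(CMFormatDescriptionGetMediaSubType($0)) }
}

// Converts a FourCharCode like 'ec-3' to a readable String.
private func fourCharCode(_ code: FourCharCode) -> String {
    let bytes = [24, 16, 8, 0].map { UInt8((code >> $0) & 0xFF) }
    return String(bytes: bytes, encoding: .ascii) ?? "????"
}
```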
Posted by dcordero.
Post not yet marked as solved
1 Reply
981 Views
I am trying to implement the new iPod wheel style scrubbing gesture, recently added to AVPlayerViewController, in my custom player UI built on top of AVPlayer. I cannot find any new API that reports this kind of gesture from the remote control. Is there some API that I am missing that actually reports these gestures? Otherwise, how would you recommend implementing something like that? Should I use the GameController framework to get the actual position of the user's finger?
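In case it helps frame the question: a minimal sketch of the GameController approach mentioned above, reading the absolute finger position on the Siri Remote touch surface. The angle computation is only an illustration of how a wheel-style recognizer might be fed:

```swift
import Foundation
import GameController

// Reads the absolute finger position on the Siri Remote touch surface.
// reportsAbsoluteDpadValues must be enabled; otherwise the values are
// relative to where the finger first touched.
func startTrackingTouchSurface() {
    guard let microGamepad = GCController.controllers().first?.microGamepad else { return }
    microGamepad.reportsAbsoluteDpadValues = true
    microGamepad.dpad.valueChangedHandler = { _, x, y in
        // x and y are in [-1, 1]; their angle could drive a jog-wheel gesture.
        let angle = atan2(Double(y), Double(x))
        print("Touch at angle \(angle)")
    }
}
```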
Posted by dcordero.
Post not yet marked as solved
0 Replies
1.7k Views
In order to measure the quality of service of video playback in my App, I need to measure the initial buffer size of AVPlayer when the content actually starts playing. To do so, I am adding a one-time periodic time observer on the first millisecond of playback and inspecting from there the segmentsDownloadedDuration property of the last AVPlayerItemAccessLogEvent. Sadly, the values that I get with this approach don't match the values that I see server-side: if I multiply the number of segments initially requested by the segment duration, the result differs by up to 6 seconds from what my client App reports. Is there any better approach for getting this data? Is there something we might be missing or doing wrong? You can find below an example of the client approach:

```swift
import UIKit
import AVFoundation

class PlaygroundViewController: UIViewController {

    override func viewDidLoad() {
        super.viewDidLoad()

        let videoURL = URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8")!

        // Setup Player Item
        let asset = AVAsset(url: videoURL)
        playerItem = AVPlayerItem(asset: asset)

        // Setup AVPlayer
        player = AVPlayer(playerItem: playerItem)

        // Setup Player Layer
        playerLayer = AVPlayerLayer(player: player)
        playerLayer.frame = view.bounds
        view.layer.addSublayer(playerLayer)

        // Setup one-time observer
        var contentStartedObserver: Any?
        contentStartedObserver = player?.addPeriodicTimeObserver(forInterval: CMTimeMake(value: 1, timescale: 1000), queue: DispatchQueue.main) { [weak self] _ in
            print("Content Started Playing")

            guard let accessLog = self?.playerItem.accessLog() else { return }
            guard let lastEvent = accessLog.events.last else { return }

            print(">> Buffered Duration:")
            print(lastEvent.segmentsDownloadedDuration)

            // Ok, we are done. Remove the observer
            if let observer = contentStartedObserver {
                self?.player?.removeTimeObserver(observer)
                contentStartedObserver = nil
            }
        }

        // Play
        player.play()
    }

    // MARK: - Private

    private var playerLayer: AVPlayerLayer!
    private var player: AVPlayer!
    private var playerItem: AVPlayerItem!
}
```
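A hedged alternative sketch for the measurement itself: listening for access-log updates instead of using a periodic time observer, and summing segmentsDownloadedDuration across all events rather than reading only the last one. Whether this lines up better with the server-side numbers is an assumption to verify:

```swift
import AVFoundation

// Observes new access log entries for a player item and prints the total
// downloaded duration accumulated across all log events so far.
func observeAccessLog(for item: AVPlayerItem) -> NSObjectProtocol {
    NotificationCenter.default.addObserver(
        forName: .AVPlayerItemNewAccessLogEntry,
        object: item,
        queue: .main
    ) { notification in
        guard let item = notification.object as? AVPlayerItem,
              let log = item.accessLog() else { return }
        let total = log.events.map(\.segmentsDownloadedDuration).reduce(0, +)
        print(">> Total downloaded duration across events: \(total)")
    }
}
```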
Posted by dcordero.
Post not yet marked as solved
2 Replies
1.8k Views
We are facing an issue with the audio tracks displayed by AVPlayerViewController in our tvOS client. When we have a stream with alternative audio tracks, we make sure to set the language and the name in the manifest of the HLS stream as described by https://tools.ietf.org/html/rfc8216, e.g.:

NAME="French",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="fr"

This works for common audio tracks with different languages: the audio track is displayed correctly by AVPlayerViewController, showing the name of the language localized based on the user's device language preference. But as soon as we offer more special audio tracks, like "Miscellaneous", our audio track is defined like:

NAME="Other (mis)",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="mis"

When it comes to displaying these audio tracks to the customer, they appear as "mis". This cryptic audio track name is the issue. Other "special" audio tracks with the same problem are:

- mis, for "uncoded languages": languages that have no code yet assigned
- mul, for "multiple languages": applied when many languages are used and it is not practical to specify all the appropriate language codes
- nar, for "narrated audio stream": audio descriptions with narration describing a video clip
- qaa-qtz, a range reserved for local use: original language, not specified (OV)
- und, for "undetermined": for situations in which a language must be indicated but cannot be identified
- zxx, for "no linguistic content; not applicable": applied when a language identifier is required by system definition, but the item being described does not actually contain linguistic content

Is there a way to display custom display names for the audio tracks in the AVPlayerViewController audio selector?

Use case

Our use case is an App which plays live TV streams. The signal comes from the broadcasters and we have no influence on it, so we cannot avoid any special language code that they decide to use to describe their audio content. At the same time, we want to make it as user friendly as possible and display a human readable name to our users for every language code in a consistent way. That includes the unknown language codes, but also displaying known language codes with consistent naming throughout the whole platform.

Observed behaviour

The UI for the audio track selector in AVPlayerViewController takes into account the LANGUAGE attribute from HLS. We observed that when LANGUAGE contains a language code (e.g. en-US), it is translated to the appropriate localized string based on the user's device language preference. On the contrary, when it contains non-language-code values, including the special codes that are part of ISO 639-3 such as "mis", "mul", etc., they are not localized and are displayed directly as strings. That means that, given the following entry in the manifest, the audio selector will show the name "Hello World":

NAME="French",DEFAULT=NO,AUTOSELECT=YES,LANGUAGE="Hello World"
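A small sketch illustrating what we suspect is going on: Foundation's Locale resolves common language codes to localized display names, but may return nil for the special ISO 639-3 collective codes, which would explain the raw "mis" in the selector. The per-code behaviour across OS versions is an assumption to verify:

```swift
import Foundation

// Prints the localized display name (if any) for a mix of common and
// special language codes.
let codes = ["fr", "en", "mis", "mul", "nar", "und", "zxx"]
for code in codes {
    let name = Locale.current.localizedString(forLanguageCode: code) ?? "(no localized name)"
    print("\(code) -> \(name)")
}
```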
Posted by dcordero.
Post not yet marked as solved
1 Reply
835 Views
Due to legal restrictions, I need to prevent the users of my App from seeking forward in the content playing in AVPlayerViewController. The only way I have found to do this is setting the property requiresLinearPlayback to true on my instance of AVPlayerViewController. Sadly, apart from blocking seeking forward, this property also blocks seeking backwards. Is there a way to limit seeking forward in AVPlayerViewController while still allowing seeking backwards?
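One hedged direction that might be worth trying on tvOS (a sketch, not a confirmed solution for this case): AVPlayerViewControllerDelegate can clamp the destination of a user-initiated seek, so forward jumps could be rejected while backward jumps pass through:

```swift
import AVKit
import CoreMedia

// Clamps user navigation: backward seeks are allowed, forward seeks are
// redirected back to the time the user started from.
final class SeekLimitingDelegate: NSObject, AVPlayerViewControllerDelegate {
    func playerViewController(_ playerViewController: AVPlayerViewController,
                              timeToSeekAfterUserNavigatedFrom oldTime: CMTime,
                              to targetTime: CMTime) -> CMTime {
        return targetTime <= oldTime ? targetTime : oldTime
    }
}
```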
Posted by dcordero.
Post not yet marked as solved
1 Reply
1.1k Views
Due to legal restrictions, I need to prevent the users of my App from skipping through the content currently playing in AVPlayerViewController. To do so, I am setting the properties isSkipBackwardEnabled and isSkipForwardEnabled to false. But this does not seem to have any effect on the skipping behaviour; I can still click left or right on the Siri Remote to seek 10 seconds back or forward. Is there a way to prevent skipping content in AVPlayerViewController? You can find below example code reproducing the issue:

```swift
import UIKit
import AVFoundation
import AVKit

class ViewController: AVPlayerViewController {

    override func viewDidAppear(_ animated: Bool) {
        super.viewDidAppear(animated)

        skippingBehavior = .default
        isSkipBackwardEnabled = false
        isSkipForwardEnabled = false

        play(stream: URL(string: "https://devstreaming-cdn.apple.com/videos/streaming/examples/img_bipbop_adv_example_ts/master.m3u8")!)
    }

    // MARK: - Private

    private func play(stream: URL) {
        let asset = AVAsset(url: stream)
        let playerItem = AVPlayerItem(asset: asset)
        player = AVPlayer(playerItem: playerItem)
        player?.play()
    }
}
```
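A hedged workaround sketch we have not verified: attaching a gesture recognizer for the left/right arrow press types to the controller's view, in the hope of swallowing the presses before the transport UI handles them. It is an assumption that this takes precedence over AVPlayerViewController's built-in handling:

```swift
import UIKit
import AVKit

// Attempts to consume left/right arrow presses before the player UI sees
// them. No action is attached, so recognized presses simply do nothing.
func blockArrowPresses(on playerViewController: AVPlayerViewController) {
    let recognizer = UITapGestureRecognizer()
    recognizer.allowedPressTypes = [
        NSNumber(value: UIPress.PressType.leftArrow.rawValue),
        NSNumber(value: UIPress.PressType.rightArrow.rawValue)
    ]
    playerViewController.view.addGestureRecognizer(recognizer)
}
```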
Posted by dcordero.
Post not yet marked as solved
2 Replies
1k Views
I have an App in which I am using NSPersistentCloudKitContainer to store users' drawings and keep them in sync across multiple devices. To store the drawings, I am using a "Binary Data" attribute in my Core Data entity, in which I save the data representation of the drawing obtained via this API: https://developer.apple.com/documentation/pencilkit/pkdrawing/3281878-datarepresentation I have observed that for some drawings this data representation is huge, which causes synchronisation problems between devices. Is there a better approach to persisting PKDrawings that avoids this problem?
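Two directions that might mitigate the record size, sketched under the assumption that the raw dataRepresentation is what inflates the CloudKit payload: compress the blob before storing it, and/or enable "Allows External Storage" on the Binary Data attribute so large values are stored outside the SQLite file (and as CKAssets when synced):

```swift
import PencilKit

// Compresses a drawing's data representation before persisting it in the
// Core Data Binary Data attribute.
func compressedData(for drawing: PKDrawing) throws -> Data {
    let raw = drawing.dataRepresentation()
    return try (raw as NSData).compressed(using: .lzfse) as Data
}

// Restores a drawing from the compressed blob.
func drawing(fromCompressed data: Data) throws -> PKDrawing {
    let raw = try (data as NSData).decompressed(using: .lzfse) as Data
    return try PKDrawing(data: raw)
}
```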
Posted by dcordero.
Post marked as solved
5 Replies
1.3k Views
In my team we are using breakpoints to improve our development cycle. For example, we have shared breakpoints which autofill the credentials form; that way we avoid always having to use the Siri Remote to type usernames and passwords, without any change in our code. Sadly, the name that Xcode gives to breakpoints in the Breakpoint Navigator is not very meaningful for this type of use case: Xcode names a breakpoint after its function and line number, which makes it very hard to find the breakpoint that you actually want to activate. In the use case above, for example, having multiple shared breakpoints with different credentials to autofill means they all end up with the same name. Since Xcode 11.4.1 we have seen that there is a new "Name" field in the breakpoint editor, but providing a name does not make the breakpoint appear with that label in Xcode. Is there a way to name shared breakpoints using this new "Name" field, or any other method?
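For context, the kind of shared breakpoint action we use: an lldb "Debugger Command" attached to a breakpoint in the login screen, set to automatically continue after evaluating. The field names here are hypothetical, purely for illustration:

```
expr usernameTextField.text = "tester@example.com"
expr passwordTextField.text = "secret"
```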
Posted by dcordero.
Post not yet marked as solved
1 Reply
1.2k Views
I am implementing a search controller for tvOS. I have observed that the keyboard collapses automatically when the focus moves from the keyboard to the results if there is a UIScrollView in the results controller (because they are also scroll views, that includes having a UICollectionView or UITableView in the results). This is the behaviour even when the search results fit in the available space under the keyboard, which results in a bad UX: a lot of empty space and no direct access to the keyboard. On the other hand, when the results are not scrollable, the keyboard does not collapse. Is there a way to avoid this automatic behaviour and keep the keyboard always visible? A project reproducing this behaviour can be found here: https://github.com/dcordero/tvos-keyboard
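For reference, a minimal sketch of the setup where we see this (class and function names are illustrative): a UISearchController whose results controller is a UICollectionViewController, wrapped in a UISearchContainerViewController:

```swift
import UIKit

// The results controller contains a UICollectionView, which is the case
// where the keyboard collapses as soon as focus moves to the results.
final class ResultsViewController: UICollectionViewController {
    // Empty results, just enough to make the sketch self-contained.
    override func collectionView(_ collectionView: UICollectionView,
                                 numberOfItemsInSection section: Int) -> Int { 0 }
}

func makeSearchViewController() -> UIViewController {
    let results = ResultsViewController(collectionViewLayout: UICollectionViewFlowLayout())
    let searchController = UISearchController(searchResultsController: results)
    return UISearchContainerViewController(searchController: searchController)
}
```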
Posted by dcordero.
Post not yet marked as solved
0 Replies
515 Views
I am trying to create a UI test for my tvOS App that requires a swipe gesture to present an overlay menu. Even though users can easily trigger that gesture with the Siri Remote, I am not able to replicate the behaviour in my tests using XCUIRemote, which seems to be limited to pressing up, down, left, right, select, menu, play/pause and home. Is there an alternative approach to trigger swipe gestures in UI tests?
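One hedged alternative (a test-structure workaround rather than a real swipe): have the app expose a test-only hook that presents the overlay when launched with a specific argument, and assert on the result. Both "--show-overlay-menu" and the "overlay-menu" accessibility identifier are hypothetical names the app would need to implement:

```swift
import XCTest

final class OverlayMenuUITests: XCTestCase {
    func testOverlayMenuAppears() {
        let app = XCUIApplication()
        // Hypothetical test-only launch argument handled by the app.
        app.launchArguments.append("--show-overlay-menu")
        app.launch()
        // "overlay-menu" is a hypothetical accessibility identifier.
        XCTAssertTrue(app.otherElements["overlay-menu"].waitForExistence(timeout: 5))
    }
}
```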
Posted by dcordero.