
AudioConverterFillComplexBuffer not working for (E)AC3 in tvOS 18
Since upgrading to tvOS 18, AudioConverterFillComplexBuffer isn't working for me when converting a stream in these formats; it does still work for decoding AAC. https://developer.apple.com/documentation/audiotoolbox/1503098-audioconverterfillcomplexbuffer?language=objc

I pass a valid ioOutputDataPacketSize in, but it always comes out as zero. Has anyone else observed this? I wonder whether it is related to the widely discussed issue of 5.1 sound being broken for many people after upgrading to tvOS 18: https://discussions.apple.com/thread/255769102?login=true&sortBy=rank

EDIT: Further information: the callback gets called once, asking for 1 packet (which is fine). I give it one packet and return noErr, but after that the callback is never invoked again. Must be a bug?

EDIT2: The same code continues to work correctly on macOS when decoding the same audio stream.
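For reference, a minimal Swift sketch of the call shape being described (this is not the original code; the converter, buffer list, and the 1536-frame packet size are placeholder assumptions):

```swift
import AudioToolbox

// The input proc must be a non-capturing C function pointer. A real implementation
// would pull one compressed (E)AC-3 packet out of the context passed via inUserData,
// point ioData.pointee.mBuffers at it, and set *ioNumberDataPackets to 1.
let inputProc: AudioConverterComplexInputDataProc = { _, ioNumberDataPackets, _, _, _ in
    // Placeholder: report "no data available" so the sketch stays self-contained.
    ioNumberDataPackets.pointee = 0
    return noErr
}

func pullDecodedAudio(from converter: AudioConverterRef,
                      into outputBufferList: UnsafeMutablePointer<AudioBufferList>) -> OSStatus {
    // On input: how many output packets (PCM frames, when decoding to LPCM) we want.
    // 1536 is the frame count of one AC-3 sync frame, used here purely as an example.
    var ioOutputDataPacketSize: UInt32 = 1536
    let status = AudioConverterFillComplexBuffer(converter,
                                                 inputProc,
                                                 nil,                    // inUserData context
                                                 &ioOutputDataPacketSize,
                                                 outputBufferList,
                                                 nil)
    // The reported symptom: on tvOS 18 this value comes back as 0 for (E)AC-3
    // and the input proc is never called a second time; on macOS it works.
    print("status: \(status), packets produced: \(ioOutputDataPacketSize)")
    return status
}
```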
4 replies · 2 boosts · 306 views · Oct ’24
VideoToolbox AV1 decoding on Apple platforms
As some have clocked, kCMVideoCodecType_AV1 was added in a recent SDK release. Does anyone know if and when AV1 decode support, even if software-only, is going to be available on Apple platforms? At the moment one must decode using dav1d (which is pretty performant, to be fair), but is at least software AV1 support expected on existing hardware any time soon?
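For anyone wanting to probe a given device at runtime, a minimal sketch (assuming VTIsHardwareDecodeSupported is available in your target SDK; it has historically been exposed on macOS before the other platforms):

```swift
import CoreMedia
import VideoToolbox

// Ask VideoToolbox whether a hardware AV1 decoder exists on this machine.
// A `false` answer does not by itself rule out a future software decoder.
let hasHardwareAV1 = VTIsHardwareDecodeSupported(kCMVideoCodecType_AV1)
print("Hardware AV1 decode supported: \(hasHardwareAV1)")
```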
4 replies · 0 boosts · 3.7k views · Jan ’23
MTKView, presentDrawable afterMinimumDuration seems flakey...
I downloaded this sample: https://developer.apple.com/documentation/metal/basic_tasks_and_concepts/using_metal_to_draw_a_view_s_contents?preferredLanguage=occ

I commented out this line in AAPLViewController.mm:

//    _view.enableSetNeedsDisplay = YES;

I modified the presentDrawable line in AAPLRenderer.mm to add afterMinimumDuration:

[commandBuffer presentDrawable:drawable afterMinimumDuration:1.0/60];

I then added a presentedHandler before the above line that records the time between successive presents. Most of the time it correctly reports 0.0166667s. However, about every dozen or so frames (it varies) it seems to present a frame early, with an interval of 0.0083333333s, followed by the next frame after around 0.024s. Is this expected behaviour? I was hoping that afterMinimumDuration would specifically make things consistent, so why would it present a frame early?

This is on a new MacBook Pro 16 running the latest macOS Monterey, with the sample project upgraded to a minimum deployment target of 11.0, built with the latest public Xcode release (13.1).
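In case it is useful to anyone reproducing this, a minimal Swift sketch of the kind of presentedHandler timing being described (the sample itself is Objective-C; the names here are placeholders):

```swift
import Metal
import QuartzCore

var lastPresentedTime: CFTimeInterval = 0

// Call this each frame in place of a plain presentDrawable:/commit.
func presentAndMeasure(drawable: CAMetalDrawable, commandBuffer: MTLCommandBuffer) {
    // Register the handler before scheduling the present.
    drawable.addPresentedHandler { presented in
        let t = presented.presentedTime        // when the frame actually reached the screen
        if lastPresentedTime > 0 {
            print("interval since previous present: \(t - lastPresentedTime) s")
        }
        lastPresentedTime = t
    }
    // Ask for at least one 60 Hz refresh interval between consecutive presents.
    commandBuffer.present(drawable, afterMinimumDuration: 1.0 / 60.0)
    commandBuffer.commit()
}
```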
3 replies · 0 boosts · 1.2k views · Nov ’21
Editing my own .sh file with TextEdit sets the quarantine bit, why?
I've just bought a new MacBook Pro M1 and restored everything from my old Intel MacBook Pro using a Time Machine backup. It was a pretty smooth process, apart from a few glitches such as needing to re-download certain apps to get the M1 version (e.g. Android Studio).

One thing I've noticed, though (and I don't know whether this is a Monterey thing or an M1 thing): as part of my day-to-day development work, I maintain various .sh files for building projects on different platforms. I have found that as soon as I edit and save an existing .sh file using TextEdit, the quarantine bit is set on the file and it can no longer be run from inside zsh:

zsh: operation not permitted: ./test.sh

xattr yields the following:

xattr ./test.sh
com.apple.TextEncoding
com.apple.lastuseddate#PS
com.apple.macl
com.apple.metadata:kMDLabel_pjtfm5adga5rvjv2xmgkyqjwmq
com.apple.quarantine

This is incredibly annoying and I can't believe it is by design: this is not a file that has been downloaded from the Internet, it's my own file. Why can't I edit it using TextEdit? I do not get the same problem when I edit and save using Sublime Text, as one example, so what's with TextEdit doing that?
4 replies · 0 boosts · 2.1k views · Oct ’21
Lifecycle of a tvOS 13.2 TopShelf extension?
So recently I migrated the top shelf extension for my app from the deprecated TVServiceProvider to the new TVContentProvider in tvOS 13.0 and onwards. I finally got it working (not helped by wasting hours figuring out that NSExtensionPrincipalClass has to be the first thing listed in the NSExtension dictionary in the Info.plist, or the extension just terminates, I kid you not!), but there is one last thing that I can't figure out.

What works:
1. Remove any instance of my application from the Apple TV.
2. Install and launch my app from Xcode on the Apple TV.
3. When I back out of the app, the top shelf code works: it calls the loadTopShelfContentWithCompletionHandler function, I am able to give it what it wants, and it gets displayed correctly.

The problem is that if I terminate the application from Xcode, I can no longer get the top shelf to work when I launch the app again from Xcode. The extension is not listed as running (as a process to attach to, for example). The app runs fine, but there is no extension process launched alongside it.

I can get it working again in one of two ways: either A) reboot the Apple TV, in which case tvOS launches the extension a few seconds after boot without me doing anything (not even highlighting my app), or B) as in the steps above, delete the app from the Apple TV and install it again using Xcode.

Essentially, it behaves as if the top shelf extension is launched once, and only once, on boot of the Apple TV or on first install. It appears to get terminated when I launch the app again using Xcode (e.g. with a new build, or even re-running the same build), and only A) or B) above gets it running again. Has anyone else seen this?
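For context, the tvOS 13-era provider being described looks roughly like this (a minimal Swift sketch; the identifier and titles are placeholders rather than real content):

```swift
import TVServices

// Principal class of the Top Shelf extension (NSExtensionPrincipalClass in its Info.plist).
final class ContentProvider: TVTopShelfContentProvider {
    override func loadTopShelfContent(completionHandler: @escaping (TVTopShelfContent?) -> Void) {
        let item = TVTopShelfSectionedItem(identifier: "example-item")
        item.title = "Example item"

        let section = TVTopShelfItemCollection(items: [item])
        section.title = "Recently watched"

        completionHandler(TVTopShelfSectionedContent(sections: [section]))
    }
}
```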
4 replies · 0 boosts · 2.2k views · Nov ’19
Is Apple TV 4K going to support HLG, and how does it currently handle HLG content?
[AVPlayer availableHDRModes] shows:

iPad Pro: HLG, HDR10, DV
Apple TV 4K: HDR10, DV

So at the moment I assume HLG content will not trigger HLG mode on the TV. However, when decoding HLG content on Apple TV, I can see that the CVPixelBuffer does contain image buffer attachments specifying the correct colour space and transfer characteristics for HLG.

So what does Apple TV 4K currently do with this content when using AVFoundation APIs such as AVSampleBufferDisplayLayer and AVPlayer, etc.? Does it tone map it to SDR? Convert it to HDR10 somehow? Do nothing?
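For anyone checking their own hardware, a small Swift sketch of the two observations above (the pixel-buffer check assumes you already have a decoded frame in hand):

```swift
import AVFoundation
import CoreVideo

// Which HDR modes AVFoundation reports for the current device.
let modes = AVPlayer.availableHDRModes
print("HLG: \(modes.contains(.hlg)), HDR10: \(modes.contains(.hdr10)), DV: \(modes.contains(.dolbyVision))")

// Whether a decoded frame is tagged with the HLG transfer function.
func isTaggedHLG(_ pixelBuffer: CVPixelBuffer) -> Bool {
    guard let transfer = CVBufferGetAttachment(pixelBuffer,
                                               kCVImageBufferTransferFunctionKey,
                                               nil)?.takeUnretainedValue() else {
        return false
    }
    return CFEqual(transfer, kCVImageBufferTransferFunction_ITU_R_2100_HLG)
}
```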
0 replies · 0 boosts · 1.1k views · Feb ’21