Posts

Post not yet marked as solved · 2 Replies · 1k Views
I'm working on a DriverKit driver. I have it running on macOS, including a very simple client app written in SwiftUI, and everything is working fine there. I've added iPadOS as a destination for the app, as demonstrated in the WWDC video on DriverKit for iPadOS. The app builds and runs on my iPad as expected (after a little work to conditionalize out my use of SystemExtensions.framework, which handles installation on macOS).

However, after installing and running the app on an iPad, the driver does not show up in Settings > General, nor in the app-specific settings pane triggered by the inclusion of a settings bundle in the app. I've confirmed that the dext is indeed being included in the app bundle when built for iPadOS (in MyApp.app/SystemExtensions/com.me.MyApp.MyDriver.dext). I can also see in the build log that there's a validation step for the dext, and it seems to be succeeding. I don't know why the driver isn't being discovered -- or in any case surfaced to the user -- when the app is installed on the iPad.

Has anyone faced this problem and solved it? Are there ways to troubleshoot installation/discovery of an embedded DriverKit extension on iPadOS? Unlike on macOS, I don't really see any relevant console messages.
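
For reference, here's roughly what my conditionalized activation code looks like -- a simplified sketch, with the delegate handling trimmed down (the class and method names are just illustrative; only the dext bundle ID is from my actual project):

#if os(macOS)
import SystemExtensions
#endif
import Foundation

final class DriverInstaller: NSObject {
    // Bundle identifier of the embedded dext.
    private let dextIdentifier = "com.me.MyApp.MyDriver"

    func activate() {
        #if os(macOS)
        // On macOS, the app must explicitly ask the system to activate the dext
        // via SystemExtensions.framework.
        let request = OSSystemExtensionRequest.activationRequest(
            forExtensionWithIdentifier: dextIdentifier,
            queue: .main
        )
        request.delegate = self
        OSSystemExtensionManager.shared.submitRequest(request)
        #else
        // On iPadOS there's no SystemExtensions.framework; the embedded dext is
        // supposed to be discovered by the system and enabled by the user in Settings.
        #endif
    }
}

#if os(macOS)
extension DriverInstaller: OSSystemExtensionRequestDelegate {
    func request(_ request: OSSystemExtensionRequest,
                 actionForReplacingExtension existing: OSSystemExtensionProperties,
                 withExtension ext: OSSystemExtensionProperties) -> OSSystemExtensionRequest.ReplacementAction {
        .replace
    }

    func requestNeedsUserApproval(_ request: OSSystemExtensionRequest) {}

    func request(_ request: OSSystemExtensionRequest,
                 didFinishWithResult result: OSSystemExtensionRequest.Result) {}

    func request(_ request: OSSystemExtensionRequest, didFailWithError error: Error) {}
}
#endif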
Posted by armadsen.
Post not yet marked as solved · 1 Reply · 1.2k Views
I've got a custom Metal Core Image kernel (written with CIImageProcessorKernel) that I'm trying to make work properly with HDR video (HDR10 PQ to start). I understand that for HDR video, the RGB values coming into the shader can have values below 0.0 or above 1.0. However, I don't understand how the 10-bit integer values (i.e. 0-1023) in the video are mapped into floating point. What are the minimum and maximum values in floating point? That is, what will a 1023 (pure white) pixel be in floating point in the shader?

At 11:32 in WWDC20 session 10009, Edit and play back HDR video with AVFoundation (https://developer.apple.com/videos/play/wwdc2020/10009/), there's an example of a Core Image Metal kernel that isn't HDR aware and therefore won't work. It inverts the incoming values by subtracting them from 1.0, which clearly breaks down when 1.0 is not the maximum possible value. How should this be implemented to be HDR aware?

extern "C" float4 ColorInverter(coreimage::sample_t s, coreimage::destination dest)
{
    return float4(1.0 - s.r, 1.0 - s.g, 1.0 - s.b, 1.0);
}
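
For what it's worth, here's roughly how I've been loading and applying the WWDC-style color kernel on the Swift side while experimenting -- a simplified sketch, not my actual CIImageProcessorKernel setup. The default.metallib name and the extended-range, half-float working configuration are assumptions on my part about how to keep values above 1.0 from being clipped before they reach the shader:

import CoreImage
import CoreGraphics
import Foundation

// Load the Metal CI kernel from the compiled Metal library in the app bundle.
func makeInverterKernel() throws -> CIColorKernel {
    let url = Bundle.main.url(forResource: "default", withExtension: "metallib")!
    let data = try Data(contentsOf: url)
    return try CIColorKernel(functionName: "ColorInverter", fromMetalLibraryData: data)
}

// Context configured with an extended-range linear working space and a half-float
// working format, so values outside [0.0, 1.0] can survive into the kernel.
func makeHDRContext() -> CIContext {
    CIContext(options: [
        .workingFormat: CIFormat.RGBAh.rawValue,
        .workingColorSpace: CGColorSpace(name: CGColorSpace.extendedLinearITUR_2020)!
    ])
}

// Apply the kernel over the full extent of the input image.
func invert(_ input: CIImage, using kernel: CIColorKernel) -> CIImage? {
    kernel.apply(extent: input.extent, arguments: [input])
}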
Posted by armadsen.