Delve into the world of graphics and game development. Discuss creating stunning visuals and optimizing game mechanics, and share resources for game developers.

All subtopics

Post · Replies · Boosts · Views · Activity

Game Porting Toolkit formula install error
So recently I have been trying to install game-porting-toolkit, and when I run

brew -v install apple/apple/game-porting-toolkit

I'm getting an error:

Error: apple/apple/game-porting-toolkit 1.1 did not build
Logs:
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/00.options.out
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/01.configure.cc
/Users/mateusz/Library/Logs/Homebrew/game-porting-toolkit/wine64-build
If reporting this issue please do so to (not Homebrew/brew or Homebrew/homebrew-core):
apple/apple

I searched some threads on this forum but couldn't get any working advice from them. I would really appreciate any help.
0
0
93
1d
How to Disable Object Occlusion in a RealityView?
Hi everyone, I’m working on a project using RealityKit and encountering an issue with object occlusion. Specifically, I need to disable the occlusion of real-world objects (e.g., tables, walls) in my RealityView. I want virtual entities to render fully, even if real-world objects would normally block their view. I’ve explored options in ARSession and ARWorldTrackingConfiguration but haven’t found anything that affects occlusion in RealityView. I suspect there might be a setting or approach I’ve missed. Has anyone dealt with a similar scenario or knows how to achieve this? Any insights or pointers would be greatly appreciated! Thanks in advance, Nicolas
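A minimal sketch of one possible approach, assuming an iOS RealityKit ARView (RealityView on visionOS may not expose an equivalent switch): occlusion driven by scene understanding can be removed from the environment options.

import RealityKit

// Sketch, not a confirmed answer: on iOS, removing .occlusion from the
// scene-understanding options stops real-world meshes from hiding entities.
func disableOcclusion(in arView: ARView) {
    arView.environment.sceneUnderstanding.options.remove(.occlusion)
}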
0
0
86
2d
D3DMetal unsupported CheckFeatureSupport query 53 while running vulkaninfo using the Mesa 24.3 Dozen (Vulkan on D3D12) driver
Hi, I wanted to test whether it's possible to use the Mesa3D Dozen driver (Vulkan on D3D12) together with D3DMetal 2b3 to get a better Vulkan driver on Wine than the default MoltenVK. This would support Vulkan Windows apps via D3D12Metal, using vulkan_dzn.dll, dzn_icd.x86_64.json, and dxil.dll from the x64 folder of:

https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z

Running the simple vulkaninfo app like:

wine64 vulkaninfo

I get the error:

[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53

It also seems the D3DMetal Wine integration in Whisky doesn't expose d3d12core.dll and d3d12.dll like the new Agility D3D12 DLLs or VKD3D do, so I'm getting:

MESA: error: Failed to retrieve D3D12GetInterface
MESA: error: Failed to load DXCore

but it still seems to try to load the driver:

WARNING: dzn is not a conformant Vulkan implementation, testing use only.

Full log:

MESA: error: Failed to retrieve D3D12GetInterface
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E328.
MESA: error: Failed to load DXCore
WARNING: dzn is not a conformant Vulkan implementation, testing use only.
[D3DMetal:LOG:2A825] Unsupported API: CheckFeatureSupport, unhandled support query 53
00bc:fixme:dcomp:DCompositionCreateDevice 0000000000000000, {c37ea93a-e7aa-450d-b16f-9746cb0407f3}, 000000000011E578.
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Call to 'vkEnumeratePhysicalDevices' in ICD c:\windows\system32\.\vulkan_dzn.dll failed with error code -3
ERROR: [Loader Message] Code 0 : setup_loader_term_phys_devs: Failed to detect any valid GPUs in the current config
ERROR at C:\j\msdk0\build\Khronos-Tools\repo\vulkaninfo\vulkaninfo.h:241:vkEnumeratePhysicalDevices failed with ERROR_INITIALIZATION_FAILED
0
0
124
3d
Simulator vs device; Deferring System Gestures
Hi, I have a full-screen iOS game. In the Simulator, preferredScreenEdgesDeferringSystemGestures works well to prevent a drag down from the top of the full-screen play area from immediately opening Notification Center. (You get a grabber tab and have to swipe down on it a second time.) But when testing on my device, unlike in the Simulator, Notification Center opens immediately when I swipe down from the top. Any suggestions?
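A minimal sketch of the standard setup, in case the override or the update call is missing on device (the names below are UIKit's documented API; the class name is a placeholder):

import UIKit

class GameViewController: UIViewController {
    // Defer the first top-edge swipe so Notification Center needs a second swipe.
    override var preferredScreenEdgesDeferringSystemGestures: UIRectEdge {
        return .top
    }

    override func viewDidLoad() {
        super.viewDidLoad()
        // The system re-queries the preference only after this call.
        setNeedsUpdateOfScreenEdgesDeferringSystemGestures()
    }
}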
0
0
102
3d
Updated Object Capture -- needs LiDAR?
I have two apps released -- ReefScan and ReefBuild -- that are based on the WWDC21 sample photogrammetry apps for iOS and macOS. Those run fine without LiDAR and are used mostly for underwater models, where LiDAR does not work at all. It now appears that the updated photogrammetry session requires LiDAR data, and building my app with the current Xcode results in a non-working app. Has the "old" version of the photogrammetry session been broken by this update? It worked very well previously, so I would hate to see it regress to needing LiDAR. Most of my users do not have it.
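A sketch of a support check, under the assumption that the newer guided ObjectCaptureSession needs LiDAR-class hardware while the WWDC21-style folder-based PhotogrammetrySession still consumes plain photos; imagesFolderURL is a placeholder for your photo directory:

import Foundation
import RealityKit

func makeSession(imagesFolderURL: URL) throws -> PhotogrammetrySession {
#if os(iOS)
    // Assumption: the guided capture path is the LiDAR-gated one; fall
    // through to folder-based reconstruction on unsupported devices.
    if ObjectCaptureSession.isSupported {
        // A guided capture flow could run here instead.
    }
#endif
    // WWDC21-style folder-based reconstruction from existing photos.
    return try PhotogrammetrySession(input: imagesFolderURL)
}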
0
0
78
3d
D3DMetal unsupported CreateCommandQueue1 API while running wglgears using the Mesa 24.3 GLon12 driver
Hi, I wanted to test whether it's possible to use Mesa3D OGLon12 together with D3DMetal 2b3 to get OpenGL > 4.1 support in Windows apps via D3D12Metal. Using the simple wglgears.c app (similar to glxgears) and running:

GALLIUM_DRIVER=d3d12 wine64 wglgears64 -info

with opengl32.dll overridden using the contents of:

https://github.com/pal1000/mesa-dist-win/releases/download/24.3.0-rc1/mesa3d-24.3.0-rc1-release-msvc.7z

I get:

[D3DMetal:LOG:5E53] Unsupported API: CreateCommandQueue1

caused by: https://gitlab.freedesktop.org/mesa/mesa/-/commit/c022c9603d500b59ff5e6f93c8a214d1785ab20a
API: https://learn.microsoft.com/en-us/windows/win32/api/d3d12/nf-d3d12-id3d12device9-createcommandqueue1

Note the setup itself is correct: running

GALLIUM_DRIVER=llvmpipe wine64 wglgears64 -info

I get:

GL_RENDERER = llvmpipe (LLVM 19.1.3, 128 bits)
GL_VERSION = 4.5 (Compatibility Profile) Mesa 24.3.0-rc1 (git-85ba713d76)
GL_VENDOR = Mesa
GL_EXTENSIONS = GL_ARB_multisample GL_EXT_abgr GL_EXT_bgra GL_EXT_blend_color GL_EXT_blend_minmax GL_EXT_blend_subtract GL_EXT_texture... etc.
0
0
49
4d
How to get a null acceleration structure without triggering an API validation error
I want to turn off my ray tracing conditionally. There's is_null_acceleration_structure, but when I don't bind an acceleration structure (or pass nil to setFragmentAccelerationStructure), I get the following API validation error:

-[MTLDebugRenderCommandEncoder validateCommonDrawErrors:]:5782: failed assertion `Draw Errors Validation Fragment Function(vol_deferred_lighting): missing instanceAccelerationStructure binding at index 6 for accelerationStructure[0].

I can turn off API validation and it works, but it seems like I should be able to use nil for the acceleration structure without triggering a validation error. Seems like a bug, right? I suppose I can work around this by creating a separate pipeline with ray tracing disabled via a function constant instead of using is_null_acceleration_structure. (Can we get a ray-tracing tag for questions?)
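A sketch of the function-constant workaround mentioned above; the constant index is an assumption and must match the shader's [[function_constant(n)]] declaration:

import Metal

// Hypothetical setup: compile a fragment variant with ray tracing disabled,
// so the pipeline never declares the acceleration-structure binding and
// validation has nothing to check.
func makeNoRayTracingFragment(device: MTLDevice) throws -> MTLFunction {
    let library = device.makeDefaultLibrary()!
    let constants = MTLFunctionConstantValues()
    var rayTracingEnabled = false
    // Index 0 is assumed; use the slot your shader actually declares.
    constants.setConstantValue(&rayTracingEnabled, type: .bool, index: 0)
    return try library.makeFunction(name: "vol_deferred_lighting",
                                    constantValues: constants)
}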
1
0
93
4d
NSScreen frame location with multiple monitors
I have a Mac Studio 2023 M2 Max, running Sonoma 14.6.1, developing in Xcode 16.1. It seems that the NSScreen frame settings may be incorrect. The frame settings received from NSScreen.screens don't seem to match the Desktop arrangement in System Settings. Apologies in advance for this long post!

for (i, screen) in NSScreen.screens.enumerated() {
    let name = screen.localizedName
    Globals.logger.debug("Globals initializeScreens - screen \(i) '\(name, privacy: .public)'")
    Globals.logger.debug("Globals initializeScreens - '\(screen.debugDescription, privacy: .public)'")
}

This is what I receive in the log:

Globals initializeScreens - '<NSScreen: 0x600000ef4240; name="PHL 346E2C"; backingScaleFactor=1.000000; frame={{0, 0}, {3440, 1440}}; visibleFrame={{0, 0}, {3440, 1415}}>'
Globals initializeScreens - screen 2 'Blackmagic (1)'
Globals initializeScreens - '<NSScreen: 0x600000ef42a0; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 3 'Blackmagic (4)'
Globals initializeScreens - '<NSScreen: 0x600000ef4360; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 4 'Blackmagic (2)'
Globals initializeScreens - '<NSScreen: 0x600000ef43c0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
Globals initializeScreens - screen 5 'Blackmagic (3)'
Globals initializeScreens - '<NSScreen: 0x600000ef4420; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'

It looks like the frame settings for Blackmagic (2) and Blackmagic (4) are switched. The setup has five monitors. Four use USB-C Digital AV Multiport Adapters; their output is streamed into a rack of A/V equipment using Blackmagic Design mini converters and monitors. My Swift application allows users to open four movies, one for each of the AV adapters. The movies can then be played back in sync for later processing by the A/V equipment.

Here are some screen captures that show my display settings. Blackmagic (1) and Blackmagic (2) are to the left of the main screen; Blackmagic (3) and Blackmagic (4) are to the right. The desktop is hard to see but is correct. The wallpaper settings are all correct, and the wallpaper is correctly ordered when displayed on the monitors. After opening the movies and using the NSScreen frame settings, the displays are incorrectly ordered: Test B and Test D are switched, which is what I would expect given the NSScreen frame values. Any ideas? I've tried re-arranging the desktops, rebooting, etc., but no luck. The code that changes the screen location is similar to this post on Stack Overflow:

public func setDisplay(screen: NSScreen) {
    Globals.logger.log("MovieWindowController - setDisplay = \(screen.localizedName, privacy: .public)")
    Globals.logger.debug("MovieWindowController - setDisplay - '\(screen.debugDescription, privacy: .public)'")
    let dx = CGFloat(Constants.midX)
    let dy = CGFloat(Constants.midY)
    var pos = NSPoint()
    pos.x = screen.visibleFrame.midX - dx
    pos.y = screen.visibleFrame.midY - dy
    Globals.logger.debug("MovieWindowController - setDisplay - x = '\(pos.x, privacy: .public)', y = '\(pos.y, privacy: .public)'")
    window?.setFrameOrigin(pos)
}

The log shows just what I would expect given the incorrect frame values:

MovieWindowController - setDisplay = Blackmagic (1)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018e8420; name="Blackmagic (1)"; backingScaleFactor=1.000000; frame={{-3840, 0}, {1920, 1080}}; visibleFrame={{-3840, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-3840.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (2)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018a10e0; name="Blackmagic (2)"; backingScaleFactor=1.000000; frame={{5360, 0}, {1920, 1080}}; visibleFrame={{5360, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '5360.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (3)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018cc8a0; name="Blackmagic (3)"; backingScaleFactor=1.000000; frame={{3440, 0}, {1920, 1080}}; visibleFrame={{3440, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '3440.000000', y = '-12.500000'
MovieWindowController - setDisplay = Blackmagic (4)
MovieWindowController - setDisplay - '<NSScreen: 0x6000018c9ce0; name="Blackmagic (4)"; backingScaleFactor=1.000000; frame={{-1920, 0}, {1920, 1080}}; visibleFrame={{-1920, 0}, {1920, 1055}}>'
MovieWindowController - setDisplay - x = '-1920.000000', y = '-12.500000'

Am I correct? I think this is driving me crazy! Thanks in advance!

Edit: The mouse behavior is correct in moving across the displays!
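One hedged way to sanity-check the arrangement independently of array order, as a sketch: key each NSScreen to its CGDirectDisplayID and compare against CoreGraphics' global bounds (note CG uses a top-left origin, NSScreen a bottom-left one).

import AppKit

// Sketch: NSScreen.screens ordering is not guaranteed to match the
// arrangement; identify displays by ID instead.
for screen in NSScreen.screens {
    let key = NSDeviceDescriptionKey("NSScreenNumber")
    guard let displayID = screen.deviceDescription[key] as? CGDirectDisplayID else { continue }
    let bounds = CGDisplayBounds(displayID) // global desktop coordinates, top-left origin
    print("\(screen.localizedName): id \(displayID), CG bounds \(bounds), NSScreen frame \(screen.frame)")
}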
2
0
95
4d
Rotating ModelEntities (without Gestures) Help
I'm building a proof-of-concept application leveraging the PlaneDetectionProvider to generate UI and interactive elements on a horizontal plane the user is looking at. I'm able to create a cube at the centroid of the plane and change its location via position. However, I can't seem to rotate the cube programmatically, and from this forum post in September I'm not sure if the modelEntity.move functionality is still bugged or the documentation is not up to date.

if let planeCentroid = planeEntity.centroid {
    // Create a cube at the centroid
    let cubeMesh = MeshResource.generateBox(size: 0.1) // side length of 0.1 meters
    let cubeMaterial = SimpleMaterial(color: .blue, isMetallic: false)
    let cubeEntity = ModelEntity(mesh: cubeMesh, materials: [cubeMaterial])
    cubeEntity.position = planeCentroid
    cubeEntity.position.y += 0.3048
    planeEntity.addChild(cubeEntity)

    let rotationY = simd_quatf(angle: Float(45.0 * .pi / 180.0), axis: SIMD3(x: 0, y: 1, z: 0))
    let cubeTransform = Transform(rotation: rotationY)
    cubeEntity.move(to: cubeTransform, relativeTo: planeEntity, duration: 5, timingFunction: .linear)
}

Ideally, I'd like the cube to start/stop rotating when the user pinches on the plane mesh, but I'd be happy just to see it rotate!
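Two hedged alternatives to move(to:...), as a sketch reusing cubeEntity from the snippet above: set the rotation directly for an immediate check, or play a repeating FromToByAnimation.

import RealityKit

// Sketch: immediate, non-animated rotation to verify the transform works at all.
cubeEntity.transform.rotation = simd_quatf(angle: .pi / 4, axis: SIMD3<Float>(0, 1, 0))

// Sketch: a repeating spin driven by the animation system instead of move(to:).
let spin = FromToByAnimation<Transform>(
    by: Transform(rotation: simd_quatf(angle: .pi, axis: SIMD3<Float>(0, 1, 0))),
    duration: 2.0,
    bindTarget: .transform
)
if let resource = try? AnimationResource.generate(with: spin) {
    cubeEntity.playAnimation(resource.repeat())
}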
2
0
156
5d
Why does the planeNode in SceneKit flicker when using a class property instead of a local variable?
I am working on a SceneKit project where I use a CAShapeLayer as the content for an SCNMaterial's diffuse.contents to display a progress bar. Here's my initial code:

func setupProgressWithCAShapeLayer() {
    let progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async {
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            progressLayer.strokeEnd = progress // Update progress
        }
    }
}

// MARK: - ARSCNViewDelegate
func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
    progressBarPlane = SCNPlane(width: 0.2, height: 0.2)
    setupProgressWithCAShapeLayer()
    let planeNode = SCNNode(geometry: progressBarPlane)
    planeNode.position = SCNVector3(x: 0, y: 0.2, z: 0)
    node.addChildNode(planeNode)
}

This works fine, and the progress bar updates smoothly. However, when I change the code to use a class property (self.progressLayer) instead of a local variable, the rendering starts flickering on the screen:

func setupProgressWithCAShapeLayer() {
    self.progressLayer = createProgressLayer()
    progressBarPlane?.firstMaterial?.diffuse.contents = progressLayer
    DispatchQueue.main.async { [weak self] in
        var progress: CGFloat = 0.0
        Timer.scheduledTimer(withTimeInterval: 0.1, repeats: true) { [weak self] timer in
            progress += 0.01
            if progress > 1.0 {
                progress = 0.0
            }
            self?.progressLayer?.strokeEnd = progress // Update progress
        }
    }
}

After this change, the progressBarPlane in SceneKit starts flickering while being rendered on the screen.

My question: why does switching from a local variable (progressLayer) to a class property (self.progressLayer) cause the flickering in SceneKit rendering?
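One hedged thing to try, as a sketch only: CALayer property changes trigger implicit actions, and a layer retained as a property can end up implicitly animating every strokeEnd update; disabling actions around the write is a common flicker fix. This is a drop-in replacement for the strokeEnd assignment inside the timer.

import QuartzCore

// Sketch: suppress the implicit 0.25 s animation on each strokeEnd change.
CATransaction.begin()
CATransaction.setDisableActions(true)
self?.progressLayer?.strokeEnd = progress
CATransaction.commit()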
0
0
71
6d
How to get GKMatch instance after accepting GKInvite?
In my SceneKit game I'm able to connect two players with GKMatchmakerViewController. Now I want to support the scenario where one of them disconnects and wants to reconnect. I tried to do this with this code:

nonisolated public func match(_ match: GKMatch, player: GKPlayer, didChange state: GKPlayerConnectionState) {
    Task { @MainActor in
        switch state {
        case .connected:
            break
        case .disconnected, .unknown:
            let matchRequest = GKMatchRequest()
            matchRequest.recipients = [player]
            do {
                try await GKMatchmaker.shared().addPlayers(to: match, matchRequest: matchRequest)
            } catch {
            }
        @unknown default:
            break
        }
    }
}

nonisolated public func player(_ player: GKPlayer, didAccept invite: GKInvite) {
    guard let viewController = GKMatchmakerViewController(invite: invite) else { return }
    viewController.matchmakerDelegate = self
    present(viewController)
}

But after presenting the view controller with GKMatchmakerViewController(invite:), nothing else happens. I would expect matchmakerViewController(_:didFind:) to be called, or how else would I get an instance of GKMatch?

Here is the code I use to reproduce the issue, and below the reproduction steps.

Code

1. Run the attached project on an iPad and a Mac simultaneously.
2. On both devices, tap the ship to connect to Game Center.
3. Create an automatched match by tapping the rightmost icon on both devices.
4. When the two devices are matched, on the iPad close the dialog and tap the ship to disconnect from Game Center.
5. Wait some time until the Mac detects the disconnect and automatically sends an invitation to join again.
6. When the notification arrives on the iPad, tap it, then tap the ship to connect to Game Center again.

The iPad receives the call player(_:didAccept:), but nothing else, so there's no way to get a GKMatch instance again.
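A hedged alternative sketch: GameKit can produce the GKMatch straight from the invite via GKMatchmaker, bypassing the view controller entirely.

import GameKit

// Sketch: ask the matchmaker for the match directly instead of waiting
// for matchmakerViewController(_:didFind:).
nonisolated public func player(_ player: GKPlayer, didAccept invite: GKInvite) {
    GKMatchmaker.shared().match(for: invite) { match, error in
        guard let match else { return } // inspect error here
        match.delegate = self
        // proceed with the reconnected match
    }
}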
2
0
149
6d
ARSCNView ignores output of SCNTechnique (sometimes)
I am using SCNTechnique in combination with ARSCNView. The technique does some minor post-processing. I have written several filter variants for this post-processing, but I'm facing an issue where, with one of the filters/fragment shaders, SCNTechnique discards my output and just presents the plain camera feed on screen instead. This is clearly visible in the Metal pipeline, using the GPU frame debugger. Let me stress that my setup works for 90% of my filters, but not this one, and I want to know why. iOS 18.1, iPhone 13 Mini, Xcode 16.1.

Encoders 0 & 1 are injected by the system. Render encoders 2 & 3 correspond to my SCNTechnique's render passes: one to manipulate pixel data (darken it in this case) and another to BLIT it back to the main texture. I know the separate buffer is not strictly needed for this particular operation, but it shouldn't matter. Note that the issue occurs in encoder 4 (not mine but ARKit's). In render encoder 4, scn_postprocess_AR_fragment handles my texture (#0, ending in f980) and another from the camera feed (Texture 2). I know this pass is typically used for grain, because that's what it used to do before I disabled grain on ARSCNView (and the buffer still contains grain parameters). I have other post-processing filters that work just fine.

By what magic is ARKit determining to use Texture 2 instead of my Texture 0? Sure, I could keep digging into the minute differences between my shaders to find out which line of code affects how some ARKit shader down the line operates, but it's awfully opaque so far.
0
0
142
6d
Unable to test Game Center achievements in ad hoc builds on a registered device
Hi, I have a test app with a single Game Center achievement, and when running it on my iPad I'm unable to list that achievement with GameCenterManager.shared.loadAchievements. The app is still at version 1.0 in "Prepare for Submission" status (it has not been submitted for review). The Game Center entitlement is added to the app, and the achievement is added to the Game Center section of the app, but it is marked as Not Live. I am using a sandbox account to log in to Game Center on the iPad, and I can't fetch this achievement. Is it because it is Not Live? How do I test my Game Center achievement on the device without releasing the app yet?
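For comparison, a hedged sketch querying GameKit directly, which in principle should return sandbox definitions once the local player is authenticated; whether Not Live achievements appear is exactly the open question here.

import GameKit

// Sketch: list the achievement definitions the sandbox actually returns.
GKAchievementDescription.loadAchievementDescriptions { descriptions, error in
    if let error { print("load failed: \(error)") }
    for description in descriptions ?? [] {
        print("achievement id: \(description.identifier)")
    }
}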
1
0
166
1w
GKGameCenterViewController won't dismiss.
GKGameCenterViewController won't dismiss. With the Core and GameKit packages from GitHub apple/unityplugins, I succeeded in logging in and displaying GKGameCenterViewController. Leaderboards and everything else work fine, but when I press X in Game Center to return to the game, nothing happens. I tried debugging by printing logs here and there in the plugin, but I got no results: when I press X, I get no logs or responses at all. It behaves like a button with no listener attached; more like an image, really. Based on community posts saying it worked fine before, it seems a recent Game Center update was not applied to the plugin, or something was omitted or changed, causing a mismatch.
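For reference, a hedged sketch of what the native side must do: the X button only takes effect if some GKGameCenterControllerDelegate dismisses the controller, which the Unity plugin is presumably supposed to wire up internally.

import GameKit
import UIKit

// Sketch: the delegate callback that actually closes the Game Center UI.
final class GameCenterUIDelegate: NSObject, GKGameCenterControllerDelegate {
    func gameCenterViewControllerDidFinish(_ gameCenterViewController: GKGameCenterViewController) {
        gameCenterViewController.dismiss(animated: true)
    }
}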
1
0
131
1w
SCNNode in an SKScene is deformed when it hits an object
Into an SKScene, I add an SCNSphere as a child of an SKShapeNode, as shown below. When the sphere hits another node (the fence in the example), the sphere is deformed as if it were elastic. I didn't find any information about elastic properties. Does someone know a way to avoid the deformation?

import SwiftUI
import SpriteKit
import SceneKit

@main
struct MyApp: App {
    var body: some Scene {
        WindowGroup { SpriteView(scene: GameSceneSK(size: UIScreen.main.bounds.size)) }
    }
}

class GameSceneSK: SKScene {
    override func sceneDidLoad() {
        var fencePoints = [
            CGPoint(x: 300, y: 0),
            CGPoint(x: 300, y: 400),
            CGPoint(x: 0, y: 400)
        ]
        let fence = SKShapeNode(points: &fencePoints, count: fencePoints.count)
        fence.physicsBody = SKPhysicsBody(edgeChainFrom: fence.path!)
        addChild(fence)

        let sphereGeometry = SCNSphere(radius: 20)
        let sphereNode = SCNNode(geometry: sphereGeometry)
        let sphereScnScene = SCNScene()
        sphereScnScene.rootNode.addChildNode(sphereNode)

        let ball3D = SK3DNode(viewportSize: CGSize(width: 40, height: 40))
        ball3D.scnScene = sphereScnScene

        let ball = SKShapeNode(circleOfRadius: 20)
        ball.physicsBody = SKPhysicsBody(circleOfRadius: 20)
        ball.addChild(ball3D)

        physicsWorld.gravity = CGVector(dx: 0.2, dy: 0.2)
        addChild(ball)
    }
}
2
0
156
1w
Unreal Engine SocketIO crash
I'm trying to build my project with Unreal Engine 5.4 for iOS. I use Socket.IO to connect to the backend, via the SocketIOClient plugin, version 2.8.0. When trying to connect to the socket, a crash occurs. This only happens on iOS, only in Distribution builds, and only on Unreal 5.4; there are no problems on Unreal 5.2. Callstack: crashlog.crash. The callstack may differ slightly between crashes, but the problem is always in allocating or deallocating memory. I suspect this may be a race condition related to some peculiarities of memory management on iOS. There are also several similar issues in the plugin repository, for example this one. I've tried other versions of the plugin and of Xcode, and tried building Socket.IO with both C++20 and C++17; nothing helps. Does anyone know what can be done about this?
0
0
136
1w
Which Apple technologies to use for simple 2d motion graphics software?
I plan to create a simple motion-graphics application for macOS that animates text and basic shapes and handles audio. I'll use SwiftUI for the UI. What are the commonly used technologies for rendering animated graphics? Core Animation is suitable for UI animations, but not for exporting and precisely controlling them.

Basic requirements:
- Timeline user interface
- Animation of text and basic shapes
- Viewer in SwiftUI GUI with transport control (play, pause, scrub, …)
- Export to video file

Is Metal or Core Graphics typically used directly? I want to keep it as simple as possible.
0
0
169
1w
How to roll a ball with physics in RealityKit
I decided to use a club to strike a ball and let it roll on the turf in RealityKit, but right now I can only make it slide, not roll. I added collision on the turf (static), club (kinematic), and ball (dynamic), and set some parameters: radius and mass. From these parameters I calculate linear damping and inertia; in addition, I use the time between frames and the club position to calculate speed. The code looks like this:

let radius: Float = 0.025
let mass: Float = 0.04593 // mass, in kg
var inertia = 2/5 * mass * pow(radius, 2)
let currentPosition = entity.position(relativeTo: nil)
let distance = distance(currentPosition, rgfc.lastPosition)
let deltaTime = Float(context.deltaTime)
let speed = distance / deltaTime
let C_d: Float = 0.47 // drag coefficient
let linearDamping = 0.5 * 1.2 * pow(speed, 2) * .pi * pow(radius, 2) * C_d // linear damping (1.2 is the air density)

entity.components[PhysicsBodyComponent.self]?.massProperties.inertia = SIMD3<Float>(inertia, inertia, inertia)
entity.components[PhysicsBodyComponent.self]?.linearDamping = linearDamping

// force
let acceleration = speed / deltaTime
let forceDirection = normalize(currentPosition - rgfc.lastPosition)
let forceMultiplier: Float = 1.0
let appliedForce = forceDirection * mass * acceleration * forceMultiplier
entityCollidedWith.addForce(appliedForce, at: rgfc.hitPosition, relativeTo: nil)

I also tried applyImpulse instead of addForce, like:

let linearImpulse = forceDirection * speed * forceMultiplier * mass

No matter how I adjust the friction (static, dynamic) and restitution, whether using addForce or applyImpulse, the ball only slides. How can I solve this problem?
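A hedged sketch of two levers that usually matter for rolling: give the contact some friction so the turf can convert sliding into spin, and seed an angular velocity consistent with rolling without slipping (|ω| = |v| / r). ballEntity is a placeholder for the dynamic ball; travel and speed reuse values from the snippet above.

import RealityKit

// Sketch: friction on the contact, plus an explicit rolling start.
let material = PhysicsMaterialResource.generate(staticFriction: 0.8,
                                                dynamicFriction: 0.6,
                                                restitution: 0.3)
ballEntity.components[PhysicsBodyComponent.self]?.material = material

// Rolling without slipping: spin axis perpendicular to the travel direction.
let travel = normalize(forceDirection)
let axis = normalize(cross(SIMD3<Float>(0, 1, 0), travel))
ballEntity.components.set(PhysicsMotionComponent(linearVelocity: travel * speed,
                                                 angularVelocity: axis * (speed / radius)))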
0
0
191
1w
Rendering YCbCr input using Metal
I would like to take YCbCr CVPixelBuffers from AVCaptureVideoDataOutput, apply some processing in RGB space, render to an MTKView, and pass to AVAssetWriter for recording. Right now, I'm doing this all manually – deswing the incoming data if necessary, choose the right matrix to convert to RGB, apply processing, etc. I also have to convert back to YCbCr before feeding the frames to AVAssetWriter because encoding performs much better if I do. Is there any efficient, built-in way to achieve the same? I can't use AVCaptureVideoPreviewLayer, since I need to do some further processing before display. I can't use AVCaptureVideoDataOutput's videoSettings to get automatic BGRA conversion because that would lose bit depth for 10 bit video formats (and isn't available on all formats anyway). I see these Accelerate functions, but they seemingly don't use the GPU, nor do they support all the formats and bit depths I'd need. I found reference to some undocumented MTLPixelFormats that seem to do exactly what I want, but I don't want to rely on something like this unless it's explicitly endorsed. This would also incur an RGB/YCbCr conversion on every texture read and write, right? Is there anything I'm missing here?
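For the capture side, a sketch of the usual zero-copy plane import, assuming an 8-bit bi-planar 420 format (a 10-bit format would need .r16Unorm/.rg16Unorm instead); the YCbCr-to-RGB matrix is then applied in a shader:

import CoreVideo
import Metal

// Sketch: wrap the luma and chroma planes of a CVPixelBuffer as Metal
// textures through a CVMetalTextureCache, without copying pixel data.
func planeTextures(from pixelBuffer: CVPixelBuffer,
                   cache: CVMetalTextureCache) -> (luma: MTLTexture, chroma: MTLTexture)? {
    var yOut: CVMetalTexture?
    var cOut: CVMetalTexture?
    CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil, .r8Unorm,
        CVPixelBufferGetWidthOfPlane(pixelBuffer, 0),
        CVPixelBufferGetHeightOfPlane(pixelBuffer, 0), 0, &yOut)
    CVMetalTextureCacheCreateTextureFromImage(
        kCFAllocatorDefault, cache, pixelBuffer, nil, .rg8Unorm,
        CVPixelBufferGetWidthOfPlane(pixelBuffer, 1),
        CVPixelBufferGetHeightOfPlane(pixelBuffer, 1), 1, &cOut)
    guard let luma = yOut.flatMap(CVMetalTextureGetTexture),
          let chroma = cOut.flatMap(CVMetalTextureGetTexture) else { return nil }
    return (luma, chroma)
}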
0
0
174
1w