
ARConfiguration property providesAudioData crashes with 3rd-party microphone.
We have an ARKit app on the App Store which uses ARKit to provide both camera images and audio data for export to video. So far this works well and without issues. We recently discovered that if we plug a 3rd-party microphone (e.g., a RODE VideoMic Me-L) into the Lightning port on an iPhone 11–13, the app seems to freeze and crash on starting the ARSession with the following error:

```
com.apple.arkit.error Code=102 "Required sensor failed."
```

Despite the error text, this does not seem to be related to microphone permissions (which searches on this topic have brought up). At this point we have added most plist permissions to the app with no success.

We can reproduce this with Apple's own RealityKit 2 "Underwater" example. Simply add the first line below at line 216 of UnderwaterView.swift (the other two lines are the existing code, for context):

```swift
configuration.providesAudioData = true // ADD ME
configuration.planeDetection.insert(.horizontal)
session.run(configuration)
```

Plug in the 3rd-party mic (e.g., RODE VideoMic Me-L), run the app, and it will bomb out with:

```
2022-01-11 15:59:48.710585+0000 Underwater[3089:1945857] [Session] ARSession <0x113428a80>: did fail with error: Error Domain=com.apple.arkit.error Code=102 "Required sensor failed." UserInfo={NSLocalizedFailureReason=A sensor failed to deliver the required input., NSUnderlyingError=0x2819d1320 {Error Domain=AVFoundationErrorDomain Code=-11800 "The operation could not be completed" UserInfo={NSLocalizedFailureReason=An unknown error occurred (-12780), NSLocalizedDescription=The operation could not be completed, NSUnderlyingError=0x2819d1f20 {Error Domain=NSOSStatusErrorDomain Code=-12780 "(null)"}}}, NSLocalizedRecoverySuggestion=Make sure that the application has the required privacy settings., NSLocalizedDescription=Required sensor failed.}
```

We're struggling to find a solution here and could really use some advice if anyone understands what the issue may be. Otherwise, we'll submit a bug report to Apple and see where we go from there.
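In the meantime, the fallback we're considering is to leave providesAudioData off entirely and capture microphone audio ourselves with AVAudioEngine, so the ARSession never touches the failing audio sensor. A rough, untested sketch (the handoff of each buffer into our video export is elided):

```swift
import AVFoundation

let audioEngine = AVAudioEngine()

func startSeparateAudioCapture() throws {
    // Configure the shared audio session for recording alongside ARKit's video.
    let session = AVAudioSession.sharedInstance()
    try session.setCategory(.playAndRecord, mode: .default, options: [])
    try session.setActive(true)

    // Tap the input node (which should pick up the external Lightning mic)
    // and hand each buffer to the export pipeline.
    let input = audioEngine.inputNode
    let format = input.outputFormat(forBus: 0)
    input.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
        // Append `buffer` to the AVAssetWriter audio input here (elided).
    }
    try audioEngine.start()
}
```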
Replies: 1 · Boosts: 0 · Views: 1.2k · Jan ’22
SceneKit Metal shader no longer compiles in Xcode 13.
I have an app developed as a SceneKit AR app. It has been fine for a long time, but in Xcode 13 the shader now fails to compile, resulting in the dreaded purple shader of death in-app. It was modified from the simple mesh shader included in Apple's original ARKit example app.

```metal
#include <metal_stdlib>
#include <simd/simd.h>
#pragma body
#pragma transparent

uniform simd_float2 worldDimensions;
const float numTiles = 40.0;
const float tilePercent = 1.0 / numTiles;
const float edgeThickness = (1.0 / numTiles) * 0.05;
```

The first error I get now is the following:

```
[SceneKit] Error: FATAL ERROR : failed compiling shader: Error Domain=MTLLibraryErrorDomain Code=3 "program_source:3179:50: error: unknown type name 'simd_float2'
, constant simd_float2& worldDimensions
```

If I change this from simd_float2 to float2, the error explodes into the larger cascade of:

```
[SceneKit] Error: FATAL ERROR : failed compiling shader: Error Domain=MTLLibraryErrorDomain Code=3 "program_source:992:1: error: use of class template 'uniform' requires template arguments
uniform float2 worldDimensions;
^
/System/Library/PrivateFrameworks/GPUCompiler.framework/Libraries/lib/clang/31001.325/include/metal/metal_uniform:16:8: note: template is declared here
struct uniform;
^
program_source:992:15: error: expected ';' after top level declarator
uniform float2 worldDimensions;
```

At this point I'm unable to see anything in the Xcode changelogs etc. that offers any assistance as to what has changed in the spec. Any help gratefully appreciated.

The entire shader:

```metal
#include <metal_stdlib>
#include <simd/simd.h>
#pragma body
#pragma transparent

uniform float2 worldDimensions;
const float numTiles = 40.0;
const float tilePercent = 1.0 / numTiles;
const float edgeThickness = (1.0 / numTiles) * 0.05;
const simd_float2 cm10 = worldDimensions * 10.0;

float u = _surface.diffuseTexcoord.x * (1.0 + edgeThickness);
float v = _surface.diffuseTexcoord.y * (1.0 + edgeThickness);

const float2 thicknessMajor = float2(0.005, 0.005) / worldDimensions;
const float2 thicknessMinor = float2(0.005, 0.005) / worldDimensions;

bool onMeter = (fmod(u, 1.0 / worldDimensions.x) < thicknessMajor.x) ||
               (fmod(v, 1.0 / worldDimensions.y) < thicknessMajor.y);
onMeter = onMeter || (u > 1.0 - thicknessMajor.x) || (v > 1.0 - thicknessMajor.y);
bool onCM = (fmod(u, 1.0 / cm10.x) < thicknessMinor.x) ||
            (fmod(v, 1.0 / cm10.y) < thicknessMinor.y);

float4 defaultColor = _surface.diffuse;
float alpha = onMeter ? 0.7 : 0.1;
float4 outColor = float4(defaultColor.xyz, 1.0);
_surface.diffuse = float4(outColor.xyz, alpha);
_surface.transparent = float4(1.0, 1.0, 1.0, alpha);

if (!onCM && !onMeter)
    discard_fragment();
```
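The direction I'm currently exploring, based on the SCNShadable documentation rather than anything I've confirmed in the release notes: on the Metal path, custom uniforms in a shader modifier are declared in a #pragma arguments block rather than with the GLSL-style uniform keyword, using Metal types (float2, not simd_float2). A hedged, untested sketch of how that would look from the Swift side (the body is abbreviated):

```swift
let surfaceModifier = """
#pragma arguments
float2 worldDimensions;

#pragma transparent
#pragma body

const float numTiles = 40.0;
const float edgeThickness = (1.0 / numTiles) * 0.05;
// ... rest of the shader body as above, with simd_float2 replaced by float2 ...
"""
material.shaderModifiers = [.surface: surfaceModifier]

// Per SCNShadable, a float2 argument can then be supplied via KVC as a CGPoint:
material.setValue(NSValue(cgPoint: CGPoint(x: 2.0, y: 2.0)), forKey: "worldDimensions")
```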
Replies: 2 · Boosts: 0 · Views: 1.3k · Sep ’21
SceneKit, overriding the clone() method of SCNNode
I've become used to subclassing SCNNode in my app and adding extra variables/methods to the subclass. However, I've encountered a situation where I need to create a clone of the object (i.e., a copy where the geometry, materials, etc. are references rather than freshly instantiated objects):

https://developer.apple.com/documentation/scenekit/scnnode/1408046-clone

However, cloning the subclass does not, of course, clone the extra member variables. So I have attempted to override clone(), but I can't find an example of how this override is meant to be written in a subclass:

```swift
class MyNode: SCNNode {
    let url: URL
    let someStruct: MyStruct

    init(url: URL, someStruct: MyStruct) {
        self.url = url
        self.someStruct = someStruct
        super.init()
    }

    required init?(coder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    override func clone() -> Self {
        /* No idea what I'm meant to do in here? How do I 'clone' the
           url and someStruct members here? Not sure about the following
           code either. */
        super.clone()
        return self
    }
}
```
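For reference, the shape I'd expect the override to take, assuming super.clone() really does hand back an instance of the subclass (an untested sketch: url and someStruct would need to become vars so the copy's values can be overwritten, and MyStruct is assumed to be a value type):

```swift
override func clone() -> Self {
    // super.clone() is declared `-> Self`, so the result is already typed
    // as MyNode here; geometry, materials, etc. are shared by reference.
    let copy = super.clone()
    // SceneKit knows nothing about the subclass's stored properties,
    // so copy them across by hand (this requires `var`, not `let`).
    copy.url = url
    copy.someStruct = someStruct
    return copy
}
```

One thing worth verifying: the copy comes out of SceneKit's own copying machinery rather than a Swift initializer, so I'd confirm the Swift-only stored properties are in a sane state before relying on this.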
Replies: 3 · Boosts: 0 · Views: 845 · Apr ’21
SCNRenderer in MTKView doesn't render shadows.
I have an MTKView in which I want to first render a SceneKit scene onto a render texture and then pass this render texture into my own post-processing stage. It works fine: I see my scene rendered correctly with all the objects, but no shadows are being rendered, and I've worked out that it's the SCNRenderer call that's not drawing them (the same scene in an SCNView, ARSCNView, etc. shows shadows fine). That is, the nodes are set to cast shadows and the light is set to cast shadows, so it's not a scene issue.

Presumably I've not set up the descriptor with the extra details it needs for shadow rendering, but I'm not certain what the SCNRenderer actually needs here? Sadly the documentation isn't revealing a great deal to me on this subject. Any pointers would be much appreciated, no matter how general. I can't even find examples of SCNRenderer being used in similar situations.

```swift
let descriptor = MTLRenderPassDescriptor()
descriptor.colorAttachments[0].loadAction = .clear
descriptor.colorAttachments[0].clearColor = MTLClearColor(red: 0.0, green: 0.0, blue: 0.0, alpha: 1.0)
descriptor.colorAttachments[0].texture = metalRenderTexture
descriptor.colorAttachments[0].storeAction = .store

if let commandBuffer = commandQueue.makeCommandBuffer() {
    commandBuffer.label = "3DSceneBuffer"
    if isDrawingPreVizScene {
        sceneKitRenderer.render(withViewport: CGRect(x: 0, y: 0, width: metalRenderTexture!.width, height: metalRenderTexture!.height),
                                commandBuffer: commandBuffer,
                                passDescriptor: descriptor)
    }
    commandBuffer.commit()
}
```
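I'm not at all sure this is the cause of the missing shadows (SceneKit manages the shadow-map pass itself), but one gap I can see in the setup above is that the pass descriptor has no depth attachment, so SceneKit has nowhere to do depth work in the main pass. An untested sketch of adding one, reusing metalRenderTexture and the command queue's device from the code above:

```swift
// Create a depth texture matching the color target.
let depthDescriptor = MTLTextureDescriptor.texture2DDescriptor(
    pixelFormat: .depth32Float,
    width: metalRenderTexture!.width,
    height: metalRenderTexture!.height,
    mipmapped: false)
depthDescriptor.usage = [.renderTarget]
depthDescriptor.storageMode = .private
let depthTexture = commandQueue.device.makeTexture(descriptor: depthDescriptor)

// Attach it to the same pass descriptor handed to the SCNRenderer.
descriptor.depthAttachment.texture = depthTexture
descriptor.depthAttachment.loadAction = .clear
descriptor.depthAttachment.clearDepth = 1.0
descriptor.depthAttachment.storeAction = .dontCare
```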
Replies: 1 · Boosts: 0 · Views: 1.1k · Mar ’21