Posts

Post not yet marked as solved
4 Replies
990 Views
I just upgraded to Xcode 14.3, and I have started seeing the following debug message show up in my console, and I am not sure why:

```
[GPUDebug] Null texture access executing kernel function "ColorCorrection" encoder: "Keyframing.SurfaceWarpFuser.InverseWarpKeyframe", dispatch: 2
```

It seems Metal related, but I am very confused by it. My project uses a very minimal amount of Metal, only to get depth data from ARKit and to draw points, and I definitely do NOT have any kernel function named "ColorCorrection" or an encoder named "Keyframing.SurfaceWarpFuser.InverseWarpKeyframe".

I haven't changed any of my Metal code, so I don't know if this is something bigger I should be concerned about.

Sincerely,
Stan
Post not yet marked as solved
6 Replies
2.5k Views
I hope someone here can give me some insight, because I am at my wits' end. I have been trying to learn Metal for the past couple of months. In the process, I came across examples and articles about sorting networks and decided to try to implement them in Metal. Now the problem is, if I run the code on my Mac, everything is fine. But if I run the same code on my iDevice (iPad Pro w/ LiDAR), I get all sorts of errors I do not understand, or the sorted data comes back corrupted and all wrong.

Typical error:

```
2021-02-17 12:13:11.218394-0500 METAL_ISSUE[97650:6709092] [GPUDebug] Invalid device load executing kernel function "bitonicSort" encoder: "0", dispatch: 0, at offset 384
file:///Users/staque/Development/OTHER/METAL_ISSUE/METAL_ISSUE/Shaders.metal:77:40 - bitonicSort()
MTLBufferArgument: 0x28006d200
    Name = floats
    Type = MTLArgumentTypeBuffer
    Access = MTLArgumentAccessReadWrite
    LocationIndex = 0
    IsActive = 1
    ArrayLength = 1
    TypeInfo =
        DataType = MTLDataTypePointer
        ElementType = MTLDataTypeFloat
        Access = MTLArgumentAccessReadWrite
        Alignment = 4
        DataSize = 4
    Alignment = 4
    DataSize = 4
    DataType = MTLDataTypeFloat
buffer: "unknown"
```

You can pretty much drop these into the default Xcode Metal game app.

The shader (slightly modified to track the indexes of the floats):

```metal
/*
 Based on code from https://github.com/tgymnich/MetalSort
 Rewritten to make it more understandable.
 */
kernel void bitonicSort(device float *floats [[ buffer(0) ]],
                        device int *uInts [[ buffer(1) ]],
                        constant int &p [[ buffer(2) ]],
                        constant int &q [[ buffer(3) ]],
                        uint gid [[ thread_position_in_grid ]])
{
    int pMinusQ = p - q;
    int distance = 1 << pMinusQ;
    uint gidShiftedByP = gid >> p;
    // true: increasing / false: decreasing
    bool direction = (gidShiftedByP & 2) == 0;
    uint gidDistance = (gid & distance);
    bool isGidDistanceZero = (gidDistance == 0);
    uint gidPlusDistance = (gid | distance);
    bool isLowerIndexGreaterThanHigher = (floats[gid] > floats[gidPlusDistance]);
    if (isGidDistanceZero && isLowerIndexGreaterThanHigher == direction) {
        float temp = floats[gid];
        floats[gid] = floats[gidPlusDistance];
        floats[gidPlusDistance] = temp;

        int temp2 = uInts[gid];
        uInts[gid] = uInts[gidPlusDistance];
        uInts[gidPlusDistance] = temp2;
    }
}
```

The call:

```swift
import Metal
import Foundation

func runSort() {
    let device = MTLCreateSystemDefaultDevice()!
    let commandQueue = device.makeCommandQueue()!
    let library = device.makeDefaultLibrary()!
    let sortFunction = library.makeFunction(name: "bitonicSort")!
    let pipeline = try! device.makeComputePipelineState(function: sortFunction)

    let setRange = 0..<1024
    var floatData = [Float]()
    var uintData = [UInt32]()

    // Build the float and index data backward to form the worst-case
    // scenario for sorting.
    for value in stride(from: Float(setRange.upperBound - 1), to: Float(setRange.lowerBound - 1), by: -1.0) {
        floatData.append(value)
    }
    for value in stride(from: setRange.upperBound - 1, to: setRange.lowerBound - 1, by: -1) {
        uintData.append(UInt32(value))
    }

    print(floatData)
    print("")
    print(uintData)

    guard let logn = Int(exactly: log2(Double(floatData.count))) else {
        fatalError("data.count is not a power of 2")
    }

    for p in 0..<logn {
        for q in 0..<p+1 {
            let floatDataBuffer = device.makeBuffer(bytes: &floatData,
                                                    length: MemoryLayout<Float>.stride * floatData.count,
                                                    options: [.storageModeShared])!
            floatDataBuffer.label = "floatDataBuffer"

            let uintDataBuffer = device.makeBuffer(bytes: &uintData,
                                                   length: MemoryLayout<UInt32>.stride * uintData.count,
                                                   options: [.storageModeShared])!
            uintDataBuffer.label = "uintDataBuffer"

            let threadgroupsPerGrid = MTLSize(width: floatData.count, height: 1, depth: 1)
            let threadsPerThreadgroup = MTLSize(width: pipeline.threadExecutionWidth, height: 1, depth: 1)

            // The kernel declares p and q as 32-bit ints, so pass Int32s.
            var n1 = Int32(p)
            var n2 = Int32(q)

            let commandBuffer = commandQueue.makeCommandBuffer()!
            let encoder = commandBuffer.makeComputeCommandEncoder()!
            encoder.setComputePipelineState(pipeline)
            encoder.setBuffer(floatDataBuffer, offset: 0, index: 0)
            encoder.setBuffer(uintDataBuffer, offset: 0, index: 1)
            encoder.setBytes(&n1, length: MemoryLayout<Int32>.stride, index: 2)
            encoder.setBytes(&n2, length: MemoryLayout<Int32>.stride, index: 3)
            encoder.dispatchThreadgroups(threadgroupsPerGrid, threadsPerThreadgroup: threadsPerThreadgroup)
            encoder.endEncoding()

            commandBuffer.commit()
            commandBuffer.waitUntilCompleted()

            let dataPointer = floatDataBuffer.contents().assumingMemoryBound(to: Float.self)
            let dataBufferPointer = UnsafeMutableBufferPointer(start: dataPointer, count: floatData.count)
            floatData = Array(dataBufferPointer)

            let dataPointer2 = uintDataBuffer.contents().assumingMemoryBound(to: UInt32.self)
            let dataBufferPointer2 = UnsafeMutableBufferPointer(start: dataPointer2, count: uintData.count)
            uintData = Array(dataBufferPointer2)
        }
    }

    print(floatData)
    print("")
    print(uintData)
}
```

If anyone has a clue what I should be doing, I am all ears, because I need help. Thanks in advance.

Stan
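One thing that stands out to me when I stare at this (an observation, not a confirmed diagnosis): dispatchThreadgroups(_:threadsPerThreadgroup:) takes a number of threadgroups, not a number of threads, so as written the call launches floatData.count * threadExecutionWidth threads in total, and every thread with gid >= floatData.count reads and writes past the end of the buffers. That is exactly the kind of out-of-bounds access an "Invalid device load" points at. A minimal bounds-safe alternative, assuming a device that supports non-uniform threadgroup sizes:

```swift
// Launch exactly one thread per element instead of one threadgroup per
// element; dispatchThreads lets the hardware handle a partial final
// threadgroup, so no thread gets a gid past the end of the data.
let threadsPerGrid = MTLSize(width: floatData.count, height: 1, depth: 1)
let threadsPerGroup = MTLSize(width: min(pipeline.threadExecutionWidth, floatData.count),
                              height: 1, depth: 1)
encoder.dispatchThreads(threadsPerGrid, threadsPerThreadgroup: threadsPerGroup)
```

On older devices without non-uniform threadgroup support, the usual fallback would be passing the element count into the kernel and returning early when gid is out of range.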
Post marked as solved
3 Replies
1.4k Views
Before the update, ARView.session seemed to be accessible, but now after the update it is not. The documentation says it should be there: https://developer.apple.com/documentation/realitykit/arview

So I am very confused why I am getting "Use of unresolved identifier 'session'" since the update. Can anyone tell me how to fix this?

Stan
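In case it helps: ARView.session is typed as ARSession, which is declared in ARKit, so the file that touches it needs to import ARKit alongside RealityKit (and note the property only exists on iOS, since macOS has no ARKit). A minimal sketch, assuming an iOS app target:

```swift
import RealityKit
import ARKit   // without this, ARView's ARSession-typed session property can't resolve

let arView = ARView(frame: .zero)
let configuration = ARWorldTrackingConfiguration()
arView.session.run(configuration)
```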
Post marked as solved
1 Reply
571 Views
Hey All,

How do you get the vertical field of view angle for the LiDAR camera when using ARKit (when in landscape mode, specifically)? There are a couple of references to the horizontal field of view scattered through other documentation, but those notes seem to refer to the regular camera (and give 60 degrees as the default).

Is there some obvious documentation I am missing? If not, how do I calculate it from the information I can get?

Thanks

Stan
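For anyone else hunting: the closest I have gotten is computing it per frame from the camera intrinsics, on the assumption that the standard pinhole model applies, since the vertical focal length in pixels and the image height pin down the angle. A sketch (my own helper, not an ARKit API):

```swift
import ARKit
import Foundation

// Vertical field of view of the frame's camera, in degrees.
// intrinsics[1][1] is the vertical focal length fy in pixels, and ARKit
// reports imageResolution in landscape orientation, so the height here
// already corresponds to the landscape vertical axis.
func verticalFieldOfView(for frame: ARFrame) -> Float {
    let fy = frame.camera.intrinsics[1][1]
    let imageHeight = Float(frame.camera.imageResolution.height)
    return 2 * atan(imageHeight / (2 * fy)) * 180 / .pi
}
```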
Post not yet marked as solved
1 Reply
2.7k Views
I have some Metal code that goes along fine, and then my app starts throwing the following errors:

```
Execution of the command buffer was aborted due to an error during execution. Caused GPU Address Fault Error (IOAF code 11)
```

and then...

```
Execution of the command buffer was aborted due to an error during execution. Ignored (for causing prior/excessive GPU errors) (IOAF code 4)
```

I am guessing the former causes the latter, but I can't figure out what exactly is causing the first one. Is it possible to set up a symbolic breakpoint or something to catch these types of errors? Right now I feel like I am grasping in the dark trying to figure out what is going wrong in my code.

Thanks

Stan
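For what it's worth, the closest thing I have found to a programmatic hook (sharing in case it is useful; not a confirmed answer) is the enhanced command buffer error reporting added in iOS 14 / macOS 11: create the command buffer from a descriptor with errorOptions set, then inspect the error once the buffer completes. A sketch, assuming an existing commandQueue:

```swift
import Metal

// Opt in to encoder execution status so a GPU fault reports which
// encoders completed, were affected, or faulted, instead of just an
// opaque "IOAF code 11".
let descriptor = MTLCommandBufferDescriptor()
descriptor.errorOptions = .encoderExecutionStatus
let commandBuffer = commandQueue.makeCommandBuffer(descriptor: descriptor)!

commandBuffer.addCompletedHandler { buffer in
    guard let error = buffer.error as NSError? else { return }
    print("Command buffer failed:", error.localizedDescription)
    if let infos = error.userInfo[MTLCommandBufferEncoderInfoErrorKey]
        as? [MTLCommandBufferEncoderInfo] {
        for info in infos {
            // Encoder label, execution state, and any debug signposts hit.
            print(info.label, info.errorState.rawValue, info.debugSignposts)
        }
    }
}
```

Enabling Metal Shader Validation in the scheme's Diagnostics tab should also surface file-and-line information for this kind of fault.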
Post not yet marked as solved
0 Replies
427 Views
I have a consistent problem with an app I am developing, and I have no idea what is going on or how to fix it.

The short version is that I am unable to instantiate a custom AUv3 I made. To make matters stranger, when I compare this behavior to Apple's AUv3Host demo application, the AVAudioUnitComponentManager instance in that application is unable to find any of the third-party (non-Apple) AUv3s installed on my iPad, including the custom one I made. This issue persists across restarts of my app.

Once it starts, the only way I am able to fix either of these problems is to shut my app and Apple's demo app down completely, and then open GarageBand and go through the motions as if I were adding a new audio track with an AUv3. Once I get to the page where all the third-party AUv3s are displayed, I can close GarageBand, restart debugging my app, and the AVAudioUnitComponentManager can find everything in the Apple demo app (including the custom AUv3 I am using).

Has anyone had a problem like this, and if so, how do you fix it to make sure third-party or custom AUv3s are always found?

This is how I instantiate my custom AUv3, but I don't think I am doing anything weird:

```swift
AVAudioUnitMIDIInstrument.instantiate(with: theAudioComponentDescription,
                                      options: auOptions) { avAudioUnit, error in
    guard error == nil, let theAVAudioUnit = avAudioUnit else {
        os_log(.error, log: self.logger, "Error instantiating SearSynth AU.")
        return
    }
    // ... rest of setup elided
}
```

Thanks

Stan
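For completeness, this is the kind of diagnostic I run to see what the component manager can actually see (a sketch; the pass-everything test is just there to dump the full list):

```swift
import AVFoundation

// List every audio unit component the system currently exposes, to check
// whether the custom AUv3 is registered at all.
let components = AVAudioUnitComponentManager.shared()
    .components(passingTest: { _, _ in true })
for component in components {
    print(component.manufacturerName, component.name, component.versionString)
}
```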
Post not yet marked as solved
1 Reply
611 Views
I am trying to use AVAudioEnvironmentNode with ARKit. I am not using ARSCNView, just ARKit with regular UIKit.

The problem is, when I set AVAudioEnvironmentNode.listenerPosition to match the ARFrame camera's forward, up, and position using the ARSessionDelegate method session(_ session: ARSession, didUpdate frame: ARFrame), the AVAudioEnvironmentNode does not seem to hold its position. Audibly, it sounds like it is jumping between the position I try to set and the zero origin, with the left and right channels giving out sporadically.

If I print out listenerVectorOrientation and listenerPosition after I set them, the vectors for the orientation pretty much match the camera (good), but the listenerPosition never wavers from Vector(0,0,0) (bad).

Does anyone know what is going on here, or at least how to set AVAudioEnvironmentNode.listenerPosition so that it sticks? Put plainly: how do I stop this problem?

Thanks

Stan
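For reference, this is the shape of the update I am describing, condensed (the column-to-vector reading of the camera transform is my own, so treat the sign conventions as assumptions):

```swift
import ARKit
import AVFoundation

final class ListenerUpdater: NSObject, ARSessionDelegate {
    let environmentNode = AVAudioEnvironmentNode()

    func session(_ session: ARSession, didUpdate frame: ARFrame) {
        let transform = frame.camera.transform
        let forward = -transform.columns.2   // camera looks down -Z
        let up = transform.columns.1
        let position = transform.columns.3

        environmentNode.listenerVectorOrientation = AVAudio3DVectorOrientation(
            forward: AVAudio3DVector(x: forward.x, y: forward.y, z: forward.z),
            up: AVAudio3DVector(x: up.x, y: up.y, z: up.z))
        environmentNode.listenerPosition = AVAudio3DPoint(
            x: position.x, y: position.y, z: position.z)
    }
}
```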
Post not yet marked as solved
0 Replies
486 Views
Does anyone know how one adopts the AVAudioMixing protocol when building a custom AVAudioUnit node? Is there documentation, or better yet an example showing what needs to be done?

I am a little confused, because the documentation reads as if a custom node has to adopt it so that an AVAudioMixerNode or AVAudioEnvironmentNode downstream can respond to it. But I don't understand what you fill the AVAudioMixing protocol stubs in with. Are you telling your custom node to do something, or are you telling the AVAudioMixerNode and AVAudioEnvironmentNodes downstream to do something (which makes more sense)? And being a protocol, that implies a delegate is being used by the AVAudioMixerNode and AVAudioEnvironmentNodes, and I don't get how one obtains a reference to that.

Thanks in advance.

Staque
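To make the question concrete: the only working model I have is AVAudioPlayerNode, which already conforms to AVAudioMixing. There, the properties are set on the source node, and the downstream mixer or environment node is what actually applies them, roughly like this (my sketch, not Apple sample code):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()            // conforms to AVAudioMixing
let environment = AVAudioEnvironmentNode()

engine.attach(player)
engine.attach(environment)

// Spatialization needs a mono connection into the environment node.
let monoFormat = AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
engine.connect(player, to: environment, format: monoFormat)
engine.connect(environment, to: engine.mainMixerNode, format: nil)

// AVAudioMixing properties are set on the source node; the downstream
// environment node reads and renders them.
player.position = AVAudio3DPoint(x: 2, y: 0, z: -1)   // from AVAudio3DMixing
player.volume = 0.8                                   // from AVAudioMixing
// (engine start and buffer scheduling omitted)
```

So my question is whether a custom node is supposed to provide this same source-side surface, and if so, what the conformance is expected to do internally.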