Replies

Reply to ARKit stops providing Audio
So I narrowed it down to the fact that I'm appending the audio sample buffers to an array. It seems that holding on to the samples for too long is what causes this. But I need to write the audio samples to an audio file, so how can I circumvent that? I tried implementing an audio-file writer with AVFoundation, but even that causes the audio samples to stop coming after a while (after the 125 samples that I save)... it's honestly super weird...
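For anyone hitting the same wall: capture pipelines typically vend sample buffers from a small fixed pool and only recycle a buffer once you release it, so retaining every buffer eventually exhausts the pool and delivery stops. Here is a purely illustrative, hypothetical Swift sketch of that failure mode (the pool, its size, and all names are made up for the illustration; this is not ARKit or CoreMedia API):

```swift
// Illustrative only: a tiny fixed pool of reusable "buffers", loosely
// mimicking how a capture pipeline recycles its sample buffers.
final class BufferPool {
    private var available: Int
    init(size: Int) { available = size }

    // Hands out a buffer token, or nil once the pool is exhausted.
    func vend() -> Int? {
        guard available > 0 else { return nil }
        available -= 1
        return available
    }

    // Releasing a buffer makes it reusable again.
    func recycle() { available += 1 }
}

let pool = BufferPool(size: 3)

// Holding on to every buffer (like appending to an array) stalls delivery:
var retained: [Int] = []
while let buf = pool.vend() { retained.append(buf) }
print(retained.count)        // delivery stops after only 3 samples

// Releasing each buffer right after processing keeps samples flowing:
retained.removeAll()
pool.recycle(); pool.recycle(); pool.recycle()
var delivered = 0
for _ in 0..<100 {
    if pool.vend() != nil {  // take a buffer...
        delivered += 1
        pool.recycle()       // ...write it out and release it immediately
    }
}
print(delivered)             // all 100 samples arrive
```

This is why writing each sample to disk as it arrives (instead of stockpiling buffers) tends to fix the stall.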
May ’23
Reply to ARKit stops providing Audio
I finally fixed this by not appending the buffers to an array and writing the audio file later, but instead setting up an audio-writer class that takes samples in real time and writes them to a file. For anybody who is facing the same problem, here is my class (probably very poorly and inefficiently written, so beware xD):

```swift
import AVFoundation

public class audioWriter: ObservableObject {
    var assetWriter: AVAssetWriter
    var audioInput: AVAssetWriterInput

    init(assetWriter: AVAssetWriter, audioInput: AVAssetWriterInput) {
        self.assetWriter = assetWriter
        self.audioInput = audioInput
    }

    func startWriting(firstBuffer: CMSampleBuffer) {
        audioInput.expectsMediaDataInRealTime = true
        assetWriter.add(audioInput)
        assetWriter.startWriting()
        assetWriter.startSession(atSourceTime: CMSampleBufferGetPresentationTimeStamp(firstBuffer))
    }

    func addSample(sample: CMSampleBuffer) {
        audioInput.append(sample)
    }

    func finish() {
        audioInput.markAsFinished()
        assetWriter.finishWriting {}
    }
}
```

The class is then called like this:

```swift
func startAudio(url: URL) {
    let audioOutputSettings: [String: Any] = [
        AVFormatIDKey: kAudioFormatLinearPCM,
        AVSampleRateKey: 44100.0,
        AVNumberOfChannelsKey: 1,
        AVLinearPCMBitDepthKey: 16,
        AVLinearPCMIsBigEndianKey: false,
        AVLinearPCMIsFloatKey: false,
        AVLinearPCMIsNonInterleaved: false
    ]

    let assetWriter = try? AVAssetWriter(outputURL: url, fileType: AVFileType.wav)
    let audioInput = AVAssetWriterInput(mediaType: AVMediaType.audio, outputSettings: audioOutputSettings)

    recorder = audioWriter(assetWriter: assetWriter!, audioInput: audioInput)
    recorder?.startWriting(firstBuffer: buffer!)
    recording = true
}
```

The `recording = true` flag is checked in my `session` func from ARSessionObserver, which I am also using to call the startAudio function:

```swift
func session(_ session: ARSession, didOutputAudioSampleBuffer audioSampleBuffer: CMSampleBuffer) {
    buffer = audioSampleBuffer
    if recording {
        print("Audio Sample: \(count)")
        count += 1
        let newBuffer = try? CMSampleBuffer(copying: buffer!)
        recorder?.addSample(sample: newBuffer!)
    }
}
```

I am not sure if I need the detour of copying the buffer before writing it to my audio file with `let newBuffer = try? CMSampleBuffer(copying: buffer!)`; if anybody knows more, please do tell : ) This has worked for me, but if anyone comes up with a more efficient way, feel free to share :D
May ’23
Reply to View like the camera app
Hello, I've been trying to find a solution for this as well and so far had only found UIKit implementations that sadly didn't work when applied to SwiftUI. But I managed to put together a solution that seems a bit more SwiftUI-friendly; you can check it out here. Hope this helps : )
Oct ’23