Help with showing a spectrogram of an audio file

Hello everyone!

I am trying to create a spectrogram like in the attached image for a macOs App. I am using Cocoa/AppKit but also have some SwiftUI views, so I can use either. I have found the sample app Visualizing Sound as an Audio Spectogram that apple provides but I do not want a real-time spectrogram. I want a spectrogram of the whole audio file. I have been trying to convert the sample app to what I need but I have been unsuccessful so far. Here is how I changed the code in the delegate

	public func captureBuffer() {

		let asset = AVAsset(url: audioFileUrl)

		let reader = try! AVAssetReader(asset: asset)

		let track = asset.tracks(withMediaType: AVMediaType.audio)[0]
			let settings = [

				AVFormatIDKey : kAudioFormatLinearPCM

			]

			let readerOutput = AVAssetReaderTrackOutput(track: track, outputSettings: settings)

			reader.add(readerOutput)

			reader.startReading()
		while let buffer = readerOutput.copyNextSampleBuffer() {

		   var audioBufferList = AudioBufferList(mNumberBuffers: 1, mBuffers: AudioBuffer(mNumberChannels: 0, mDataByteSize: 0, mData: nil))

		   var blockBuffer: CMBlockBuffer?
		   CMSampleBufferGetAudioBufferListWithRetainedBlockBuffer(

			buffer,

			bufferListSizeNeededOut: nil,

			bufferListOut: &audioBufferList,

			bufferListSize:  MemoryLayout<AudioBufferList>.size,

			blockBufferAllocator: nil,

			blockBufferMemoryAllocator: nil,

			flags: kCMSampleBufferFlag_AudioBufferList_Assure16ByteAlignment,

			blockBufferOut: &blockBuffer

		   );
			let buffers = UnsafeBufferPointer<AudioBuffer>(start: &audioBufferList.mBuffers, count: Int(audioBufferList.mNumberBuffers))

			for buffer in buffers {
				let samplesCount = Int(buffer.mDataByteSize) / MemoryLayout<Int16>.size
				let samplesPointer = audioBufferList.mBuffers.mData!.bindMemory(to: Int16.self, capacity: samplesCount)
				let samples = UnsafeMutableBufferPointer<Int16>(start: samplesPointer, count: samplesCount)

			}
			guard let data = audioBufferList.mBuffers.mData else {

				return
			}

			/// The _Nyquist frequency_ is the highest frequency that a sampled system can properly

			/// reproduce and is half the sampling rate of such a system. Although  this app doesn't use

			/// `nyquistFrequency` you may find this code useful to add an overlay to the user interface.

			if nyquistFrequency == nil {

				let duration = Float(CMSampleBufferGetDuration(buffer).value)

				let timescale = Float(CMSampleBufferGetDuration(buffer).timescale)

				let numsamples = Float(CMSampleBufferGetNumSamples(buffer))

				nyquistFrequency = 0.5 / (duration / timescale / numsamples)

			}

			if self.rawAudioData.count < AudioSpectrogram.sampleCount * 2 {

				let actualSampleCount = CMSampleBufferGetNumSamples(buffer)

				let ptr = data.bindMemory(to: Int16.self, capacity: actualSampleCount)

				let buf = UnsafeBufferPointer(start: ptr, count: actualSampleCount)

				rawAudioData.append(contentsOf: Array(buf))

			}

			while self.rawAudioData.count >= AudioSpectrogram.sampleCount {

				let dataToProcess = Array(self.rawAudioData[0 ..< AudioSpectrogram.sampleCount])

				self.rawAudioData.removeFirst(AudioSpectrogram.hopCount)

				self.processData(values: dataToProcess)

			}
			createAudioSpectrogram()

		}

	}
}


I am sure there are different or better ways to go about this, but the only examples I can find are on iOS and use UIKit, but I am building for MacOs. Does anyone know how display a spectrogram for an audio file without having to play the audio file? I dont mind using sox or ffmpeg if that is easier.

Greatly appreciated!

Can you say more about what is going wrong with your current attempt?

@Media Engineer, Sorry for the late reply. I have been trying to upload an image showing my results but it won't let me. The image that I get is mostly yellow and orange dots. I don't really know why this is happening or what steps I should take to try and fix it, but I am sure my code incorrect. Have you been able to successfully create a spectrogram from an audio file before using Swift? Just curious. Thanks!

Help with showing a spectrogram of an audio file
 
 
Q