I will be filling audio and video buffers with randomly distributed data for each frame in real time. Initialising these arrays with Floats inside a basic for loop seems naive. Are there any optimised methods for this task in the iOS libraries? I was looking for a data-science-oriented framework from Apple and did not find one, but maybe Accelerate, Metal, or Core ML are good candidates to research? Is my thinking correct, and if so, can you guide me?
Optimising initialisation of big arrays with random data
You can use the BNNS library (part of Accelerate) for this. BNNSRandomFillNormalFloat fills a buffer with normally distributed random numbers, and BNNSRandomFillUniformFloat fills it with uniformly distributed ones.
The following code is an example of generating normally distributed random numbers:
import Accelerate

func randomFloats(n: Int,
                  mean: Float,
                  standardDeviation: Float) -> [Float] {
    // Create the array without zero-initialising it first;
    // BNNS writes directly into the uninitialised buffer.
    let result = [Float](unsafeUninitializedCapacity: n) {
        buffer, initializedCount in
        guard
            var arrayDescriptor = BNNSNDArrayDescriptor(
                data: buffer,
                shape: .vector(n)),
            let randomNumberGenerator = BNNSCreateRandomGenerator(
                BNNSRandomGeneratorMethodAES_CTR,
                nil) else {
            fatalError()
        }
        // Fill the whole buffer in one call.
        BNNSRandomFillNormalFloat(
            randomNumberGenerator,
            &arrayDescriptor,
            mean,
            standardDeviation)
        // Tell the array initializer how many elements are now valid.
        initializedCount = n
        BNNSDestroyRandomGenerator(randomNumberGenerator)
    }
    return result
}
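For uniformly distributed data the same pattern applies with BNNSRandomFillUniformFloat. A minimal sketch, assuming that function takes the lower and upper bounds of the range as its last two parameters (the function name and parameter names here follow the normal-distribution example above; check the Accelerate headers for the exact signature):

```swift
import Accelerate

// Sketch: fills an array with uniformly distributed Floats in [a, b).
// Assumes BNNSRandomFillUniformFloat(generator, &descriptor, a, b).
func randomUniformFloats(n: Int, a: Float, b: Float) -> [Float] {
    let result = [Float](unsafeUninitializedCapacity: n) {
        buffer, initializedCount in
        guard
            var arrayDescriptor = BNNSNDArrayDescriptor(
                data: buffer,
                shape: .vector(n)),
            let generator = BNNSCreateRandomGenerator(
                BNNSRandomGeneratorMethodAES_CTR,
                nil) else {
            fatalError()
        }
        BNNSRandomFillUniformFloat(generator, &arrayDescriptor, a, b)
        initializedCount = n
        BNNSDestroyRandomGenerator(generator)
    }
    return result
}
```

For real-time use, consider creating the generator once and reusing it across frames rather than creating and destroying it on every call.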