Posts

Post not yet marked as solved
0 Replies
283 Views
When I use a websocket to get data from the network and call AVAudioEngine's scheduleBuffer every time data arrives, CPU usage and memory grow rapidly during continuous, fast scheduleBuffer calls. Will the memory be automatically released after the AVAudioEngine plays the AVAudioPCMBuffer? And how can I reduce CPU usage? A screenshot of some of the code and debugging data is at this link: https://github.com/J844259756/picture
Post not yet marked as solved
1 Reply
368 Views
I use a websocket to get data from the network, and each time data arrives I pass it to scheduleBuffer. However, with fast, continuous scheduleBuffer calls, CPU utilization and memory footprint keep increasing. How can I reduce my CPU usage? The following is part of my code and a screenshot of the measurements.
Post not yet marked as solved
5 Replies
2.2k Views
I want to feed the binary PCM data obtained from the network into the AVAudioEngine each time it arrives, but when I receive the data in the background and call scheduleBuffer, CPU consumption is high and there is noise. My code is as follows; how can I optimize it?

- (void)viewDidLoad {
    [super viewDidLoad];
    [self initWebSocket:server_ip];
    self.engine = [[AVAudioEngine alloc] init];
    self.playerNode = [[AVAudioPlayerNode alloc] init];
    [self.engine attachNode:self.playerNode];
    self.format = [[AVAudioFormat alloc] initWithCommonFormat:AVAudioPCMFormatFloat32
                                                   sampleRate:48000.0
                                                     channels:1
                                                  interleaved:NO];
    AVAudioMixerNode *mainMixer = [self.engine mainMixerNode];
    [self.engine connect:self.playerNode to:mainMixer format:self.format];
    if (!self.engine.isRunning) {
        [self.engine prepare];
        NSError *error;
        BOOL success = [self.engine startAndReturnError:&error];
        NSAssert(success, @"couldn't start engine, %@", [error localizedDescription]);
    }
    [self.playerNode play];
}

- (void)webSocket:(SRWebSocket *)webSocket didReceiveMessage:(id)message {
    NSData *data = message;
    AVAudioPCMBuffer *pcmBuffer = [[AVAudioPCMBuffer alloc]
        initWithPCMFormat:self.format
            frameCapacity:(uint32_t)data.length / self.format.streamDescription->mBytesPerFrame];
    pcmBuffer.frameLength = pcmBuffer.frameCapacity;
    [data getBytes:*pcmBuffer.floatChannelData length:data.length];
    [self.playerNode scheduleBuffer:pcmBuffer completionHandler:nil];
}