Hello,
I’m seeking some clarity regarding where application code and static libraries are stored in memory.
I understand the basic memory layout in terms of the code (text) segment, data segment, heap, and stack:
• Code Segment (Text Segment): Typically stores the compiled program code.
• Data Segment: Stores global and static variables.
• Heap: Dynamically allocated memory during runtime.
• Stack: Stores local variables and function call information.
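As a rough illustration of that mental model, here is a small probe sketch (my own illustration, not from any documentation; exact addresses vary between runs because of ASLR, and withUnsafePointer may report a temporary rather than the variable's true location):

    import Darwin
    import Foundation

    var globalCounter = 0  // global variable: data segment

    func probeMemoryLayout() {
        let local = 42                // local variable: stack
        let heapObject = NSObject()   // object storage: heap

        withUnsafePointer(to: &globalCounter) { print("data segment: \($0)") }
        withUnsafePointer(to: local) { print("stack (approximate): \($0)") }
        print("heap: \(Unmanaged.passUnretained(heapObject).toOpaque())")

        // Machine code lives in the text (code) segment; dlsym gives the
        // address of a compiled C function as a rough reference point.
        if let handle = dlopen(nil, RTLD_NOW), let printfAddress = dlsym(handle, "printf") {
            print("text segment (printf): \(printfAddress)")
        }
    }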
However, I’ve come across some conflicting information:
1. Official Documentation: In an illustration from Apple's documentation archive, it appeared as though application code might be stored in the heap. This seemed unusual, given my understanding that compiled code is generally stored in the code segment.
2. Blog Posts: Several blogs mention that the source code for static libraries is stored in the heap. This also contradicts my understanding, since static libraries, after being linked, should be part of the application's executable code and thus reside in the code segment.
Given these points, my understanding is that:
• Application Code: After compilation, the executable code should be stored in the code segment.
• Static Libraries: Once linked, the code from static libraries should also be part of the code segment.
Could you please clarify:
• Where exactly is the application code stored in memory?
• Is the claim that static libraries’ source code is stored in the heap correct, or is it a misunderstanding?
Thank you!
I have data that needs to be transmitted over UDP with NWConnection.
The data is larger than the maximum datagram size UDP provides, so I need to split it before sending.
I tried searching for information about splitting UDP data, but couldn't find an appropriate solution to my problem.
Which keywords should I search for to get started?
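To illustrate, this is roughly the kind of splitting I have in mind (what I believe is sometimes called application-level fragmentation). A sketch only: sendInChunks is a made-up helper name, and it assumes an already-established UDP NWConnection.

    import Foundation
    import Network

    // Rough sketch: slice `payload` into datagram-sized chunks and send
    // each one separately over an established UDP NWConnection.
    func sendInChunks(_ payload: Data, over connection: NWConnection) {
        let chunkSize = connection.maximumDatagramSize
        var offset = 0
        while offset < payload.count {
            let end = min(offset + chunkSize, payload.count)
            let chunk = payload.subdata(in: offset..<end)
            connection.send(content: chunk, completion: .contentProcessed { error in
                if let error = error {
                    print("Send error: \(error)")
                }
            })
            offset = end
        }
    }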
Hi, I'm trying to send audio data via UDP.
I am using Network.framework for networking, so to use the send method on NWConnection, the outgoing data must be of type Data or conform to DataProtocol.
To satisfy those conditions, I have implemented a method that converts an AVAudioPCMBuffer into Data.
func makeDataFromPCMBuffer(buffer: AVAudioPCMBuffer, time: AVAudioTime) -> Data {
    // Copy the bytes of the first audio buffer into a Data value.
    // (This assumes interleaved audio, i.e. a single underlying buffer.)
    let audioBuffer = buffer.audioBufferList.pointee.mBuffers
    return Data(bytes: audioBuffer.mData!, count: Int(audioBuffer.mDataByteSize))
}
The implementation above is adapted from this post.
The problem is that the converted data is too big to fit in a UDP datagram, and the error below occurs when I try to send it.
I have found that the initial size of the buffer is already too big to fit within maximumDatagramSize.
Below is the code that sets up the buffer.
let tapNode: AVAudioNode = mixerNode
let format = tapNode.outputFormat(forBus: 0)
tapNode.installTap(onBus: 0, bufferSize: 4096, format: format, block: { (buffer, time) in
    // The size of buffer (an AVAudioPCMBuffer) is already 19200 here.
    let bufferData = self.makeDataFromPCMBuffer(buffer: buffer, time: time)
    sharedConnection?.sendRecordedBuffer(buffer: bufferData)
})
I need to reduce the size of the AVAudioPCMBuffer so that it fits in a UDP datagram, but I can't find the right way to do it.
What would be the best way to make the data fit in a datagram?
I thought of dividing the data in half, but this is UDP, so I'm not sure how to handle the pieces when one of them is lost.
So instead I'm trying to make the AVAudioPCMBuffer itself fit in a datagram.
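One direction I've been considering, building on the tap code above, is to request a smaller tap buffer computed from the format's bytes per frame. A rough sketch: 1024 is a placeholder for the connection's actual maximumDatagramSize, and as far as I can tell installTap treats bufferSize only as a request, so the delivered frameLength may still differ.

    // Rough sketch: request a tap buffer small enough that one buffer
    // fits in one datagram.
    let bytesPerFrame = Int(format.streamDescription.pointee.mBytesPerFrame)
    let maxDatagramBytes = 1024 // placeholder for the real maximumDatagramSize
    let framesPerDatagram = AVAudioFrameCount(maxDatagramBytes / bytesPerFrame)

    tapNode.installTap(onBus: 0, bufferSize: framesPerDatagram, format: format, block: { (buffer, time) in
        // bufferSize is only a request; buffer.frameLength may differ.
        let bufferData = self.makeDataFromPCMBuffer(buffer: buffer, time: time)
        sharedConnection?.sendRecordedBuffer(buffer: bufferData)
    })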
Any help would be very appreciated!
I am trying to send and receive audio data (saved as a .caf file on the device) using Network.framework.
This is my plan:
1. Convert the file into Data using the code below:
guard let data = try? Data(contentsOf: recordedDocumentURL) else {
    print("recorded file to data conversion failed in touchUpCallButton Method")
    return
}
2. Send the Data using NWConnection.send over UDP.
3. Receive the Data using NWConnection.receiveMessage.
4. Convert the received Data into an AVAudioFile and play it with AVAudioEngine.
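For step 4, what I have in mind is roughly the following sketch: write the reassembled Data back to a temporary .caf file, then play it. (In real code the engine and player would need to be kept alive as properties; they are locals here only for brevity.)

    import AVFoundation

    // Sketch for step 4: write the reassembled Data to a temporary file,
    // then play it with AVAudioEngine.
    func playReceivedAudio(_ received: Data) throws {
        let url = FileManager.default.temporaryDirectory
            .appendingPathComponent("received.caf")
        try received.write(to: url)

        let file = try AVAudioFile(forReading: url)
        let engine = AVAudioEngine()
        let player = AVAudioPlayerNode()
        engine.attach(player)
        engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)
        try engine.start()
        player.scheduleFile(file, at: nil)
        player.play()
    }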
Right now my problem is that the data converted from the audio file is too big to fit within maximumDatagramSize.
So, as I understand it, I need to split the Data into many small pieces and send them one by one.
But in that case, I need to collect the received pieces to reconstruct the complete audio file, so that the receiving device can play it.
And... I'm stuck at this step.
I can't find the right way to divide Data into small pieces to send as UDP datagrams.
What I have in mind is to use the 'subdata(in: Range<Data.Index>) -> Data' method to divide the data and the 'append(Data)' method to sum it back up, as sketched below.
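Concretely, this is roughly what I mean with subdata and append. A rough sketch only: the 4-byte big-endian sequence-number header is my own invention so the receiver can notice lost or reordered datagrams, and a real protocol would presumably also need to transmit the total chunk count somehow.

    import Foundation

    // Sender side: prefix each chunk with a 4-byte big-endian sequence number.
    func makeChunks(from payload: Data, chunkSize: Int) -> [Data] {
        var chunks: [Data] = []
        var sequence: UInt32 = 0
        var offset = 0
        while offset < payload.count {
            let end = min(offset + chunkSize, payload.count)
            var chunk = withUnsafeBytes(of: sequence.bigEndian) { Data($0) }
            chunk.append(payload.subdata(in: offset..<end))
            chunks.append(chunk)
            sequence += 1
            offset = end
        }
        return chunks
    }

    // Receiver side: strip the header, store each piece, and append them
    // back together in sequence order once everything has arrived.
    var receivedPieces: [UInt32: Data] = [:]

    func store(datagram: Data) {
        let sequence = datagram.prefix(4).reduce(UInt32(0)) { ($0 << 8) | UInt32($1) }
        receivedPieces[sequence] = datagram.dropFirst(4)
    }

    func reassemble(totalChunks: UInt32) -> Data? {
        var result = Data()
        for i in 0..<totalChunks {
            guard let piece = receivedPieces[i] else { return nil } // a datagram was lost
            result.append(piece)
        }
        return result
    }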
Is this the right approach to solving my problem?
Even a little advice would be very appreciated!
Hi, I'm trying to build a walkie-talkie app using Swift.
My idea is:
1. Record the user's voice with AVAudioEngine on device1.
2. Convert the recorded file into Data.
3. Send the data from device1 to device2 using NWConnection.send.
4. Receive the data using NWConnection.receiveMessage.
5. Play the received data on device2.
I am implementing this app using the P2P option in Network.framework, so each device has both a browser and a listener.
Each device has to keep receiving incoming data while also sending recorded voice.
At first I thought that once the receiveMessage method was executed, it would wait for the other device's send method to deliver data, and then receive it.
But while debugging, the program didn't stop at the receiveMessage method; it just went through and executed the next line.
I must be missing something, but I'm not sure what it is.
Below is the send and receive part of the code I tried.
func sendRecordedAudio(data: Data) {
    guard let connection = connection else {
        print("connection optional unwrap failed: sendRecordedAudio")
        return
    }
    connection.send(content: data, completion: .contentProcessed({ (error) in
        if let error = error {
            print("Send error: \(error)")
        }
    }))
}
func receiveRecordedAudio() {
    guard let connection = connection else {
        print("connection optional unwrap failed: receiveRecordedAudio")
        return
    }
    connection.receiveMessage { (data, context, isComplete, error) in
        if let error = error {
            print("\(error) occurred in receiveRecordedAudio")
        }
        if let data = data {
            self.delegate?.receivedAudio(data: data)
        }
    }
}
The app calls sendRecordedAudio when audio recording ends, and calls receiveRecordedAudio when the user presses the receive button.
Any help would be greatly appreciated!
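From reading the documentation, receiveMessage appears to be asynchronous: it registers a completion handler and returns immediately rather than blocking, which would explain why the debugger ran straight past it. Below is a sketch of the receive-loop pattern I'm now trying, re-arming the receive from inside the handler (same `connection` and `delegate` as above):

    // Receive-loop sketch: because receiveMessage returns immediately,
    // re-arm it from inside its completion handler to keep listening.
    func startReceiveLoop() {
        guard let connection = connection else { return }
        connection.receiveMessage { [weak self] (data, context, isComplete, error) in
            if let error = error {
                print("\(error) occurred in receive loop")
                return // stop looping on error
            }
            if let data = data {
                self?.delegate?.receivedAudio(data: data)
            }
            self?.startReceiveLoop() // schedule the next receive
        }
    }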