Posts

Post not yet marked as solved
5 Replies
794 Views
Not too hopeful that anyone can explain this, but here it goes. I have some C code being used from an iOS app in Swift. Logs in the C code are passed through a callback to Swift and put on a serial queue using:

Log.serialQueue.async {}

So a C function could look like:

int do_some_c_stuff(void) {
    log("Do some logging");
}

And in Swift we have something like this to process the log that came through the callback:

class func log(_ message: String, logInfo: LogInfo = appLogInfo, type: OSLogType = .default) {
    Log.serialQueue.async {
        os.os_log("%@", log: logInfo.log, type: type, message)
    }
}

This works perfectly in all cases except one (Intel iPhone simulator only).

Now, some C functions allocate a statically sized buffer on the stack to parse incoming messages, like this:

int do_some_c_stuff(void) {
    log("Do some logging");
    char buf[100000];
}

And here is the interesting part: if this buffer exceeds exactly 249440 bytes, any call to Log.serialQueue.async in the Swift layer gets an EXC_BAD_ACCESS (code=2), but only when running on the Intel simulator. Running on device or on the M1 simulator works just fine. So on the Intel simulator this will crash when calling Log.serialQueue.async:

int do_some_c_stuff(void) {
    log("Do some logging"); // This will trigger the callback inside log, which ends up in the Swift layer.
    char buf[249441]; // buffer exceeds 249440 bytes
}

Also note that it is the mere presence of this allocation that causes issues on Intel; returning before the allocation does not help. If the allocation is present in the C function, the call to Log.serialQueue.async crashes. Further, it is not the logging in the Swift layer that causes the problem; simply calling Log.serialQueue.async with an empty closure crashes. So the example below still crashes on Intel when hitting serialQueue.async, which is why I assume the large memory chunk is reserved when the function's stack frame is set up, not when the buf variable is actually reached:

int do_some_c_stuff(void) {
    log("Do some logging");
    return 0;
    char buf[249441];
}

It only happens in the Intel simulator and only in Debug mode. It is 100% reproducible in various places in the codebase, all of them using C functions that declare a local buffer larger than 249440 bytes. I do not have a minimal example at this time, hoping that someone might have an idea of why it happens, but if someone is interested, maybe I can whip something up. In general, just having the C function allocate this large block on the stack and, from the same function, calling back into Swift and using dispatch async should do the trick.

Is there some sort of memory swapping, paging etc. that would cause problems in a scenario like this, mixing C and DispatchQueue (on Intel only)? Since the solution is to reduce the stack allocation or use heap memory, this is not critical. However, if anyone knows why this is happening on Intel CPUs it would be super interesting to know.
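For completeness, this is the shape of the heap-based workaround mentioned at the end, using the same example function (sketch only; error handling and the actual parsing are omitted):

#include <stdlib.h>

int do_some_c_stuff(void) {
    log("Do some logging");

    /* Allocate the parse buffer on the heap instead of the stack,
       so the stack frame of this function stays small. */
    char *buf = malloc(249441);
    if (buf == NULL) {
        return -1;
    }

    /* ... parse the incoming message into buf ... */

    free(buf);
    return 0;
}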
Posted Last updated
.
Post not yet marked as solved
0 Replies
818 Views
Hi,

In iOS 14.7.1 (testing on an iPhone 6S), doing this:

AVAudioSession.sharedInstance().setCategory(.playAndRecord,
                                            mode: AVAudioSession.Mode.default,
                                            options: [.mixWithOthers, .defaultToSpeaker])

outputs the audio on the speaker even if I have a Bluetooth headset connected. In iOS 15.6 (testing on an iPhone 11), audio is always output to Bluetooth, even if .allowBluetooth is not set in setCategory.

Is this an intended change in iOS 15 or a bug? I have tried overrideOutputAudioPort(.speaker) as well in iOS 15, but it does not make a difference; audio is always output over Bluetooth when connected.

Thanks in advance
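For reference, this is roughly how the resulting route can be inspected (sketch only, assuming the session setup from the post; dumpAudioRoute is just an illustrative helper, not existing code):

import AVFoundation

func dumpAudioRoute() {
    let session = AVAudioSession.sharedInstance()
    do {
        // Same category and options as above; note that .allowBluetooth is not set.
        try session.setCategory(.playAndRecord,
                                mode: .default,
                                options: [.mixWithOthers, .defaultToSpeaker])
        try session.setActive(true)
    } catch {
        print("AVAudioSession setup failed: \(error)")
        return
    }

    // Print where audio is actually routed, to compare the iOS 14 and iOS 15 behaviour.
    for output in session.currentRoute.outputs {
        print("Output: \(output.portType.rawValue) (\(output.portName))")
    }
}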
Posted Last updated
.
Post marked as solved
2 Replies
1.9k Views
Hi, I have found multiple questions on this topic but none of the proposals solves my problem, so please stay with me on the description below, since I am obviously missing something.

I have an iOS framework that links with multiple static libraries from a C open-source project using "Other Linker Flags" and "Library Search Paths". These .a files are all fat binaries including arm64 and x86_64 slices. This framework is later used by a regular iOS app after being packed into an XCFramework (do note, when generating the XCFramework I have used EXCLUDED_ARCHS="arm64" for the simulator build for a long time, and I think this is also wrong).

Everything works fine on device, but I have never gotten the simulator to work on my M1 Mac, and now I am trying to fix that, without success. When trying to build for the simulator on M1 I get this error for the first fat .a found:

building for iOS Simulator, but linking in object file built for iOS, for architecture arm64

I do not want to exclude the arm64 arch for the simulator as suggested in some posts, since I want to be able to run the app in arm64 mode on the simulator. Also, when starting the simulator on my M1 (not using Rosetta) I am expecting it to run in arm64 mode, so then it makes no sense that it complains that the static lib is an arm64 lib, right?

I have tried changing "Build Active Architecture Only" and other suggestions, but nothing has helped. I also checked the fat binary used from the framework with otool, as suggested by @eskimo in another post, and it spat out this:

Load command 1
      cmd LC_BUILD_VERSION
  cmdsize 24
 platform 2
    minos 13.0
      sdk 15.4
   ntools 0

which I think is correct, PLATFORM_IOS.

I assume the arm64 .a file that runs in the Xcode simulator is the same as the arm64 .a that runs on device, right? Or do I have to build arm64 for the Mac platform to run the iOS simulator on M1?

Very grateful for any ideas on what might be wrong.
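For context, this is roughly what a packaging step with separate device and simulator builds of one of the static libraries looks like (sketch only; the library name and build paths are made up):

# Sketch only -- libfoo and the build paths are illustrative.
# The two inputs come from two separate builds of the C project, since a
# device arm64 slice and a simulator arm64 slice cannot live in the same fat .a.
xcodebuild -create-xcframework \
    -library build/iphoneos/libfoo.a \
    -library build/iphonesimulator/libfoo.a \
    -output libfoo.xcframework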
Posted Last updated
.
Post marked as solved
2 Replies
719 Views
Hi, I have a log-to-file function that, as part of the log line, inserts the current datetime, as below:

class func logToFile(message: String, logInfo: LogInfo = appLogInfo, type: OSLogType = .default) {
    let formatter = DateFormatter()
    formatter.dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"
    formatter.timeZone = TimeZone(secondsFromGMT: TimeZone.current.secondsFromGMT())
    formatter.locale = Locale(identifier: "en_US_POSIX")
    let localDate = formatter.string(from: Date())

    let logMessage = "\(localDate) [\(logInfo.category)] \(message)\n"

    let documentDirPath = NSSearchPathForDirectoriesInDomains(.documentDirectory, .userDomainMask, true)[0]
    let filePath = "\(documentDirPath)/\(logFileName)"

    if let fileHandle = FileHandle(forWritingAtPath: filePath), let data = logMessage.data(using: .utf8) {
        fileHandle.seekToEndOfFile()
        fileHandle.write(data)
    }
}

and it is always called from a serial queue:

Log.serialQueue.async {
    logToFile(message: message, logInfo: logInfo, type: type)
}

Lately, however, I am getting some crashes, and I managed to catch one in the debugger. It happens when deallocating the local DateFormatter on exit from the logToFile function. Xcode stops at the end of the function, and the thread indicates that it is inside NSDateFormatter dealloc; clicking on the icu::DateFormat destructor frame shows the assembly code. In the inspector we can see that localDate is half-baked: it only contains the year and day.

I have found some posts about DateFormatter not being thread safe, but that was way in the past, and in my case I only want to convert a Date() to a String. Something is not working as expected, and suggestions on how to improve it would be very welcome.

Thanks in advance
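One variation that avoids creating and destroying a DateFormatter on every call would be to build the formatter once and reuse it from the serial queue, since the queue already guarantees it is never touched concurrently. A rough sketch (fileDateFormatter and timestamp() are placeholder names, not existing code):

import Foundation

extension Log {
    // Placeholder name. Created once and reused; only ever accessed from
    // Log.serialQueue, so there is no concurrent access to the formatter.
    static let fileDateFormatter: DateFormatter = {
        let formatter = DateFormatter()
        formatter.dateFormat = "yyyy-MM-dd'T'HH:mm:ss.SSS"
        formatter.locale = Locale(identifier: "en_US_POSIX")
        formatter.timeZone = TimeZone.current   // local time, as in logToFile above
        return formatter
    }()

    // Placeholder helper: must only be called from Log.serialQueue.
    class func timestamp() -> String {
        return fileDateFormatter.string(from: Date())
    }
}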
Posted Last updated
.