What you're looking for is the calling convention for Objective-C on ARM64. While the ARM64 calling convention specifies that the first 8 arguments are passed in the first 8 registers (x0 through x7), there's also a notable difference between x86_64 and arm64:
On x86_64, the compiler treats fixed and variadic parameters the same, placing parameters in registers first and only using the stack when no more registers are available. On arm64, the compiler always places variadic parameters on the stack, regardless of whether registers are available. If you implement a function with fixed parameters, but redeclare it with variadic parameters, the mismatch causes unexpected behavior at runtime.
Source: Addressing Architectural Differences in Your macOS Code.
So if a function accepts a variable list of parameters (i.e. it's a variadic function), all its variadic arguments will be passed on the stack. An example of such a function is objc_msgSend.
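As a minimal C sketch (the selector and its signature are made up), the pattern Apple recommends is to cast objc_msgSend to a function pointer with the method's actual fixed-parameter signature before calling it, instead of calling it through a variadic declaration:

#include <objc/message.h>
#include <objc/runtime.h>

// Illustrative sketch: "areaWithWidth:height:" and its signature are
// made up. Method implementations always take fixed parameters, so
// objc_msgSend must be cast to the matching fixed-parameter type.
// Calling it as a variadic function would place the arguments on the
// stack on arm64 while the method reads them from registers.
typedef double (*AreaFn)(id self, SEL _cmd, double width, double height);

static double callArea(id shape, double w, double h) {
    AreaFn area = (AreaFn)objc_msgSend;
    return area(shape, sel_registerName("areaWithWidth:height:"), w, h);
}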
Just like other vulnerabilities in software used in macOS, this will most likely be fixed by updating to the newest version in the next major macOS 14 update.
Here's a recent example of other OpenSSH vulnerabilities being fixed.
[quote='758102021, saradinushi, /thread/758102, /profile/saradinushi']
their reply was to get advise reaching out to Apple support
[/quote]
Hello, this is the Apple Developer forums. You might want to check out the Apple Support website and the Apple Support Community forums.
I'm not sure how your company's data system works or how your iMessage thread appeared in a Console session over SSH, but iMessage is end-to-end encrypted.
[quote='758285021, amiokumarsarkar, /thread/758285, /profile/amiokumarsarkar']
Reported this issue through feedback assistant.
[/quote]
It's great that you did that. I think it's good practice to also post the feedback numbers here for reference.
You might want to check out the instructions here for recovery mode or contact Apple Support.
For recovery mode, at the last step, keep holding the power button until you see the recovery mode screen.
For DFU mode (which isn't documented in the link above), timing is really important, so it can take a few tries to get it right.
I've been charging it for the whole day with different cables, bricks, and outlets. I'm gonna leave it overnight just to be sure.
If you charged it for more than an hour already, it should be more than enough.
Nope, no signs of life at all.
There could also be a hardware issue, for example either a power issue or a screen issue. When you're following these steps, you should see the Apple logo on the screen at certain points - and if you don't, it's either a hardware issue or an important part of the boot process (iBoot) being broken. If it's the latter, DFU mode should still work.
Here are some detailed explanations for that, if you're interested. I'll use some technical terms here - we're on the DevForums after all 😁.
iBoot is the stage 2 bootloader that loads iOS / Recovery Mode on your iPhone. It can be modified and updated. I believe the Apple logo is shown during this stage.
The Boot ROM is the stage 1 bootloader that loads iBoot or DFU.
I think DFU can't break during updates, since it's essentially Boot ROM code (which updates don't touch) waiting for a restore over USB. So if you really cannot enter it at all, that's a clue that it could be a hardware issue.
Just a small note: "Recommended" replies are also affected (https://forums.developer.apple.com/forums/thread/757126).
I'm sorry to hear that. I hope you had a backup.
Does it show any signs of life at all?
It sounds like the phone might be low on battery, since nothing works. Just checking, did you try to plug it in for a while (for example 1-3 hours) and try again?
I can't enter recovery mode, DFU mode, reset, charge, nothing.
Keep in mind when the device is in DFU mode, nothing will be displayed on the screen. You'll need to connect it to a computer, and timing is important.
[quote='792483022, deverlof, /thread/757858?answerId=792483022#792483022, /profile/deverlof']
sudo cp modified_client.plist /var/db/locationd/clients.plist
sudo launchctl kickstart -k system/com.apple.locationd
[/quote]
Just wondering, I assume you had Full Disk Access enabled for Terminal? You shouldn't be able to do this unless Terminal has FDA. Without it, you'd get cp: /var/db/locationd/clients.plist: Operation not permitted.
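If you want to double-check, here's a tiny C sketch of the same idea (using the path from your commands): run it with sudo, and the open call should fail with "Operation not permitted" unless the invoking app (e.g. Terminal) has Full Disk Access:

#include <errno.h>
#include <stdio.h>
#include <string.h>

// Sketch: TCC protects this file beyond plain Unix permissions, so
// even root gets EPERM without Full Disk Access.
int main(void) {
    FILE *f = fopen("/var/db/locationd/clients.plist", "rb");
    if (!f) {
        printf("open failed: %s\n", strerror(errno));
        return 1;
    }
    printf("open succeeded - Full Disk Access is in effect\n");
    fclose(f);
    return 0;
}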
I'm fairly sure the enclave key never leaves the enclave and therefore can't be intercepted unless the enclave itself was somehow compromised, but I think a compromised enclave and its implications would mean Apple's whole security model is broken.
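For reference, here's a minimal C sketch of what "never leaves the enclave" looks like in practice (it assumes hardware with a Secure Enclave, a properly signed binary, and linking with -framework Security -framework CoreFoundation; error handling trimmed). You only ever get an opaque key reference, and exporting the raw key material is refused:

#include <Security/Security.h>
#include <stdio.h>

int main(void) {
    // Ask for an ephemeral P-256 key generated inside the Secure Enclave.
    int bits = 256;
    CFNumberRef bitsRef = CFNumberCreate(NULL, kCFNumberIntType, &bits);
    const void *keys[]   = { kSecAttrKeyType, kSecAttrKeySizeInBits, kSecAttrTokenID };
    const void *values[] = { kSecAttrKeyTypeECSECPrimeRandom, bitsRef, kSecAttrTokenIDSecureEnclave };
    CFDictionaryRef attrs = CFDictionaryCreate(NULL, keys, values, 3,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    CFErrorRef error = NULL;
    SecKeyRef privateKey = SecKeyCreateRandomKey(attrs, &error);
    if (privateKey) {
        // Expected to fail for enclave-backed keys: the raw private key
        // bytes cannot be read out; the key can only be used.
        CFDataRef raw = SecKeyCopyExternalRepresentation(privateKey, &error);
        printf("export %s\n", raw ? "succeeded (unexpected)" : "refused, as expected");
        if (raw) CFRelease(raw);
        CFRelease(privateKey);
    }
    CFRelease(attrs);
    CFRelease(bitsRef);
    return 0;
}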
Yeah that’s true, the key would be protected (I was talking about protecting process memory, not your idea of encrypting assets).
If you had kernel-level control over a jailbroken device, maybe you could leverage this and make the OS decrypt the process memory (if such APIs existed in this hypothetical OS).
The attackers not being able to get a key doesn't matter if everything the key is meant to protect can be captured via memory.
I was talking about encrypting the process memory itself (the feature you said Apple did not implement), but yeah, it totally depends on how you would implement such a feature.
Moreover, enclave keys should only be a portion of all types of keys, with many needing to stay in memory--are they all vulnerable?
I was talking about a hypothetical scenario. Depends on how you use the keys in the enclave.
I don't think Apple security engineers are fools; I think we might be missing something if decrypted memory capture really is as simple as it sounds.
Yeah, well, to gain the OS rights required to read other processes' memory, you need to bypass many security measures already implemented in the OS. It's not as simple as it sounds.
[quote='791423022, wmk, /thread/757255?answerId=791423022#791423022, /profile/wmk']
Even if I store it in the enclave? My current system creates a unique key for each user-device combination, and uses that to encrypt assets at rest. The enclave key is said to never leave the enclave, so I'm convinced that this is the best I can do within the confines of the framework and platform.
[/quote]
Oh, well, that could work, but, as you said:
If someone can pull assets from memory then no amount of clever encryption will help unless the memory itself is encrypted
That is also true.
but this is one of the few security features that Apple hasn't implemented.
We are talking here about jailbroken devices - i.e. devices whose security features have been bypassed. There are already many security features that need to be bypassed in order to gain such control, and adding such a feature might slow down your device or be useless depending on how it's implemented (one already has broad access on a jailbroken phone, and that might include access to this hypothetical process-memory decryption key). Maybe something can be done using the enclave, but remember that we're talking about process memory, and there are many processes on iOS.
the idea that apps can have a good first-mover's advantage
Yes, but also, even if it isn't the first one, there could be reasons why that app is better. The first one might not necessarily be the best one.
Is there a way to tell which ones will be?
I'm not sure, but I think most class names and public methods will be, because Swift and Objective-C include runtime type information. Sometimes private methods might appear as well.
You can disassemble your file and you'll see them.
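The reason those names survive is that the Objective-C runtime dispatches messages by name, so they have to be present in the binary. A quick C sketch (compile with cc demo.c -lobjc) that recovers them at runtime, the same way a disassembler lists them statically:

#include <objc/runtime.h>
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    // List the first few method names of NSObject straight from the runtime.
    unsigned int count = 0;
    Method *methods = class_copyMethodList(objc_getClass("NSObject"), &count);
    for (unsigned int i = 0; i < count && i < 10; i++)
        printf("%s\n", sel_getName(method_getName(methods[i])));
    free(methods);
    return 0;
}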
[quote='791368022, wmk, /thread/757255?answerId=791368022#791368022, /profile/wmk']
Sorry, I meant decrypted binary. Basically, I read up on how people obtain and tweak IPAs, and the process seems to be to:
[/quote]
Sorry, my bad, I got confused - I think IPAs are not encrypted when you download iOS apps on a Mac (at least it seems so, but I need to check that). But yes, they are encrypted on iOS until runtime. You are right about that.
I encrypt all assets at rest, but decrypt them to load into memory when needed. I think I'm protected by this process, but if there's a way to pull the assets from memory nothing I do would help. Is this assumption true?
This is obfuscation - one can reverse this mechanism and obtain your asset decryption key, depending on how it works. So I wouldn't say the assets are "protected", but rather "better hidden".
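To make the memory point concrete, here's a C sketch of the decrypt-at-load pattern you're describing (CommonCrypto, with a made-up key/IV, a zero-filled stand-in for the asset, and no padding so the sample runs). Once CCCrypt returns, the plaintext sits in ordinary process memory, which is exactly what can be dumped on a jailbroken device no matter how well the key was hidden:

#include <CommonCrypto/CommonCryptor.h>
#include <stdio.h>

int main(void) {
    uint8_t key[kCCKeySizeAES128] = "0123456789abcde"; // hypothetical key
    uint8_t iv[kCCBlockSizeAES128] = {0};              // hypothetical IV
    uint8_t encrypted[64] = {0};  // stands in for the asset read from disk
    uint8_t decrypted[64];
    size_t outLen = 0;
    CCCryptorStatus st = CCCrypt(kCCDecrypt, kCCAlgorithmAES, 0 /* no padding */,
                                 key, sizeof(key), iv,
                                 encrypted, sizeof(encrypted),
                                 decrypted, sizeof(decrypted), &outLen);
    if (st == kCCSuccess)
        printf("decrypted %zu bytes into plain process memory\n", outLen);
    return 0;
}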
One more thing: as you said, an idea is to make it easier to remake the mechanism from scratch rather than copy your app's code. But I think you should also focus on branding your product so that even if clones / alternatives appear, yours will still be the best one, the one that people choose. Again, I'm not experienced in publishing apps yet, but this is what makes sense to me. It might be easier said than done.
I don't have an answer to all your questions, and I'm not an expert, so take this with a grain of salt.
Symbolization of the app
I think you mean "desymbolication", which removes some symbols from your app (like variable names).
Native builds from Xcode destroys names of variable, functions, etc.
Yes, though some class / method names will still be visible.
Swift code is compiled in such a way that makes stealing harder than Objective-C
It's a bit harder to reverse engineer right now - decompilation usually produces slightly more complicated code compared to Obj-C. Though it's not impossible.
The fact that iOS 18 supports a checkm8 (i.e. jailbreakable) device means that decrypting the IPA from memory is still trivial.
You can also download some iOS apps on macOS from the Mac App Store now (developers can opt out of allowing this), which makes it easier to obtain the IPA without jailbreaking.
Also, the IPA is not encrypted on your iPhone, but using such methods was needed to get the IPA, as you couldn't download iOS apps on devices other than iPhones.
People talk about stealing authentication secrets via reverse-engineering, but is the same true for mechanisms (i.e. code)?
Authentication secrets are meant to be secret, and stealing them probably poses a greater danger than stealing mechanisms. Regarding code stealing: as long as the code runs on a device, I think it's correct to assume you can't completely hide it. I'm not sure how common code stealing is.
Can machine learning be leveraged to make decompilation/reverse engineering easier?
You can ask generative ML models what a portion of decompiled code might do, and they can produce some good results. I think they could be leveraged, and we might see new reverse engineering tools using them in the future, but we'll have to wait.
iMessage is encrypted with new advanced algorithms and FaceTime is end-to-end encrypted as well.
I believe SMS and regular phone calls are not "encrypted" since they are well-established protocols and adding such features would break compatibility.
You can encrypt emails using PGP. Apple actually uses it for reporting security vulnerabilities to them.
I am looking to provide additional security for my binary
Who do you want to protect the binary from: any administrator user, root included (with Full Disk Access), or non-system apps (sandboxed or not) in general?
And what is the reason: protecting the binary from being modified, preventing the app's files from being read, or even preventing the binary itself from being read?
by utilizing a SIP-enabled location. Any SIP-protected directory would be suitable, with a preference for /usr/sbin.
The main purpose of SIP is ensuring the operating system's integrity. Your app is not part of the OS, and the OS is designed to prevent modifications to SIP-protected locations.
If you were able to bypass these protections and place your own binaries at protected paths, it would represent a vulnerability that needs to be reported and fixed, since it defeats SIP's purpose.
So, I believe you cannot / should not be able to place your app in a SIP-protected directory without disabling SIP.
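You can see this with a quick C sketch (the filename is made up): with SIP enabled, the write fails with "Operation not permitted" even when run as root:

#include <errno.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    // /usr/sbin is SIP-protected, so even root gets EPERM here.
    FILE *f = fopen("/usr/sbin/my-test-binary", "w"); // hypothetical name
    if (!f) {
        printf("write failed: %s\n", strerror(errno));
        return 1;
    }
    fclose(f);
    printf("write succeeded - SIP is probably disabled\n");
    return 0;
}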
Alternatives to placing your binary in a SIP-protected location
There is a way, using SIP, to protect your app's files from being accessed and modified by other apps. Check out the App Sandbox part of the WWDC23 "What's new in privacy" session.
macOS Sequoia also has a new SIP feature for Group Containers.
Your app can also be protected by the hardened runtime, which is required by default for your app to be notarized.
Generally speaking, if an app could modify locations protected by SIP, it would constitute a SIP bypass, which is a vulnerability. Locations protected by SIP aren't meant to be modified by apps.
Could you provide more details on what you're trying to achieve? What is the SIP-protected path you're referring to?