Running through the tutorial on how to sign data using security.framework, I was trying to understand the signature format Apple produces and expects (as this isn't documented anywhere): https://developer.apple.com/documentation/security/certificate_key_and_trust_services/keys/signing_and_verifying?language=objc
I've learned that the signatures are just ASN.1 objects, with EC signatures being a sequence of the R and S values encoded as ASN.1 integers.
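For context, this is roughly how I'm producing the signatures, paraphrased from that tutorial (the throwaway P-256 key is just for illustration; my real key comes from elsewhere):

```swift
import Foundation
import Security

// Throwaway P-256 key, just to reproduce the behaviour.
let attributes: [String: Any] = [
    kSecAttrKeyType as String: kSecAttrKeyTypeECSECPrimeRandom,
    kSecAttrKeySizeInBits as String: 256
]

var error: Unmanaged<CFError>?
guard let privateKey = SecKeyCreateRandomKey(attributes as CFDictionary, &error) else {
    fatalError("key generation failed: \(error!.takeRetainedValue())")
}

let message = Data("hello world".utf8)

// Sign the message; the framework hashes it with SHA-256 internally.
guard let signature = SecKeyCreateSignature(privateKey,
                                            .ecdsaSignatureMessageX962SHA256,
                                            message as CFData,
                                            &error) as Data? else {
    fatalError("signing failed: \(error!.takeRetainedValue())")
}

// Dump the DER bytes so I can look at the ASN.1 structure.
print(signature.map { String(format: "%02X", $0) }.joined(separator: " "))
```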
However, I've noticed that when using SecKeyCreateSignature, one of the R or S values always seems to be prepended with an extra byte.
For example:
30 45 02 20 66 B7 4C FB FC A0 26 E9 42 50 E8 B4
E3 A2 99 F1 8B A6 93 31 33 E8 7B 6F 95 D7 28 77
52 41 CC 28 02 21 00 E2 01 CB A1 4C AD 42 20 A2
                  ^^ why is this here?
66 A5 94 F7 B2 2F 96 13 A8 C5 8B 35 C8 D5 72 A0
3D 41 81 90 3D 5A 91
This is an ASN.1 sequence: the first element is a 32-byte integer and the second is a 33-byte integer. Why is that 00 byte being prepended to the integer? And why is it sometimes the R and sometimes the S?
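For clarity, this is how I'm reading those bytes (my own annotation):

30 45           sequence, 69 bytes of content
   02 20        integer, 32 bytes          <- R
      66 B7 ... 52 41 CC 28
   02 21        integer, 33 bytes          <- S, with the extra 00 in front
      00 E2 ... 3D 5A 91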
Removing it causes SecKeyVerifySignature to fail, so it's clearly required, but I need to understand the logic here, because I'm having to hand-craft these ASN.1 objects myself and all I have are the raw R and S values.
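For reference, this is the naive way I'm currently assembling the DER from the raw 32-byte R and S (my own helper, not framework API), which is exactly what breaks whenever one of the values apparently needs that extra byte:

```swift
import Foundation

// My own helper: naively wrap the raw 32-byte R and S in ASN.1 integers
// inside a sequence, with no handling of that extra 00 byte.
func naiveECDSASignature(rawR: Data, rawS: Data) -> Data {
    func derInteger(_ value: Data) -> Data {
        // 0x02 = INTEGER tag, followed by a single-byte length.
        return Data([0x02, UInt8(value.count)]) + value
    }

    let body = derInteger(rawR) + derInteger(rawS)
    // 0x30 = SEQUENCE tag; a single-byte length is enough for two 32-byte integers.
    return Data([0x30, UInt8(body.count)]) + body
}
```

With the example signature above, this produces a 32-byte integer for S where SecKeyCreateSignature emitted a 33-byte one, and SecKeyVerifySignature rejects it, so I need to know exactly when that 00 has to be inserted.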