Problems with Hex Unicode conversion in Swift

print(UInt16("0x2C",radix: 16)) print(UInt32("0x1F431",radix: 16)) I'm struggling with this for last 2 days and it's extremely annoying. I get fragment of the code from https://stackoverflow.com/questions/31284538/how-to-express-strings-in-swift-using-unicode-hexadecimal-values-utf-16 :

let value0: UInt8 = 0x43   // 97
let value1: UInt16 = 0x203C  // 22823
let value2: UInt32 = 0x1F431 // 127822

However

print(UInt8("43",radix: 16)) --> 67?
print(UInt32("1F431",radix: 16)) --> 128050???
print(UInt16("203C",radix: 16)) ---> 8252???

Can the author not count, offering misleading information, or am I missing something here? But it gets even worse:

print(UInt16("2C",radix:16)) // gives 44
print(UInt16("0x2C", radix:16)) gives nil WTF
print(UInt32("1F432",radix: 16)) -->8252
print(UInt32("0x1F432",radix: 16)) --> nil WTF
print(UInt32("0x1F432")) --> also nil

I mean, what's the reason behind this? I read in the Apple Swift book that putting "0x" in front automatically marks the value as hexadecimal, so why is that not the case here?

Even worse, when working with UnicodeScalar, if the argument is nil then Xcode gives an uninformative error message.

I need this because I'm rewriting someone else's code from TypeScript, and it looks like it's just not possible in Swift to do things as elegantly and simply as:

for (let i = str.length; i >= 1; i -= 2) {
    r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}

There is no equivalent of the parseInt() function and no .fromCharCode method.

Shouldn't Int, UInt and their variants behave like parseInt()? Why is it not possible to do:

print("\u{\(unicodetakenfromvariable)}")

Hex 0x43 is 67, not 97. Hex 0x203C is 8252, not 22823. :)
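
Printing those literals directly confirms the decimal values:

let value0: UInt8 = 0x43     // 67
let value1: UInt16 = 0x203C  // 8252
let value2: UInt32 = 0x1F431 // 128049
print(value0, value1, value2)  // 67 8252 128049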

I’m not quite sure what you’re trying to do with Unicode code points and strings, but I can answer this:

no equivalent of the parseInt() function

Swift does this sort of thing with initialisers. There is no built-in initialiser that detects hex versus decimal from a "0x" prefix, but building one is pretty straightforward:

extension FixedWidthInteger {

    /// Creates an integer from a string, reading it as hexadecimal if it has
    /// a "0x" prefix and as decimal otherwise. Returns nil if parsing fails.
    init?(hexOrDecimalString: String) {
        if hexOrDecimalString.starts(with: "0x") {
            self.init(hexOrDecimalString.dropFirst(2), radix: 16)
        } else {
            self.init(hexOrDecimalString, radix: 10)
        }
    }
}

With this you can write:

print(UInt32(hexOrDecimalString: "0x12"))   // Optional(18)
print(UInt32(hexOrDecimalString: "12"))     // Optional(12)

This works for any integer type that conforms to the FixedWidthInteger protocol, which covers “Int, UInt and their variants”.
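
As for .fromCharCode, the closest Swift equivalent is building a UnicodeScalar (and from it a String) out of the numeric value. As a rough sketch, assuming the input string holds two-digit hex character codes (the name decodeHexPairs is just for illustration), the TypeScript loop from the question might be written like this:

// A rough counterpart to the TypeScript loop, assuming `str` holds
// two-digit hex character codes, e.g. "48656C6C6F" for "Hello".
func decodeHexPairs(_ str: String) -> String {
    var r = ""
    var i = str.endIndex
    while i > str.startIndex {
        // Step back two characters (or to the start for an odd-length string).
        let start = str.index(i, offsetBy: -2, limitedBy: str.startIndex) ?? str.startIndex
        // UInt32(_:radix:) plays the role of parseInt, and
        // UnicodeScalar plays the role of String.fromCharCode.
        if let code = UInt32(str[start..<i], radix: 16), let scalar = UnicodeScalar(code) {
            r = String(scalar) + r
        }
        i = start
    }
    return r
}

print(decodeHexPairs("48656C6C6F"))  // Hello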

Share and Enjoy

Quinn “The Eskimo!” @ Developer Technical Support @ Apple
let myEmail = "eskimo" + "1" + "@" + "apple.com"
