print(UInt16("0x2C",radix: 16)) print(UInt32("0x1F431",radix: 16)) I'm struggling with this for last 2 days and it's extremely annoying. I get fragment of the code from https://stackoverflow.com/questions/31284538/how-to-express-strings-in-swift-using-unicode-hexadecimal-values-utf-16 :
let value0: UInt8 = 0x43 // 97
let value1: UInt16 = 0x203C // 22823
let value2: UInt32 = 0x1F431 // 127822
However:

print(UInt8("43", radix: 16)) --> 67?
print(UInt32("1F431", radix: 16)) --> 128049???
print(UInt16("203C", radix: 16)) --> 8252???
Is the author unable to count and offering misleading information, or am I missing something here?
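To double-check my own arithmetic I printed the hex literal and the parsed string next to each other (my own sanity-check sketch, nothing from the linked answer):

// The hex literal and the radix-16 parse of the same digits should agree.
let literal: UInt16 = 0x203C
let parsed = UInt16("203C", radix: 16)
print(literal)       // 8252
print(parsed as Any) // Optional(8252), because the initializer is failable and returns an optional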
print(UInt16("2C",radix:16)) // gives 44
print(UInt16("0x2C", radix:16)) gives nil WTF
print(UInt32("1F432",radix: 16)) -->8252
print(UInt32("0x1F432",radix: 16)) --> nil WTF
print(UInt32("0x1F432")) --> also nil
I mean, what's the reason behind this? I read in Apple's Swift book that putting "0x" in front of a string automatically infers hexadecimal, so why is that not the case here?
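The only workaround I've found so far is stripping the prefix myself before parsing. A minimal sketch, assuming the prefix is always literally "0x":

// Strip a leading "0x" before handing the digits to the radix-16 initializer.
let raw = "0x1F432"
let digits = raw.hasPrefix("0x") ? String(raw.dropFirst(2)) : raw
print(UInt32(digits, radix: 16) as Any) // Optional(128050)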
Even worse, when working with UnicodeScalar, if the argument turns out to be nil, Xcode only gives an uninformative error message.
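For now I wrap everything in guard let just to get a readable diagnostic. This is only my own sketch (the function name scalar(fromHex:) is something I made up), not code from the original project:

// Unwrap explicitly so a bad value produces a readable message instead of a crash.
func scalar(fromHex hex: String) -> UnicodeScalar? {
    guard let value = UInt32(hex, radix: 16) else {
        print("Could not parse '\(hex)' as hex")
        return nil
    }
    guard let scalar = UnicodeScalar(value) else {
        print("\(value) is not a valid Unicode scalar value")
        return nil
    }
    return scalar
}

if let s = scalar(fromHex: "1F431") {
    print(String(s)) // 🐱
}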
I need this because I'm rewriting someone else's code from TypeScript, and it looks like it's just not possible in Swift to do things as elegantly and simply as:

for (let i = str.length; i >= 1; i -= 2) {
    r = String.fromCharCode(parseInt("0x" + str.substring(i - 2, i))) + r;
}

There is no equivalent for the parseInt() function and no .fromCharCode method.
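The closest Swift equivalent I've managed to piece together looks like this. It's only my own sketch, assuming str holds pairs of hex digits exactly like in the TypeScript version, and decodeHexPairs is a name I invented:

// Decode a string of two-digit hex values into characters, right to left,
// mirroring the TypeScript loop above.
func decodeHexPairs(_ str: String) -> String {
    var r = ""
    var i = str.count
    while i >= 1 {
        let lower = str.index(str.startIndex, offsetBy: max(i - 2, 0))
        let upper = str.index(str.startIndex, offsetBy: i)
        let pair = String(str[lower..<upper])      // e.g. "6F"
        if let code = UInt32(pair, radix: 16),     // stands in for parseInt("0x" + pair)
           let scalar = UnicodeScalar(code) {      // stands in for String.fromCharCode
            r = String(scalar) + r
        }
        i -= 2
    }
    return r
}

print(decodeHexPairs("48656C6C6F")) // "Hello"

As far as I can tell it does the same thing, but it's far more verbose than the two-line TypeScript.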
Shouldn't Int, UInt and their variants behave like parseInt()? And why is it not possible to do:
print("\u{\(unicodetakenfromvariable)}")