Ah, that makes sense. The explicit init didn't work because the literal type itself wasn't guaranteed to be IntegerLiteralConvertible, and thus couldn't reliably be created from my 0 literal. Thanks!
The simpler "return 0" of course works, but what I'm trying to do right now is to expand my Swift skills beyond the practical "able to make stuff work" stage and hopefully get to "understand the language well enough to know why things happen the way they do." With Objective-C, I felt like I had a pretty solid mastery of the language and knew all of its its quirks, and since Swift appears to be the future, I'm trying to get a good handle on the ins and outs of its type system.
With that said, here's Round 2: integer initializers. As we all know, we are supposed to use initializers instead of casting to convert one size of integer to another, or to convert between signed and unsigned integers:
let foo: Int16 = 5
let bar: Int32 = Int32(foo)
let baz: UInt32 = UInt32(bar)
But why does this work? In the definitions of the integer types, I see initializers that take integers of the same type. For example, in UInt32's definition, I see this:
public init(_ value: UInt32)
However, I do not see initializers that accept integers of other types, such as Int, Int16, etc., nor do I see such initializers in the various protocols it implements. I *do* see toUIntMax() and init(_: UIntMax) methods in UnsignedIntegerType, as well as the equivalent methods taking and returning IntMax values in SignedIntegerType. These methods could be used to convert between differently-sized integers of the same signedness, but they won't get you from a signed integer to an unsigned integer or back (well, not without using the magic initializers on U?IntMax). In IntegerArithmeticType, you have a toIntMax() method that could be used to convert an unsigned integer to a signed one, but since there doesn't seem to be any equivalent initializer taking an IntMax, that goes one way only.
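For instance, using nothing but those protocol requirements, it seems possible to write a same-signedness conversion generically. Here's a sketch (convertUnsigned is a name I made up, assuming the pre-Swift 3 protocol names as above):

func convertUnsigned<T: UnsignedIntegerType, U: UnsignedIntegerType>(value: T) -> U {
    // Widen to UIntMax via toUIntMax(), then narrow via the protocol's init(_: UIntMax).
    return U(value.toUIntMax())
}

let small: UInt16 = 42
let big: UInt64 = convertUnsigned(small) // U is inferred from the annotation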
I do notice that if I try to work with a generic IntegerType parameter, the magic casts are not in play. So wherever they are defined, it seems to be at a higher level than that:
func gimmeAZero<T: IntegerType>() -> T {
    return T(0) // error: Argument labels '(_:)' do not match any available overloads
}
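Returning the literal directly does still compile here, presumably because IntegerType ultimately refines IntegerLiteralConvertible, so no initializer is involved at all:

func gimmeAZero<T: IntegerType>() -> T {
    return 0 // fine: the literal goes through IntegerLiteralConvertible, not an initializer
}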
However, restricting it to either a signed or unsigned integer type works fine:
func gimmeAZero<T: SignedIntegerType>() -> T {
    return T(0) // works fine
}
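My guess, and it's only a guess, is that the 0 literal is being typed as IntMax here, which would match the init(_: IntMax) requirement from SignedIntegerType. Spelling that out explicitly also seems to compile:

func gimmeAZero<T: SignedIntegerType>() -> T {
    return T(IntMax(0)) // explicitly matches the protocol's init(_: IntMax)
}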
But if we try to initialize it with something that's not a literal, it fails again:
func gimmeAZero<T: SignedIntegerType>() -> T {
    let zero = 0
    return T(zero) // error: Cannot invoke initializer for type 'T' with an argument list of type '(Int)'
}
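Widening through IntMax by hand does make it compile again, which seems consistent with the literal theory:

func gimmeAZero<T: SignedIntegerType>() -> T {
    let zero = 0
    return T(zero.toIntMax()) // widen the Int to IntMax first, matching init(_: IntMax)
}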
So, why can I give something like an Int16 to UInt32's initializer? Where are these initializers defined? Why does it work?