Generic protocol initializers: Why doesn't this work?

Suppose I specify a generic function whose return type is constrained to IntegerLiteralConvertible, which is declared like this:


public protocol IntegerLiteralConvertible {
    typealias IntegerLiteralType
    /// Create an instance initialized to `value`.
    public init(integerLiteral value: Self.IntegerLiteralType)
}


If I try to use that initializer, which the compiler should know exists since my type parameter is constrained to IntegerLiteralConvertible, this happens:


func gimmeAZero<T: IntegerLiteralConvertible>() -> T {
    return T(integerLiteral: 0) // error: Cannot invoke initializer for type 'T' with an argument list of type '(integerLiteral: Int)'
}


Why is that?

Accepted Reply

I think you need to do something like the following so the compiler can resolve the literal's type, based on what I think you are trying to examine:

func gimmeAZero<T: IntegerLiteralConvertible where T.IntegerLiteralType:IntegerLiteralConvertible>() -> T {
  return T(integerLiteral: 0)
}


Of course you could just write it more simply as follows:

func gimmeAZero<T: IntegerLiteralConvertible>() -> T {
  return 0
}
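For anyone reading this on a later toolchain, the same fix can be sketched in modern syntax (my assumption here: the Swift 3 renames, under which IntegerLiteralConvertible became ExpressibleByIntegerLiteral and generic `where` clauses moved after the function signature):

```swift
// Same fix, post-Swift-3 spelling (assumption: the Swift 3 protocol renames).
// The extra `where` clause guarantees the literal 0 can itself be formed
// as a value of T.IntegerLiteralType.
func gimmeAZero<T: ExpressibleByIntegerLiteral>() -> T
    where T.IntegerLiteralType: ExpressibleByIntegerLiteral {
    return T(integerLiteral: 0)
}

let zeroInt: Int = gimmeAZero()       // Int.IntegerLiteralType is Int
let zeroDouble: Double = gimmeAZero() // Double.IntegerLiteralType is Int64
```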

Replies

I should add that an explicit T.init(integerLiteral: 0) results in the same error. Just returning 0 *does* work, but my goal here is to understand how the type system works a little better. What is the reason that calling the initializer explicitly does not work?

I thought the whole point of literal convertibles was *only* for assigning literals, such as in your "just return 0" example.

That may well be; the point, however, is to understand what about the declaration makes it so. I am trying to solidify my understanding of the type system and its workings.

Swift can't infer the intended type of the 0 literal in your example, so it defaults to Int. You can see this if you do

func gimmeAZero<T: IntegerLiteralConvertible>() -> T {
  return T(integerLiteral: 0 as! T.IntegerLiteralType)
}

which will compile, but will crash on the failed cast if T is anything other than Int. It's probably an incomplete implementation, but there could be an issue I'm not seeing.


Right, of course. There's nothing in the protocol that says the IntegerLiteralType has to be IntegerLiteralConvertible itself. Thanks.


Edit: Perhaps this isn't modelled in the protocol because of circularity issues. You want the IntegerLiteralType for Int to be Int itself, for example. There's definitely some hidden machinery here, though, because given the protocol definition you would expect

struct S: IntegerLiteralConvertible {
  init(integerLiteral value: String) {}
}

to work, but it gives the error

note: inferred type 'String' (by matching requirement 'init(integerLiteral:)') is invalid: does not conform to '_BuiltinIntegerLiteralConvertible'

Trying to go to the definition of _BuiltinIntegerLiteralConvertible leads nowhere.

Ah, that makes sense. The explicit init didn't work because the literal type itself wasn't guaranteed as IntegerLiteralConvertible, and thus couldn't be reliably created from my 0 literal. Thanks!


The simpler "return 0" of course works, but what I'm trying to do right now is to expand my Swift skills beyond the practical "able to make stuff work" stage and hopefully get to "understand the language well enough to know why things happen the way they do." With Objective-C, I felt like I had a pretty solid mastery of the language and knew all of its quirks, and since Swift appears to be the future, I'm trying to get a good handle on the ins and outs of its type system.


With that said, here's Round 2: integer initializers. As we all know, we are supposed to use initializers instead of casting to convert one size of integer to another, or to convert between signed and unsigned integers:


let foo: Int16 = 5
let bar: Int32 = Int32(foo)
let baz: UInt32 = UInt32(bar)


But why does this work? In the definitions of the integer types, I see initializers that take another integer of the same type. For example, in UInt32's definition, I see this:


public init(_ value: UInt32)


However, I do not see initializers that accept integers of other types, such as Int, Int16, etc., nor do I see such initializers in the various protocols it implements. I *do* see toUIntMax() and init(_: UIntMax) methods in UnsignedIntegerType, as well as the equivalent methods taking and returning IntMax values in SignedIntegerType, so those could be used to convert between differently-sized integers of the same signedness. But that won't get you from a signed integer to an unsigned integer or back (well, not without using the magic initializers on U?IntMax). In IntegerArithmeticType, you have a toIntMax() method that could be used to convert an unsigned integer to a signed one, but since there doesn't seem to be an equivalent initializer taking an IntMax, that goes one way only.


I do notice that if I try to work with a generic IntegerType parameter, the magic conversions are not in play. So wherever they are defined, it seems to be at a higher level than that:


func gimmeAZero<T: IntegerType>() -> T {
    return T(0) // error: Argument labels '(_:)' do not match any available overloads
}


However, restricting it to either a signed or unsigned integer type works fine:


func gimmeAZero<T: SignedIntegerType>() -> T {
    return T(0) // works fine
}


But, if we're trying to initialize it with something that's not a literal, it fails again:


func gimmeAZero<T: SignedIntegerType>() -> T {
    let zero = 0

    return T(zero) // error: Cannot invoke initializer for type 'T' with an argument list of type '(Int)'
}


So, why can I give something like an Int16 to UInt32's initializer? Where are these initializers defined? Why does it work?
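(An aside for anyone on a current toolchain: my assumption is that Swift 4's integer-protocol overhaul addressed exactly this case by giving BinaryInteger a generic init(_:) requirement, so the non-literal version does compile there:)

```swift
// Sketch on a later toolchain (assumption: Swift 4's protocol overhaul, in
// which BinaryInteger declares init<T: BinaryInteger>(_ source: T) as a
// requirement). The non-literal version then compiles for any integer type:
func gimmeAZero<T: BinaryInteger>() -> T {
    let zero = 0        // a concrete Int this time, not a literal
    return T(zero)      // resolved via BinaryInteger's generic init(_:)
}

let u: UInt32 = gimmeAZero()
let s: Int16 = gimmeAZero()
```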

You just didn't scroll down far enough in the standard library.

extension UInt32 {
    public init(_ v: UInt8)
    public init(_ v: Int8)
    public init(_ v: UInt16)
    public init(_ v: Int16)
    public init(_ v: Int32)
    public init(_ v: UInt64)
   // …
}
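Those per-type initializers are what the snippet at the top of my post relies on. A quick sketch of them in use (as far as I can tell they still exist under the same spelling in current Swift, though they trap at runtime on values that don't fit):

```swift
let small: Int16 = 300
let widened = UInt32(small)    // picks the init(_: Int16) overload
let narrowed = Int16(widened)  // round-trips back to 300

// These conversions are checked: UInt32(Int16(-1)) would trap at runtime,
// since -1 is not representable as a UInt32.
```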

Oh God, it would be something simple like that. :-D Sheesh, how did I miss that? Well, that was simple.


I do have to say that it's a bit disappointing that they didn't turn out to be in some protocol I could constrain T to, which would let my useless little function return a zero value in any valid integer type. Ah well, I guess that could be done easily enough by defining one's own protocol and making all the individual integer types conform to it in extensions. That's a lot of code for the same sort of thing Obj-C could have done more succinctly with a -respondsToSelector: test, and it doesn't cover additional integer types that could be added in the future, but that's the nature of the *****, I guess.
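The roll-your-own protocol idea might be sketched like this (IntConvertible is a made-up name, not anything from the stdlib; the conformances are empty because each type's existing conversion initializer already satisfies the requirement):

```swift
// Hypothetical protocol unifying the per-type conversion initializers.
protocol IntConvertible {
    init(_ value: Int)
}

// The stdlib initializers already match the requirement, so the
// conformance declarations are empty.
extension Int8: IntConvertible {}
extension Int16: IntConvertible {}
extension UInt32: IntConvertible {}
extension Int: IntConvertible {}

func gimmeAZero<T: IntConvertible>() -> T {
    return T(0)
}

let zed: UInt32 = gimmeAZero()
```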