Could/should type inference of literals be improved?

To clarify what I mean, let's consider this example from The Swift Book:

extension Double {
    var km: Double { return self * 1_000.0 }
    var m: Double { return self }
    var cm: Double { return self / 100.0 }
    var mm: Double { return self / 1_000.0 }
    var ft: Double { return self / 3.28084 }
}
let threeFeet = 3.ft
print("Three feet is \(threeFeet) meters")

As we'd like this to work for all BinaryFloatingPoint types, we change it to:

extension BinaryFloatingPoint {
    var km: Self { return self * 1_000.0 }
    var m: Self { return self }
    var cm: Self { return self / 100.0 }
    var mm: Self { return self / 1_000.0 }
    var ft: Self { return self / 3.28084 }
}
let threeFeet = 3.ft
print("Three feet is \(threeFeet) meters")

This seems to work as expected.

But if we give threeFeet a type other than the "default" floating-point type Double, e.g. Float, we get an error:

let threeFeet: Float = 3.ft // ERROR: Value of type 'Int' has no member 'ft'

I guess this is because Swift looks at the literal 3 in isolation and settles on its preferred interpretation of it, i.e. Int, before the member lookup for ft happens (had the literal been 3.0, Swift would have decided it was a Double).
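
To illustrate the defaulting behavior (the comments are my reading of it, not compiler output):

let a = 3           // integer literal in isolation, defaults to Int
let b = 3.0         // float literal in isolation, defaults to Double
let c: Float = 3    // fine: the annotation gives the literal its type directly

The annotation on c works because the literal is the whole expression; as soon as the literal sits behind a member lookup, as in 3.ft, the annotation no longer reaches it.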

So we have to write, e.g., the following to make the type checker see what we might have expected it to see above:

let threeFeet = (3 as Float).ft

And this sort of defeats the point of having these extensions.

I've run into this limitation in a number of less contrived (and less pedagogical) real world scenarios.
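
For what it's worth, one workaround that does compile today is to route the literal through a generic function, so that its type comes from context (feet(_:) is just a name I'm using for illustration):

func feet<T: BinaryFloatingPoint>(_ x: T) -> T {
    return x.ft
}

let threeFeetAsFloat: Float = feet(3) // T is inferred as Float, so the literal becomes a Float

But having to go through a free function again defeats much of the point of the extension syntax.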


Question:

Would it be possible, and worth it, to improve the type inference of literals in cases like this, so that Swift uses the context of the literal rather than looking at it in isolation?


Early versions of Swift in fact worked this way, allowing name lookup from a literal into any type conforming to that literal's protocol. However, it made name lookup difficult, ambiguous, and expensive at compile time, since the compiler had to first find all known conforming types and then consider each of those types' members, along with those of all the protocols they conform to, and so on.


Isn't it easier to just make it a Double explicitly, by writing 3.0.ft? I personally dislike the implicit conversion of integers to floating-point numbers, and I require all floating-point numbers in my code to have an explicit fraction. Not that I want to start a debate on code style; it just seems easier than (3 as Double).ft. But you are correct that this only works with Double, since 3.0 is inferred as a Double...

@Joe_Groff I'd expect this to work, though I'm not sure whether it's reasonable:

let threeFeetFloat: Float = 3.0.ft
let threeFeetCGFloat: CGFloat = 3.0.ft

Making those work would still require bidirectional type inference through member lookups, and name lookup doing a global search, in order to propagate the type from the declaration backward through the member reference.

All this isn't to say that we couldn't bring some sort of similar functionality back in the future, though @Douglas_Gregor or @rudkx would be better judges of what that ought to look like. I agree in principle that it would be nice if literals could be given members that work polymorphically with literal type inference. Maybe we could allow member lookup into extensions on Expressible*Literal before literals are assigned concrete types?
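
Perhaps roughly this shape, replacing the BinaryFloatingPoint extension above (just a sketch of the idea; note that this extension already compiles today, but it doesn't change anything, since a bare 3.0 is still defaulted to Double before the lookup happens):

extension ExpressibleByFloatLiteral where Self: BinaryFloatingPoint {
    var ft: Self { return self / 3.28084 }
}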

Yes, I think that should be doable and would be great to have. We should effectively be able to type-check this as if it had been written as:

func _my_generic_lambda<T : BinaryFloatingPoint>(_ x: T) -> T {
  return x.ft
}

let threeFeet: Float = _my_generic_lambda(3)

@Jens, can you file a JIRA?


SR-8571
