Re: type inference of literals vs. initializers

The Type Inference of Integers thread reminded me of something a coworker was annoyed with today:

let green: CGFloat = 120/255 // 0.47
let g = CGFloat(120/255) // 0

Like in the other thread, we’re running afoul of Int being the default literal type — which, notably, I’m not arguing against here; we can’t really change it, and that might be OK.

Here, I find the issue to be more one of unlabeled, overloaded initializers: I would’ve expected the full implementations of SE-0067 and SE-0104 to use the more Naming Guidelines-compliant init(truncating:) pattern. init(_:) meaning “closest representable” is non-obvious to beginners, and it stays non-obvious even once you fully understand overload resolution and the literal protocols. It seems like we explicitly chose something outside the Guidelines because we wanted C-style explicit conversion, but what we got was a C-style distrust of integer literals.
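To make the “closest representable” point concrete, here’s a minimal sketch (using Float rather than CGFloat so it runs anywhere; the values below are illustrative, not from the original post). Unlabeled init(_:) silently rounds an integer that the floating-point type can’t represent exactly:

```swift
// 2^24 + 1 is the smallest positive integer that Float (24-bit significand)
// cannot represent exactly.
let big = 16_777_217

// init(_:) picks the closest representable value — no warning, no error.
let f = Float(big)  // 16777216.0: the last bit is silently dropped

print(f)  // prints "1.6777216e+07"
```

Nothing in the spelling `Float(big)` hints that rounding happened, which is the Guidelines concern: a label like a hypothetical init(truncating:) or init(closestTo:) would at least announce the lossy behavior at the call site.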

These mismatches are the signs of a crufty language, and I know we can do better. Can we do better within the scope of source compatibility? I’m not sure. To me, even diagnosing arbitrary losses of precision would be more acceptable than hiding a bug in the name of source stability.


This is probably related to this thread, though I can’t remember if this exact situation would be covered by the pitched solution there because of the division.

No, in @xedin’s pitched change we would not treat the second example here as we do the first.

Doing so would be a very substantial (and very surprising) breaking change for people who were expecting the integer division semantics.


It would only be surprising to those who already know (and remember!) that numeric expressions inside initializers aren’t treated the same as in assignments. Once the change is made, anyone familiar with the new rules likely wouldn’t be confused, and anyone who had forgotten the previous rules would be happy it now works as they expect.
