The Type Inference of Integers thread reminded me of something a coworker was annoyed with today:
let green: CGFloat = 120/255 // 0.47: the CGFloat annotation gives the literals a floating-point type
let g = CGFloat(120/255)     // 0: the literals default to Int, so this is integer division
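(For anyone hitting the same thing, here are two ways to get the intended value; the literals need a floating-point context before the division happens, so both evaluate to roughly 0.47:

import CoreGraphics

let g1 = CGFloat(120) / 255     // convert one operand first; '/' is then CGFloat division
let g2 = (120 / 255) as CGFloat // type the whole expression as CGFloat
)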
Like in the other thread, we’re running afoul of Int being the default literal type. Notably, I’m not arguing against that here: we can’t really change it, and that might be OK.
Here, I find the issue to be more one of unlabeled, overloaded initializers: I would’ve expected the full implementations of SE-0067 and SE-0104 to use the more Naming Guidelines-compliant init(truncating:) pattern. init(_:) meaning “closest representable value” is non-obvious to beginners, and it stays non-obvious even once you fully understand overload resolution and the literal protocols. It seems like we explicitly chose something outside of the Guidelines because we wanted C-style explicit conversion, but what we got out of it was a C-style distrust of integer literals.
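For contrast, SE-0104 did give the integer types labeled conversions whose names advertise their behavior; it’s the unlabeled form that stays silent. A few real standard-library examples, with the results I’d expect:

let exact = Int(exactly: 1.5)                 // nil: fails rather than lose information
let truncated = Int8(truncatingIfNeeded: 300) // 44: keeps only the low 8 bits
let clamped = UInt8(clamping: -5)             // 0: pins to the representable range

let lossy = Float(16_777_217) // 16_777_216.0: silently rounds to the closest representable value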
These mismatches are the signs of a crufty language, and I know we can do better. Can we do better within the scope of source compatibility? I’m not sure. To me, though, even diagnosing arbitrary losses of precision would be more acceptable than hiding a bug for the sake of source stability.
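To make “diagnosing” concrete, something along these lines for the original example; the wording is entirely invented, this isn’t an existing compiler diagnostic, just the shape I’d hope for:

let g = CGFloat(120/255)
// warning: integer division '120 / 255' truncates to '0' before conversion to 'CGFloat'
// note: use 'CGFloat(120) / 255' to perform floating-point division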