Hmm, it's an interesting idea; I think it depends on how effectively it would optimise, for example:
let foo = 17 / 5
let bar = 5 + foo.quotient
Since foo.remainder is never used, this should optimise to:
let foo: Int = 17 / 5
let bar: Int = 5 + foo // tuple and .quotient are optimised away
We'd need someone more knowledgeable about the compiler to weigh in, but it seems like something that should be fairly easy to optimise; otherwise it would have a big impact on the performance of any integer-heavy code.
The obvious question, though, is how much of a problem this really is.
I can't recall the last time I made a mistake like this; if I want a float result, I always make certain that at least one of the operands is a floating-point type. With that in mind, having to put .quotient all over the place could be burdensome rather than helpful.
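To make that concrete, here's a rough sketch of how such a division could be spelled today as a separate operator; the `/%` name is purely my own invention for illustration, and it leans on the standard library's quotientAndRemainder(dividingBy:):

```swift
// Sketch only: "/%" is a hypothetical operator name, not from any proposal.
infix operator /%: MultiplicationPrecedence

func /% (lhs: Int, rhs: Int) -> (quotient: Int, remainder: Int) {
    // Delegates to the standard library's quotient-remainder division.
    return lhs.quotientAndRemainder(dividingBy: rhs)
}

let foo = 17 /% 5
let bar = 5 + foo.quotient // .remainder is never read, so the optimiser can drop it
print(bar)                 // 8
```

Whether the optimiser actually eliminates the unused tuple element in all cases is exactly the question I'd want a compiler engineer to answer.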
I wonder if an alternative would simply be to have the compiler/type-checker identify ambiguity. For example:
let foo = 17 / 5 // warning: ambiguous expected type
print(foo) // no help here, as print accepts either Int or Float
let foo: Float = 17 / 5 // no warning: type is explicitly defined
print(17 / 5) // warning: ambiguous expected type
print(Float(17 / 5)) // no warning: type is explicitly defined
And so on. Basically we'd have a warning in any case where type inference cannot determine whether the desired type is integer or float, encouraging developers to specify an explicit type anywhere there's a risk of a mistake. Of course this puts more of a burden on type inference, which isn't always the most stable thing to begin with, but it seems like it might be a more elegant solution?
On 14 Feb 2017, at 00:03, Dan Stenmark via swift-evolution <email@example.com> wrote:
(I get the feeling the response to this pitch will be overwhelmingly negative, but *deep inhale* here I go!)
A common mistake I see programmers make is dividing two integers and expecting a floating-point result. This mostly affects new programmers who haven't learned about ALUs yet, but I sometimes see even veterans make the mistake when they don't realize that neither operand they're passing is floating-point.
let foo = 17 / 5
print( foo ) // Huh, why is this 3 and not 3.4? Oh, wait, I'm an idiot.
I'd like to propose we make the '/' operator on two Ints return a quotient-remainder tuple by default. This should help both new and veteran programmers alike write less error-prone code.
let (quotient, remainder) = 17 / 5
print( "Q:\(quotient) R:\(remainder)" ) // Idiot-proof!
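For what it's worth, the pitched destructuring can already be approximated with the standard library's quotientAndRemainder(dividingBy:), without changing '/' itself (my addition, not part of the original pitch):

```swift
// Existing standard library API: returns a labeled (quotient, remainder) tuple.
let (quotient, remainder) = 17.quotientAndRemainder(dividingBy: 5)
print("Q:\(quotient) R:\(remainder)") // Q:3 R:2
```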