Default type inference of integers

I am somewhat sympathetic to your position here, but the ship has sailed. Changing something as fundamental as the integer numeric model is not something we can do given our source compatibility requirements.

I would really love to see complex numbers in Swift though! If you're interested in this, please start a thread dedicated to them so other people who may be interested notice it.


Oh, that's interesting. I did try it in an online interpreter to confirm, but in my case it knew I wanted Double. That's some pretty good inference! :slight_smile:

This was one of the breaking changes between Python 2 and 3: in Python 3, / always does floating-point division, and // is the integer quotient operator. If one were designing a new language, I would say that spelling division and quotient differently is a better design choice than following C's or Python 2's example of spelling them the same, so that 1/2 is either a type error or implicitly promotes to a fractional type. But alas, as Chris said, it's too late for us to seriously consider that for Swift.
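
For comparison, Swift today sits on the C / Python 2 side of that fence: / between two Int values truncates, and the literals only become Double when context forces them to. A quick illustration:

```swift
let a = 1 / 2          // 0: both literals default to Int, so this is integer division
let b: Double = 1 / 2  // 0.5: the contextual type makes both literals Double
let c = 1.0 / 2        // 0.5: one Double literal pulls the other literal to Double
```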


Very helpful response, thank you.

So, using two different operators was, ultimately, going to be my suggestion, if others felt this was important. I was also thinking of a Rational type as an intermediate type that could preserve integer precision for as long as possible.
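
A minimal sketch of what I have in mind, where the Rational name and its API are placeholders rather than a worked-out design:

```swift
/// An exact integer ratio that defers conversion to Double until asked.
struct Rational {
    let numerator: Int
    let denominator: Int

    init(_ numerator: Int, _ denominator: Int) {
        precondition(denominator != 0, "denominator must be nonzero")
        // Normalize: positive denominator, fraction reduced to lowest terms.
        let sign = denominator < 0 ? -1 : 1
        let g = Rational.gcd(abs(numerator), abs(denominator))
        self.numerator = sign * numerator / g
        self.denominator = sign * denominator / g
    }

    private static func gcd(_ a: Int, _ b: Int) -> Int {
        var (a, b) = (a, b)
        while b != 0 { (a, b) = (b, a % b) }
        return a
    }

    /// Precision is only lost here, at the final conversion.
    var doubleValue: Double { Double(numerator) / Double(denominator) }
}

let half = Rational(1, 2)
let x = half.doubleValue  // 0.5, exact right up until the conversion
```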

Is there maybe still a way to give users a set of operators that do the right thing mathematically? Currently, -, /, and .squareRoot() can all do the "wrong" thing for some types. So maybe we can develop a pathway using, for example, ÷, which actually takes (Int, Int) -> Double, or Rational.
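
As a rough sketch of that pathway (the ÷ spelling and its precedence here are just my assumptions):

```swift
// A ÷ that accepts Int operands but always produces a Double.
infix operator ÷: MultiplicationPrecedence

func ÷ (lhs: Int, rhs: Int) -> Double {
    Double(lhs) / Double(rhs)
}

let x = 1 ÷ 2  // 0.5, even though both literals are inferred as Int first
```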

If I were designing such a feature, I would *not* define a version of “÷” for Int. After all, when someone writes “let x = 1 ÷ 2”, we want the numerals to be interpreted as Double without any intermediate type conversion.
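
In other words, something like this instead, where the only overload is on Double, so the literals in "1 ÷ 2" are type-checked as Double from the start:

```swift
infix operator ÷: MultiplicationPrecedence

// No Int overload: integer literals used with ÷ are inferred as Double,
// so no Int-to-Double conversion ever happens.
func ÷ (lhs: Double, rhs: Double) -> Double {
    lhs / rhs
}

let x = 1 ÷ 2  // 0.5; both literals are Double here
```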

I might define another operator though, to calculate “quotient-and-remainder”, which returns a tuple.
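
Swift's integer types already provide quotientAndRemainder(dividingBy:), so the operator would just be sugar over it; the /% spelling below is purely a placeholder:

```swift
infix operator /%: MultiplicationPrecedence

func /% (lhs: Int, rhs: Int) -> (quotient: Int, remainder: Int) {
    lhs.quotientAndRemainder(dividingBy: rhs)
}

let (q, r) = 7 /% 2  // q == 3, r == 1
```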

I know people on here love Unicode, but don't you think that by the time a new user discovers such a hard-to-type operator, they are probably already well aware of Swift's integer type inference?

Also, I think the general rule of thumb for operators is that they should return the same type as their operands.

Yes, I do think that, and I hate the ÷ symbol. I'm brainstorming solutions.

Eh, maybe? At the risk of picking nits, I'd say the rule of thumb is more that you should be able to chain operators ("x = 1 + 2 / 3 - 7 * 4"), and it's simply much harder to achieve that goal if the operators don't all return the same type as their operands.

OTOH, many languages, including C, don't support the kinds of overloading such a scheme would require... the distinction has been meaningless for large portions of the computer industry's history. I'm not sure either interpretation is more right or wrong than the other.