Default type inference of integers

Try writing typealias IntegerLiteralType = Double and then try your code again!
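A minimal sketch of that trick — Swift consults a file- or module-level `IntegerLiteralType` typealias when deciding what type an unannotated integer literal defaults to, so shadowing it changes the default:

```swift
// Shadowing the standard library's default literal type for this file.
// Unannotated integer literals are now inferred as Double.
typealias IntegerLiteralType = Double

let a = 1            // inferred as Double, not Int
let b = a / 2        // 0.5 — true division, since both operands are Double
print(type(of: a))   // Double
```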

Integer types (Int specifically) are more common in the domains that Swift has been focused on, due to their use with collections and as counters, etc. You’re absolutely right that in other domains a different default might be more appropriate.


Great! Seems simple enough—I’m arguing we change the default to interpret as Double, and let people override to Int.

If we fundamentally agree that Swift should do the obvious thing first, and expose complexity only as needed, then it’s a no-brainer that the default behavior does normal math correctly. People who are depending on the details of Integer mathematics are, by definition, already more experienced programmers.

I would say the point for making this change is past. A LOT of code will start breaking if you change the default type for integer literals.


The current behavior allows writing integers as well as doubles without having to specify the type:
let x = 1
let x = 1.0
If both spellings created a Double, it would be a bit more complicated to create an Int. You just have to learn to always use a dot for decimal numbers.
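For illustration, here is what the two spellings infer as today, and what opting back into Int would look like if bare literals defaulted to Double (a playground-style sketch):

```swift
let x = 1            // inferred as Int under the current default
let y = 1.0          // inferred as Double
print(type(of: x))   // Int
print(type(of: y))   // Double

// If bare integer literals defaulted to Double instead, getting an
// Int would require an explicit annotation or conversion:
let n: Int = 1
```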


Not really. If you say you want to give out pairs of socks, and you bring one sock, if someone asks “how many pairs of socks can you give away”, the answer is 0 with 1 sock left over, not 1/2.

(In fact, I’m pretty sure that in the US fractions aren’t even taught until third or fourth grade, so 0 would make perfect sense to little kids.)

In the “grade school” sense, floating point is not an ideal default either. For example, 1.0 - 0.9 - 0.1 does not equal 0.0.
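You can check that claim directly — the intermediate result of 1.0 - 0.9 is already slightly off in binary floating point, so the final subtraction never lands exactly on zero:

```swift
let r = 1.0 - 0.9 - 0.1
print(r == 0.0)   // false
print(r)          // a tiny negative residue, on the order of 1e-17
```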

In many Lisp dialects, there is a “rational number” data type, so that 1/2 is just 1/2, 2/6 is 1/3, etc. However, outside of academic examples rational numbers are actually not very useful. I think Swift’s approach is reasonable here.

Arguably, I think the most confusing behavior seen in this thread is that Float and Double conform to ExpressibleByIntegerLiteral. But changing that would be too big of a source break at this point.
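That conformance is what lets an integer-shaped literal become a floating-point value whenever the context asks for one:

```swift
// Double and Float conform to ExpressibleByIntegerLiteral, so the
// literals below are built directly as 1.0 and 2.0 — no Int involved.
let d: Double = 1
let f: Float = 2
print(d + 0.5)   // 1.5
```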


Yeah. Also, it’s pretty nice to be able to just write x = y / 2.
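That convenience falls out of the same literal machinery: the bare 2 takes on the type of y from context rather than defaulting to Int:

```swift
let y: Double = 7
let x = y / 2    // the literal 2 is inferred as Double here
print(x)         // 3.5
```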

For what it’s worth, Haskell (which has similarly overloaded literals) makes Num (used for integer literals) be implied by Fractional (used for decimal literals). (I’m not sure how they handle precedence of defaulting.)

Would you also say that sqrt(-1) (which I think returns NaN) is a “precision loss”? :slight_smile:

Re: sqrt(-1) returning NaN

Yes, absolutely it is precision loss. And it’s ridiculous that complex numbers haven’t been included in Swift at this point. In fact, I spent the weekend sketching a proposal to fix that, but am considering giving up on the whole thing, as it seems Swift is really an application development language only.

As preface to this whole thing, I asked my son the question and he answered exactly as you’d think: 1/2.

If you doubt this, go find your nearest spouse, relative, or friend who has no programming experience, present the code as written above, and ask them what the value of c should be. If you can find anyone who thinks that 1/2 = 0, I’ll be impressed (but not convinced).

As opposed to what kind of language? Swift bills itself as a systems language that can scale across the spectrum. It isn’t, however, a focused language like R, which has tons of special workings for its domain.

That being said, I don’t know much about complex numbers or the case for making them a first-class type.

Are you sure your example is correct? If I write 1/2 to the Python command line interpreter I get 0, not 0.5. Typing in your example:

Python 2.7.10 (default, Oct  6 2017, 22:29:07) 
[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.31)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> a = 1; b = 2
>>> c = a/b
>>> c
0

Also, note that Python does not use Double by default. It switches between an integer representation with unlimited size, a double, or a complex double dynamically as you do things with your numbers.

The general problem is there is no good representation that encompasses all we want to do with numbers. Double has its quirks too, they’re just different ones than for Int. And Double's quirks are also more subtle and easy to miss. For instance you shouldn’t count money with Double.
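A small sketch of that last point, contrasting Double with Foundation’s Decimal (constructing the Decimal values from strings, to avoid routing them through a binary float literal first):

```swift
import Foundation

// Binary floating point can't represent 0.1 exactly, so sums drift:
let doubleSum = 0.1 + 0.2
print(doubleSum == 0.3)   // false

// Decimal stores base-10 values, so currency arithmetic stays exact:
let price = Decimal(string: "0.1")!
let total = price + Decimal(string: "0.2")!
print(total == Decimal(string: "0.3")!)   // true
```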


I am somewhat sympathetic to your position here, but the ship has sailed. Changing something as fundamental as the integer numeric model is not something that we can do with our source compatibility requirements.

I would really love to see complex numbers in Swift though! If you’re interested in this, please start a thread dedicated to them so other people who may be interested notice it.


Oh that’s interesting. I did try it in an interpreter online to confirm, but in my case it knew I wanted Double—that’s some pretty good inference! :slight_smile:

This was one of the breaking changes between Python 2 and 3. In Python 3, / always does floating-point division, and // is the integer quotient operator. If one were designing their own language, I would say that spelling division and quotient differently is a better design choice than following C or Python 2’s example of spelling them the same, so that 1/2 is either a type error or implicitly promotes to a fractional type. But alas, as Chris said, it’s too late for us to seriously consider that for Swift.


Very helpful response, thank you.

So, the two different operators was, ultimately, going to be my suggestion, if others felt this was important. I was also thinking of a Rational type as an intermediate type that could preserve integer precision for as long as possible.

Is there maybe still a way to give users a set of operators that do the right thing mathematically? Currently, -, /, and .squareRoot() can all do the “wrong” thing for some types. So maybe we can develop a pathway using, for example, ÷, which actually takes (Int, Int) -> Double, or Rational.

If I were designing such a feature, I would *not* define a version of “÷” for Int. After all, when someone writes “let x = 1 ÷ 2”, we want the numerals to be interpreted as Double without any intermediate type conversion.

I might define another operator though, to calculate “quotient-and-remainder”, which returns a tuple.
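A hypothetical sketch of both ideas, with ÷ defined only on Double (so the integer literals promote directly, per the point above) and an illustrative /% operator for quotient-and-remainder — both spellings are made up here, built on the standard library’s quotientAndRemainder(dividingBy:):

```swift
// "True division": defining ÷ only on Double means the literals in
// `1 ÷ 2` are inferred as Double, with no intermediate Int conversion.
infix operator ÷: MultiplicationPrecedence
func ÷ (lhs: Double, rhs: Double) -> Double { lhs / rhs }

// An illustrative quotient-and-remainder operator returning a tuple.
infix operator /%: MultiplicationPrecedence
func /% (lhs: Int, rhs: Int) -> (quotient: Int, remainder: Int) {
    lhs.quotientAndRemainder(dividingBy: rhs)
}

let x = 1 ÷ 2        // 0.5
let (q, r) = 7 /% 2
print(q, r)          // 3 1
```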

I know people on here love Unicode, but don’t you think that by the time a new user discovers such a hard-to-type operator, they are probably already well aware of Swift’s integer type inference?

Also, I think the general rule of thumb for operators is that they should return the same type as their operands.

Yes, I do think that, and I hate the ÷ symbol. I’m brainstorming solutions.

Eh, maybe? At the risk of picking nits, I’d say that the rule of thumb is more that you should be able to chain operators (“x = 1 + 2 / 3 - 7 * 4”), and it’s simply much harder to achieve that goal if the operators don’t all return the same type as their operands.

OTOH, many languages – including C – don’t support the kinds of overloading such a scheme would require… the distinction has been meaningless for large portions of the computer industry’s history. I’m not sure either interpretation is more right or wrong than the other.
