Default type inference of integers

It seems to me that Swift isn't doing 'The Right Thing' in a very simple mathematical test.

In Python,

a = 2; b = 0.5
c = a*b # 1.0

In Swift,

let a = 2, b = 0.5
let c = a*b // Error!

Similarly,

a = 1; b = 2
c = a/b # c=0.5

while in Swift,

let a = 1, b = 2
let c = a/b // c = 0

This strikes me as a problem. I have heard many times that Swift should be safe, expressive, and complexity should be discoverable. This principle should apply here as well: 1 divided by 2 = 0.5 to most people. Another 'discoverable' level of complexity is learning that if you define these types as Ints, then you get a different result.

So, I would argue that the default behavior should be for Swift to adhere to normal/expected mathematics---in other words, the inferred type should be Double. I also think this is pivotal if Swift is ever going to have any footprint in science and data analysis.

Thoughts?

I would say not. What you're asking for is automatic conversion between numeric types. That has inherently unsafe aspects (precision loss). That's why Swift requires you to explicitly convert between numeric types. I will say this is a somewhat annoying part of Swift, but one I can live with after hearing the arguments for it.

6 Likes

It is not about inference though. The operators are defined to take and return Self.
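For reference, the arithmetic requirements in the standard library take and return the same type; Numeric's multiplication requirement is declared along these lines (abridged):

public protocol Numeric: AdditiveArithmetic, ExpressibleByIntegerLiteral {
    static func * (lhs: Self, rhs: Self) -> Self
    static func *= (lhs: inout Self, rhs: Self)
    // ... (Magnitude associated type, exact-conversion initializer, etc.)
}

Because both operands must be the same Self, an Int and a Double can never meet at *.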

What types are a and b in your Python example? And exactly when do you want to "infer" Double?

There are a couple of points here. Firstly, for the example:

let a = 1, b = 2
let c = a/b // 0

That's definitely correct behaviour, since that's how integer division operates. I don't think it's a high barrier to ask that someone type:

let a = 1.0, b = 2.0
let c = a/b // 0.5

if they want to have the types inferred as Doubles.

For your other example, though,

let a = 2, b = 0.5
let c = a*b // Error!

There's actually something that could be done about that, but it would make the language more complex and the compiler developers probably wouldn't thank you for it. You'll notice if you instead type:

let c = 2 * 0.5

it works, since it can infer that 2 should be a Double in this context because of the Self requirements of the * operator. However, types are resolved on a per-line basis, so once a is resolved to be an Int it can't be changed. In theory, you could delay type resolution of ExpressibleBy_Literal types until they're first used in an expression. I very much don't think that's worth the complexity, though.
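In the meantime, the explicit way out of the original error is a conversion at the point of use, for example:

let a = 2, b = 0.5
let c = Double(a) * b // 1.0; converting a explicitly makes both operands Double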

2 Likes

Yes, this would require us to introduce global type inference, which in theory could let you get rid of most of the type annotations that Swift currently requires. But this would be a pretty big lift for the compiler. And you'd lose the compiler-enforced requirement that functions explicitly annotate their types, relying instead on best practice in the community.

I think what Swift currently does is the best option. The alternative, global type analysis, has significant downsides, including:

  1. It would really slow the compiler down (think Scala here).
  2. The error messages would be difficult to decipher (think Haskell here).
1 Like

1/2 = 0 is a pretty big "precision loss".

If you're a programmer who specifically needs integer-type behavior, then great: explicitly specify that type!

The point is that integer mathematics is a higher level of complexity, and so it's okay to write the extra code, e.g., let a: Int = 1, in order to get the behavior you want. But it's bad to have 1/2 = 0 by default.
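To illustrate: the explicit annotations you can already write today are what opting into integer behavior would look like under the proposed Double default (the default is hypothetical; the syntax is real):

let a: Int = 1, b: Int = 2
let c = a / b // 0; integer division, chosen explicitly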

Why is the current behavior best? Defend that.

The point is this:
You're walking into a 3rd grade classroom, trying to convince them to learn programming with your language, Swift. You will completely lose them (and their teacher) when they learn that your language spits out 1/2 = 0. So much for world domination.

Try writing typealias IntegerLiteralType = Double and then try your code again!
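That is, shadowing the default literal type at file scope changes how unannotated integer literals are inferred:

typealias IntegerLiteralType = Double

let a = 1, b = 2 // both inferred as Double now
let c = a / b // 0.5
print(type(of: a)) // Double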

Integer types (Int specifically) are more common in the domains that Swift has been focused on, due to their use with collections and as counters, etc. You're absolutely right that in other domains a different default might be more appropriate.

4 Likes

Great! Seems simple enough: I'm arguing we change the default interpretation to Double, and let people override to Int.

If we fundamentally agree that Swift should do the obvious thing first, and expose complexity only as needed, then it's a no-brainer that the default behavior does normal math correctly. People who depend on the details of integer mathematics are, by definition, already more experienced programmers.

I would say the window for making this change has passed. A LOT of code would start breaking if you changed the default type for integer literals.

1 Like

The current behavior allows writing integers as well as doubles without having to specify the type:
let x = 1
let x = 1.0
If both spellings created a Double, it would be a bit more complicated to create an Int. You just have to learn to always use a dot for decimal numbers.

3 Likes

Not really. If you say you want to give out pairs of socks and you bring one sock, then when someone asks "how many pairs of socks can you give away?", the answer is 0 with 1 sock left over, not 1/2.

(In fact, in the US I'm pretty sure fractions don't even get taught until 3rd or 4th grade, so 0 would make perfect sense to little kids.)

In the "grade school" sense, floating point is not an ideal default either. For example, 1.0 - 0.9 - 0.1 does not equal 0.0.

In many Lisp dialects, there is a "rational number" data type, so that 1/2 is just 1/2, 2/6 is 1/3, etc. However, outside of academic examples rational numbers are actually not very useful. I think Swift's approach is reasonable here.
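For the curious, here is a minimal sketch of what such a type could look like in Swift (Rational here is hypothetical, not a standard library type):

struct Rational: CustomStringConvertible {
    let numerator: Int
    let denominator: Int

    // Store every value reduced to lowest terms, with the sign on the numerator.
    init(_ numerator: Int, _ denominator: Int) {
        precondition(denominator != 0, "denominator must be nonzero")
        let g = Rational.gcd(abs(numerator), abs(denominator))
        let sign = denominator < 0 ? -1 : 1
        self.numerator = sign * numerator / g
        self.denominator = sign * denominator / g
    }

    // Division stays exact: (a/b) / (c/d) = (a*d) / (b*c).
    static func / (lhs: Rational, rhs: Rational) -> Rational {
        Rational(lhs.numerator * rhs.denominator, lhs.denominator * rhs.numerator)
    }

    var description: String { "\(numerator)/\(denominator)" }

    private static func gcd(_ a: Int, _ b: Int) -> Int {
        b == 0 ? a : gcd(b, a % b)
    }
}

print(Rational(1, 2)) // 1/2
print(Rational(2, 6)) // 1/3
print(Rational(1, 1) / Rational(2, 1)) // 1/2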

Arguably, the most confusing behavior seen in this thread is that Float and Double conform to ExpressibleByIntegerLiteral. But changing that would be too big of a source break at this point.
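That conformance is what lets an integer literal silently become a floating-point value:

let x: Double = 1 // fine; x is 1.0 because Double is ExpressibleByIntegerLiteral
let y = 1 + 0.5 // the literal 1 is inferred as Double, so y is 1.5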

1 Like

Yeah. Also, it's pretty nice to be able to just write x = y / 2.
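That is, the literal adapts to whatever type the other operand already has:

let y = 10.0
let x = y / 2 // 5.0; the literal 2 is inferred as Double because y is a Double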

For what it's worth, Haskell (which has similarly overloaded literals) makes Num (used for integer literals) implied by Fractional (used for decimal literals). (I'm not sure how they handle precedence of defaulting.)

Would you also say that sqrt(-1) (which I think returns NaN) is a "precision loss"? :slight_smile:

Re: sqrt(-1) returning NaN

Yes, absolutely it is precision loss. And it's ridiculous that complex numbers haven't been included in Swift at this point. In fact, I spent the weekend sketching a proposal to fix that, but am considering giving up on the whole thing, as it seems Swift is really an application development language only.

As preface to this whole thing, I asked my son the question and he answered exactly as you'd think: 1/2.

If you doubt this, go find your nearest spouse, relative, or friend who has no programming experience, present the code as written above, and ask them what the value of c should be. If you can find anyone who thinks that 1/2 = 0, I'll be impressed (but not convinced).

As opposed to what kind of language? Swift bills itself as a systems language that can scale across the spectrum. It isn't, however, a focused language like R that has tons of special machinery for its domain.

That being said, I don't know much about complex numbers or the case for making them a first-class type.

Are you sure your example is correct? If I write 1/2 to the Python command line interpreter I get 0, not 0.5. Typing in your example:

Python 2.7.10 (default, Oct  6 2017, 22:29:07) 
[GCC 4.2.1 Compatible Apple LLVM 9.0.0 (clang-900.0.31)] on darwin
Type "help", "copyright", "credits" or "license" for more information.
>>> a = 1; b = 2
>>> c = a/b
>>> c
0

Also, note that Python does not use Double by default. It switches dynamically between an arbitrary-precision integer representation, a double, or a complex double as you do things with your numbers.

The general problem is that there is no good representation that encompasses everything we want to do with numbers. Double has its quirks too; they're just different from Int's, and also more subtle and easy to miss. For instance, you shouldn't count money with Double.
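The classic decimal-fraction pitfall shows why:

let sum = 0.1 + 0.2
print(sum == 0.3) // false
print(sum) // 0.30000000000000004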

2 Likes