Today I typed the following without giving it much thought, and the compiler could not type-check the arrayAngle assignment (note that self.count is an Int):
import simd
...
let PI = Float.pi, PI_2 = PI / 2, first = -(PI_2 + PI / Float(self.count))
let arrayAngle: [simd_float1] = (0 ..< self.count).map {
first + 2 * PI * $0 / Float(self.count)
}
What’s going on here? I tried lots of tweaks and what finally did it was this:
let arrayAngle: [simd_float1] = (0 ..< self.count).map {
first + 2 * PI * Float($0) / Float(self.count)
}
That sounds like an awful lot of hoops to jump through. What am I doing wrong? Why is the compiler complaining about $0 but not about the other Int (2)?
Swift does not support heterogeneous arithmetic, so you cannot multiply PI (a Float) by $0 (an Int). However, because Swift's operators are extensible, you (or one of your dependencies) could in principle have written an overload of * that makes this computation legal, so the typechecker has to explore many combinations and gives up before it can simply tell you to write PI * Float($0).
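To make that concrete, here is a sketch of such an overload. Nothing in the standard library (or simd) defines Float * Int; the operator below exists only in this example, to show the kind of possibility the typechecker has to rule out:

```swift
// Hypothetical overload: NOT in the standard library.
// Defining it makes `Float * Int` compile, which is exactly
// the possibility the typechecker must consider and reject.
func * (lhs: Float, rhs: Int) -> Float {
    lhs * Float(rhs)   // delegates to the ordinary Float * Float
}

let pi = Float.pi
let count = 8              // an Int, standing in for self.count
let angle = pi * count     // typechecks only because of the overload above
```

Without the overload, the last line is exactly the error from the question: binary operator '*' cannot be applied to operands of type 'Float' and 'Int'.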
There is no other Int. In Swift, literals such as 2 do not themselves have any type; in the context of the expression first + 2 * PI * Float($0) / Float(self.count), the typechecker assigns it type Float after (notionally) considering available ExpressibleByIntegerLiteral and ExpressibleByFloatLiteral types which would make this expression successfully typecheck.
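You can watch the context drive the literal's type directly; this is standard Swift behavior, no extensions involved:

```swift
// The same literal text takes whatever ExpressibleByIntegerLiteral
// type the surrounding context demands.
let a: Float = 2     // inferred as Float
let b: Double = 2    // inferred as Double
let c: Int8 = 2      // inferred as Int8

print(type(of: a), type(of: b), type(of: c))
```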
(In the absence of type context (for example, let x = 42), the literal is assigned type IntegerLiteralType, which by default is a typealias for Int, but which you can override.)
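A minimal sketch of that override, placed at the top level of a file; shadowing IntegerLiteralType this way is documented Swift behavior:

```swift
// Shadow the default literal type: unconstrained integer
// literals in this scope now infer as Int64 instead of Int.
typealias IntegerLiteralType = Int64

let x = 42           // no type context, so x is Int64 here
print(type(of: x))
```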