Can someone help clarify why the first x in let y = f(x), x = 100 seems to default to zero instead of producing an error? Also, can you point me towards a source that will help me understand this behavior?
let f = { (x: Int) in x }
// I can understand why
let a = 100, b = f(a)
print(a, b) // 100 100
// I do not understand how x got an implicit zero
let y = f(x), x = 100
print(x, y) // 100 0
Edit:
It seems the not-yet-initialized variable always takes on some implicit zero-like value: 0 for Int, false for Bool, and a zeroed-out instance for a struct.
let b = a, a = 100
print(a, b) // 100 0
let y = x, x = true
print(x, y) // true false
struct Something { var x: Int = 50, y: String = "default value" }
let s2 = s1, s1 = Something(x: 100, y: "some string")
print(s1, s2) // Something(x: 100, y: "some string") Something(x: 0, y: "")
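For context (my understanding of top-level code, stated as an assumption rather than from the spec): in main.swift or a script, top-level let/var declarations are real globals whose storage is zero-filled before the initializers run in source order, so reading one before its initializer has executed yields the zeroed bit pattern instead of a compile error. Inside a function body, Swift's definite-initialization checking rejects the same pattern. A minimal sketch:

```swift
let f = { (x: Int) in x }

func demo() -> (Int, Int) {
    // Inside a function, using a local before its declaration is a
    // compile-time error, unlike in top-level code:
    // let y = f(x), x = 100   // error: use of local variable 'x' before its declaration

    // Declaring x first makes the dependency explicit, and it compiles.
    let x = 100, y = f(x)
    return (x, y)
}

print(demo())  // (100, 100)
```

So the zero only ever shows up at the top level of a script, where declarations execute as statements rather than being checked as a whole.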
I think it'd require a new language mode to fix. Variables in top-level code probably should have been private by default, and not allowed to be used before they were defined.
Yeah, that's more linear than Swift usually is, but top-level code is already more linear than normal, so I think it would have felt natural in practice.