Nevin
1
Can you find a type T and an expression expr such that both of the following lines are valid (and run without crashing), but they produce different values?
let x: T = expr
let x = T(expr)
(This is not a question about randomness or mutable state—the two lines should have fundamentally different results.)
1 Like
xwu
(Xiaodi Wu)
2
Sure! Are you asking or are you telling?
let x: Int?? = .none
let y = Int??(.none)
print(x, y) // nil Optional(nil)
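A sketch of why those two lines differ (my reading, not part of the original post): the .none literal resolves against a different type in each position.

```swift
// Annotated case: `.none` is looked up on the outer type Int??,
// so x is the outermost nil.
let x: Int?? = .none

// Init case: Optional's init(_:) takes the wrapped value (here Int?),
// so `.none` resolves to Int?.none and is wrapped one level deep.
let y = Int??(.none)

print(x == nil)  // true:  x is Int??.none
print(y == nil)  // false: y is .some(Int?.none), i.e. Optional(nil)
```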
4 Likes
jrose
(Jordan Rose)
3
import Foundation // for UUID

struct CopyableThing {
    var id = UUID()
    var label: String
}

extension CopyableThing {
    init(_ other: CopyableThing) {
        self = .init(label: other.label + "!")
    }
}

let x: CopyableThing = CopyableThing(label: "hi")
let y = CopyableThing(CopyableThing(label: "hi"))
That said, I know there are some more frustrating cases out there.
5 Likes
class Foo
{
init()
{
}
init(_ bar:Bar)
{
}
}
class Bar:Foo
{
override init()
{
super.init()
}
}
let foo:Foo = Bar.init()
let bar = Foo(Bar.init())
print(foo, bar)
$ swift puzzle.swift
puzzle.Bar puzzle.Foo
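A minimal restatement of why these two lines differ (my sketch, not from the post above): the annotation only upcasts the value, while Foo(...) actually runs Foo.init(_ bar:) and constructs a fresh Foo. Using type(of:) makes the dynamic types visible:

```swift
class Foo {
    init() {}
    init(_ bar: Bar) {}  // consumes a Bar, constructs a brand-new Foo
}

class Bar: Foo {
    override init() { super.init() }
}

let foo: Foo = Bar()  // upcast only; the dynamic type stays Bar
let bar = Foo(Bar())  // Foo.init(_:) runs; the dynamic type is Foo

print(type(of: foo))  // Bar
print(type(of: bar))  // Foo
```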
This, along with compile times, is a reason to use type annotations.
2 Likes
Nevin
5
Those are some great answers!
• • •
Well, the example that inspired this thread was:
let x: Double = 1/3 // 0.333...
let x = Double(1/3) // 0.0
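A sketch of what happens in each line (integer division occurs before the conversion in the second one):

```swift
// Annotated case: the literals are inferred as Double,
// so this is floating-point division.
let annotated: Double = 1 / 3        // 0.333...

// Init case: 1/3 is type-checked on its own as Int,
// integer division truncates to 0, and then Double(0) is 0.0.
let quotient = 1 / 3                 // Int division: 0
let converted = Double(quotient)     // 0.0

print(annotated, converted)
```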
5 Likes
ksluder
(Kyle Sluder)
6
That might make a good warning. I can't imagine a case where a programmer wants to call Double.init with the result of dividing two integers. If they really do want that, we could suppress the warning when they wrap the division in Int(...) or explicitly coerce the expression with "as Int".
1 Like
I think what @Nevin is getting at is that Double(1) and Double(3) both coerce their arguments, becoming Double.init(1.0) and Double.init(3.0) (which, for numeric literals only, are no-ops), but Double(1/3) does not coerce the inner syntax node to Double.init(1.0 / 3.0), because that would conflict with the initializer that takes a BinaryInteger.
2 Likes
ksluder
(Kyle Sluder)
8
That’s because Swift doesn’t have an ExpressibleByIntegerLiteralDividedByIntegerLiteral protocol. I don’t think Double should have this behavior, because it could extend arbitrarily deep. That’s why I suggested a warning for this common mistake.
tera
9
Another possibility would be to never infer number literals like "1" or "3" as float types, only as int types (like in C). That would make a slightly different version of Swift, so obviously it is not going to happen.
young
(rtSwift)
10
I reported a bug:
struct S {
    init(_ v: Double) {}
}

func foo(_ v: S) {}
func bar(_ v: [S]) {}

let s: S = .init(1/2) // ok
foo(.init(1/2))       // ok
bar([S(1/2)])         // ok
bar([.init(1/2)])     // Cannot convert value of type 'Int' to expected argument type 'Double'
All of the 1/2 expressions that compile actually end up equal to 0.5.
It's very confusing.
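Until that inconsistency is fixed, one workaround (my suggestion, not from the bug report) is to make one operand a floating-point literal, so the division is unambiguously Double in every position. The S below is a slightly modified version that stores the value, purely so the result can be inspected:

```swift
struct S {
    var value: Double
    init(_ v: Double) { value = v }  // stores v so we can inspect it
}

// A Double literal on one side forces floating-point division,
// so the implicit-member call type-checks everywhere.
let items: [S] = [.init(1.0 / 2)]
print(items[0].value)  // 0.5
```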