Tiny puzzle: Initializing differently

Can you find a type T and an expression expr such that both of the following lines are valid (and run without crashing), but they produce different values?

let x: T = expr
let x = T(expr)

(This is not a question about randomness or mutable state—the two lines should have fundamentally different results.)


Sure! Are you asking or are you telling?

let x: Int?? = .none
let y = Int??(.none)
print(x, y) // nil Optional(nil)
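To check that these really are different values, and not just different printouts, note that `Int??` is Equatable (since `Int?` is), so the two can be compared directly:

```swift
let x: Int?? = .none       // outer nil: no inner Optional at all
let y = Int??(.none)       // Optional.init wraps the inner nil, producing .some(.none)
print(x == y)              // false: genuinely different values
print(x == nil, y == nil)  // true false
```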
import Foundation  // for UUID

struct CopyableThing {
  var id = UUID()
  var label: String
}

extension CopyableThing {
  init(_ other: CopyableThing) {
    self = .init(label: other.label + "!")
  }
}

let x: CopyableThing = CopyableThing(label: "hi")
let y = CopyableThing(CopyableThing(label: "hi"))
// x.label == "hi", y.label == "hi!"

That said, I know there are some more frustrating cases out there.

class Foo {
    init() {}              // no-argument initializer so Bar can override it
    init(_ bar: Bar) {}
}
class Bar: Foo {
    override init() { super.init() }
}

let foo: Foo = Bar.init()
let bar = Foo(Bar.init())
print(foo, bar)
$ swift puzzle.swift 
puzzle.Bar puzzle.Foo

This, along with compile times, is a reason to use type annotations.


Those are some great answers!


Well, the example that inspired this thread was:

let x: Double = 1/3    // 0.333...
let x = Double(1/3)    // 0.0

That might make a good warning. I can’t imagine a case where a programmer actually wants to call Double.init with the result of dividing two integers. If they really do, the warning could be suppressed by wrapping the division in Int(...) or explicitly coercing the expression with as Int.
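For what it's worth, both suppression spellings already compile today; a quick sketch (the warning itself is hypothetical, only the code below is real Swift):

```swift
let a = Double(1/3)           // the case that would warn: integer division, then conversion
let b = Double(Int(1) / 3)    // wrapping an operand in Int(...): integer math is clearly intended
let c = Double((1/3) as Int)  // explicitly coercing the expression as Int: same effect
print(a, b, c)                // all three are 0.0
```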


I think what @Nevin is getting at is that Double(1) and Double(3) each coerce their literal argument, effectively becoming Double.init(1.0) and Double.init(3.0) (which, for numeric literals only, are no-ops), but Double(1/3) does not coerce the inner syntax node to Double.init(1.0 / 3.0), because that would conflict with the initializer that takes a BinaryInteger.
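A minimal sketch of that difference in resolution:

```swift
let a = Double(1) / Double(3)  // each literal is typed directly as Double: 0.333...
let b = Double(1/3)            // 1/3 resolves as Int division first, so this is Double(0)
print(a, b)
```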


That’s because Swift doesn’t have an ExpressibleByIntegerLiteralDividedByIntegerLiteral protocol. I don’t think Double should have this behavior, because it could extend arbitrarily deep. That’s why I suggested a warning for this common mistake.

Another possibility would be to infer number literals like 1 or 3 only as integer types, never as floating-point types (as in C). That would make a slightly different language, though, so obviously it is not going to happen.

I reported a bug:

struct S {
    init(_ v: Double) {}
}
func foo(_ v: S) {}
func bar(_ v: [S]) {}

let s: S = .init(1/2)   // ok
foo(.init(1/2))         // ok
bar([S(1/2)])           // ok
bar([.init(1/2)])       // error: Cannot convert value of type 'Int' to expected argument type 'Double'

All those 1/2 expressions that do compile actually end up equal to 0.5.

It's very confusing.