Interesting (beneficial) behavior of `if let` when casting `nil` to an `Optional` type changed with Swift 5. Was it intentional?

I have been taking advantage of an interesting `if let` pattern in Swift 4.2 that is broken in Swift 5, and I am not sure whether this is a bug or an intended change in behavior. My goal is to determine whether a given generic type is Optional. More specifically, I want to store `nil` as the generic type when that type is Optional, so being able to cast `nil` to the type is what makes this pattern valuable. Under Swift 4.2, the following code prints

Bool
» failure
Optional<Bool>
» success

Under Swift 5, the same code prints

Bool
» failure
Optional<Bool>
» failure

Here is the code:

let anyNil: Any? = nil

func test<T>(_ type: T.Type) {
	print("\(T.self)")

	if let _ = anyNil as? T {
		print("» success")
	} else {
		print("» failure")
	}
}

test(Bool.self)
test(Bool?.self)

Is this the same case that's called out in the Xcode 10.2/Swift 5 release notes?

In Swift 5 mode, when casting an optional value to a generic placeholder type, the compiler will be more conservative with the unwrapping of the value. The result of such a cast now more closely matches the result you would get in a nongeneric context. (SR-4248) (47326318)

For example:

func forceCast<U>(_ value: Any?, to type: U.Type) -> U {
    return value as! U
}

let value: Any? = 42
print(forceCast(value, to: Any.self))
// Prints "Optional(42)"
// (Prior to Swift 5, this would print "42".)

print(value as! Any)
// Prints "Optional(42)"

I had a similar problem in XMLCoder, even though it targets Swift 4.2. I was able to work around this with a few hacky metatypes:

protocol AnyOptional {
}

extension Optional: AnyOptional {
}

Your `test` function would then look like this:

func test<T>(_ type: T.Type) {
	print("\(T.self)")

	if type is AnyOptional.Type {
		print("» success")
	} else {
		print("» failure")
	}
}

Interesting! This looks like a regression in the runtime's dynamic casting machinery to me. The Swift 4.2 behaviour is correct due to the fact that `Optional<T>.none` can be cast to `Optional<U>.none`. This does appear to be fixed on master (i.e. you once again get the 4.2 behaviour), but we should add a test case – I've filed SR-9837 to track that.

@ole That change was also present in 4.2 (it just wasn't properly guarded by -swift-version 5), so is unrelated to this regression as far as I can tell.
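As a quick sanity check of that cast (my own sketch, run on a toolchain where SR-9837 is fixed), `nil` of one Optional type can indeed be cast to a completely unrelated Optional type in a non-generic context:

```swift
// Optional<Int>.none can be dynamically cast to Optional<String>.none,
// even though Int and String are unrelated types.
let none: Int? = nil
let cast = none as Any? as? String?

// The cast succeeds: `cast` is .some(Optional<String>.none), not nil.
assert(cast != nil)
```

This matches the 4.2 behaviour of the generic `test(_:)` above, which is why the Swift 5 result reads as a regression rather than an intended change.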


Thanks for the suggestion! I had a similar thought late last night. In my actual use case I need to store `nil` as the generic type, so I initially thought a protocol like yours wouldn't work, because I can't use it if I give my protocol an associated type. Then I realized that I can get `nil` initialization of an Optional from a similar "hacky" protocol without the associated type. At any rate, I'll wait for a fix if this is deemed a bug in the end!


Thanks! I didn't have the confidence to call it a bug myself, but I'll be stoked to keep using the pattern. It's ever-so-slightly cleaner than an internal protocol-conformance workaround, IMO.
