So it's resolving to these funcs. But with only nil literals, how can the compiler infer the type to be "Any"? Does the "Any" type include "no type"? I think the compiler should reject these expressions with a "no type can be inferred" error.
nil here is not Any. It is Any.Type?, which is an (optional of a) metatype referring to any type that is a subtype of Any, i.e., everything.
let a: Any = nil            // error
let b: Any.Type? = nil      // ok
let c: Any.Type? = Int.self // ok: the metatype of Int
So those overloads are applicable and fully specified.
I guess that's the most specific applicable overload. If no other applicable overload is as specific or more specific, there's no ambiguity, and so Swift will just pick that.
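A minimal sketch of that resolution behavior (these overloads are mine, not the ones from the original code):

func pick(_ value: Any) { print("Any") }
func pick(_ value: Any.Type?) { print("Any.Type?") }

pick(nil) // prints "Any.Type?": a bare nil literal can't become Any, so the
          // optional metatype overload is the only applicable one, and Swift picks it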
They can also be compared to values with any positive number of additional layers: Optional<Int>.none can be compared to a variable with any positive number of additional "layers" of optionality, and nil can be compared to a variable with any positive number of "layers" of optionality.
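For example (a small sketch; the variable name is mine), both comparisons compile against a triply-nested variable:

let deeplyNested: Int??? = 42

_ = Optional<Int>.none == deeplyNested // the left side is promoted up to Int???
_ = nil == deeplyNested                // the literal is simply inferred as Int???.none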
However, there is a strange and unintuitive behavior here (see the sketch just after this list):
- When comparing against a nil literal, the runtime performs "greedy" implicit wrapping, assuming the nil is always at the bottom layer of optionality (with the most possible layers of wrapping around it and no Optional.some() cases).
- However, when comparing two variables with differing levels of optionality, the runtime performs "lazy" implicit wrapping, with at most one implicit wrap, which is always Optional.some().
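Here is a small sketch of the difference (variable names are mine):

let a: Int? = nil  // Optional<Int>.none
let b: Int?? = nil // Optional<Optional<Int>>.none

print(nil == b)                // true:  the literal becomes the outermost Int??.none
print(a == b)                  // false: a is wrapped once, into Int??.some(Optional<Int>.none)
print(Optional<Int>.none == b) // false: the same single .some() wrap as above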
Now consider that:
- the benefit of optionals to programmers is they safeguard against treating a variable as if it had a value, when it might be nil
- yet optionals can masquerade as non-optionals when they are passed into a generic (a small illustration follows this list)
- matters get very complicated once you are dealing with generics in the context of property wrappers, keypaths, collections, and optional stored properties whose default value might be nil
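As a small illustration of the second point (the function and variable names here are mine):

func describe<T>(_ value: T) -> String { "\(T.self): \(String(describing: value))" }

let maybeCount: Int? = nil
print(describe(maybeCount)) // "Optional<Int>: nil"
                            // T is inferred as Int?, so the "non-optional" T
                            // silently carries an optional inside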
For example, consider the following scenario and code.
Let's say I'm going to throw a "white elephant" holiday party, where guests bring gifts, some of which are "joke gifts" that can be empty (or can be nested presents where the final present is empty) and some of which have a present inside. All the gifts go into a sack, and people draw out presents at random.
Holidays begin with these three protocols:
public protocol SantaSackable {
    subscript<W: WhiteElephantParticipant, Gift>(_: WritableKeyPath<W, Gift>) -> Gift? { get set }
}

public protocol WhiteElephantParticipant {
    var santaSack: SantaSackable { get }
}

public protocol Unwrapping {
    associatedtype Gift
    var `default`: Gift? { get set }
    func value<W: WhiteElephantParticipant>(instance: W, keyPath kp: WritableKeyPath<W, Gift>) -> Gift
}
The host has established a format for how the party will proceed. Each guest is represented by this class:
public final class Guest: WhiteElephantParticipant {
    public var santaSack: SantaSackable = SantaSack()
    @Unwrappable public var gift_1: Float? = 42
    @Unwrappable public var gift_2: Float?
}
Note: To protect against a case where a guest could come away with two empty presents, a default gift of 42 is provided.
Now, since everyone loves getting Apple products as gifts, our host has devised this gift-unwrapping wrapper:
@propertyWrapper
public struct Unwrappable<Gift>: Unwrapping {
    public typealias Gift = Gift
    public var `default`: Gift?

    public static subscript<U: Unwrapping, W: WhiteElephantParticipant>(
        _enclosingInstance instance: W,
        wrapped kp: WritableKeyPath<W, U.Gift>,
        storage storageKeyPath: WritableKeyPath<W, U>
    ) -> U.Gift {
        get {
            let wrapper = instance[keyPath: storageKeyPath]
            return wrapper.value(instance: instance, keyPath: kp)
        }
        set {
            var wrapper = instance[keyPath: storageKeyPath]
            wrapper.default = newValue
        }
    }

    public func value<W: WhiteElephantParticipant>(instance: W, keyPath kp: WritableKeyPath<W, Gift>) -> Gift {
        guard let value = instance.santaSack[kp] ?? `default` else {
            fatalError("SantaSack \(instance.santaSack) is missing value for path \(kp) which has no default value.")
        }
        return value
    }

    @available(*, unavailable)
    public var wrappedValue: Gift {
        get { fatalError("called wrappedValue getter") }
        set { fatalError("called wrappedValue setter") }
    }

    public init() {}

    public init(wrappedValue: Gift) {
        self.default = wrappedValue
    }
}
Finally, each guest gets a SantaSack:
public class SantaSack: SantaSackable {
    var gifts: [AnyKeyPath: Any] = [:]

    public init() {}

    public subscript<W: WhiteElephantParticipant, Gift>(key: WritableKeyPath<W, Gift>) -> Gift? {
        get {
            gifts[key as AnyKeyPath] as? Gift
        }
        set {
            gifts[key as AnyKeyPath] = newValue
        }
    }
}
Now the party begins and the first guest gets to open presents.
let bob = Guest()
print(bob.gift_1) // prints nil instead of Optional(42)
print(bob.gift_2) // prints nil
Oh dear, Bob should have gotten at least one real present! But he got two empty presents.
How can we fix this? Well, we can replace SantaSack with this:
public class SantaSack: SantaSackable {
    var gifts: [AnyKeyPath: Any] = [:]

    public init() {}

    public subscript<W: WhiteElephantParticipant, Gift>(key: WritableKeyPath<W, Gift>) -> Gift? {
        get {
            gifts[key as AnyKeyPath] as? Gift? ?? nil // <= Mysterious fix!
        }
        set {
            gifts[key as AnyKeyPath] = newValue
        }
    }
}
Now Bob gets 42, but the host is astonished.
@xwu I'm curious what your thoughts are on this situation.
Note that if we add this to the body of SantaSack's subscript's getter, we get:
let gift_1 = gifts[key as AnyKeyPath] as? Gift
let gift_2 = gifts[key as AnyKeyPath] as? Gift? ?? nil
print("\(Mirror(reflecting: gift_1)) — \(gift_1)")
// prints: Mirror for Optional<Optional<Float>> — Optional(nil)
print("\(Mirror(reflecting: gift_2)) — \(gift_2)")
// prints: Mirror for Optional<Optional<Float>> — nil
(There's probably a much simpler example, but since this was the scenario in which the issue was encountered, it seemed best to give the full original example that led me to post this thread; I think my first attempt did not capture the exact problem well enough.)
No. What’s involved here is the static type. The runtime is not doing any of this.
Yes, things get very complicated when you mix type erasure, dynamic casts, and static typing. Swift trusts users to use their judgment about how to combine powerful features instead of making those features less powerful. My thoughts are that you should not do this, regardless of what the result of such an expression would be.
You'd want to be more careful when using generics together with dynamic (runtime) casting via as?.
let a: Any? = nil // i.e. gifts[key as AnyKeyPath]

func XX<T>(_: T.Type) -> T? {
    a as? T
}

func YY<T>(_: T.Type) -> T? {
    a as! T?
}

print(XX(Float?.self)) // Optional(nil), i.e. Float??.some(nil)
print(YY(Float?.self)) // nil, i.e. Float??.none
In the first one, Swift converts Any?.none to Float?.none, then wraps it in another optional layer, resulting in Float??.some(Float?.none).
In the second one, Swift converts Any?.none straight to Float??.none.
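Spelling those two results out by hand, without generics (a sketch; the variable names are mine):

let inner: Float? = nil             // Any?.none converted to Float?.none
let wrapped: Float?? = .some(inner) // the extra layer added by as?, i.e. XX's result
print(String(describing: wrapped))  // "Optional(nil)"

let direct: Float?? = .none         // the nil propagated to the outermost layer, i.e. YY's result
print(String(describing: direct))   // "nil"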
The type is indeed static. T is known at compile time to be Float?, but its interaction with as? can be subtle when the generic parameter itself contains an optional.
On an unrelated note:
You might want to make Unwrapping refine AnyObject (i.e. make Unwrapping class-bound) because of this line here, but it's a very complicated system, as you said. It's hard to tell.
Also, you don't need to explicitly annotate with as AnyKeyPath. You can just use gifts[key].
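A quick sketch of that point (the Point type is just for illustration): key paths are classes, so a WritableKeyPath can be used as an AnyKeyPath dictionary key with no explicit upcast:

struct Point { var x = 0 }

var store: [AnyKeyPath: Any] = [:]
let kp = \Point.x // WritableKeyPath<Point, Int>, a subclass of AnyKeyPath
store[kp] = 1     // no `key as AnyKeyPath` annotation needed; the upcast is implicit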
As that's done in a static subscript, Unwrapping doesn't need to be class-bound. However, WhiteElephantParticipant might want to be class-bound, since the _enclosingInstance API actually passes a ReferenceWritableKeyPath. (That's why I made Guest a class.) I wish this weren't the case, but we discussed that in another thread, and I'm told that since subscripts don't support inout parameters we're stuck with this limitation for now.
Thank you, good to know.
I'm still unclear, though, why it's beneficial for there to be two different treatments of nil based on whether we're comparing a nil literal or a variable to which a nil literal was assigned. In what way does this behavior make Swift better/more powerful? Just trying to understand the rationale.
Note: Swift 5.3 seems to have introduced a compiler bug around this, given the following scenario we discovered today:
let a: Int? = nil  // Optional<Int>.none
let b: Int?? = nil // Optional<Optional<Int>>.none

assert(a as! Int?? == b)
// - the above assertion PASSES in Xcode 12.0 playground (Swift 5.3), with the compiler warning:
//   "Forced cast from 'Int?' to 'Int??' always succeeds; did you mean to use 'as'?"
// - however, the above assertion FAILS in Mac Playgrounds.app, Swift 5.1

assert(a as Int?? == b)
// - the above assertion FAILS in Xcode 12.0 playground (Swift 5.3),
//   i.e. if you follow the compiler's suggestion, your code breaks
// - this assertion also FAILS in Mac Playgrounds.app, Swift 5.1
Regarding my long example in a prior post above, @xwu wrote:
Yes, things get very complicated when you mix type erasure, dynamic casts, and static typing. Swift trusts users to use their judgment about how to combine powerful features instead of making those features less powerful. My thoughts are that you should not do this, regardless of what the result of such an expression would be.
Well, I would tend towards a general design principle that, when powerful parts can be used together smoothly, it makes the overall product more powerful (not less).
For example, I'd like to use the powerful features of Swift together, without any of them misbehaving or leading to unexpected results. Generally speaking, failure under stress can be addressed as a design flaw.
Do you think it's impossible for there to be a design flaw in how nil is currently handled? Is it the case that any time a user runs into a situation where the language behaves unexpectedly, it's their fault for using it wrong?
I'd like to know why the current design is inherently better than the alternatives, which might include:
- throw an error whenever someone passes an optional value into a function argument whose type is a non-optional generic parameter, possibly based on an optional compiler flag or new symbol (at the level of the parameter or function)
- return true if any two optionals are compared where a .none case is nested at some level of either
- provide a new unwrapping symbol (an alternative to ! or ?) that removes all levels of optional wrapping from a value, no matter how many there are (a greedy unwrapper to match the behavior of nil literal assignment)
- warn users any time they are comparing two values that are of differing levels of optionality
- allow only one level of optionality at most, i.e. wrapping a value that's already optional does nothing
I'm not saying I think any of the above suggestions is the best solution, but I'm curious what exactly the benefit of the current behavior is (for example, why it's seen by some as more powerful). (I apologize in advance if some or all of these ideas have been discussed before, ad nauseam, but maybe those discussions did not fully consider the implications of KeyPaths and PropertyWrappers, etc.)
Thanks for any feedback.
Curious, I just tried it in a playground. Ignoring the faulty getter: if Unwrapping is a struct, it won't get written back (anywhere). If it's class-bound, it's written to default.
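A minimal illustration of why the struct version's setter has no visible effect (the types here are mine): reading the wrapper out of the enclosing instance produces a copy, and mutating the copy never writes back:

struct Box { var value = 0 }

final class Holder { var box = Box() }

let holder = Holder()
var local = holder.box  // a copy, because Box is a struct
local.value = 42
print(holder.box.value) // 0: the mutation stays on the local copy

If the wrapper were class-bound, local would be a reference to the same object, so the assignment would also be visible through holder.box.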
Take a look: Challenge: Flattening nested optionals
Nesting usually doesn't happen unless you're using generics, at which point you are very much interested in each level of optionality. There's a difference between a dictionary not containing a value and containing nil.
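For example (a sketch with made-up keys), a dictionary whose values are optional distinguishes all three situations:

let scores: [String: Int?] = ["alice": 10, "bob": nil]

print(scores["alice"] as Any) // Optional(Optional(10)): key present, value is 10
print(scores["bob"] as Any)   // Optional(nil):          key present, value is nil
print(scores["carol"] as Any) // nil:                    key absent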
And to put a finer point on this, @1oo7, even in the use case you've described in this post, where you have some sort of @Defaultable wrapper, it may make perfect sense to use @Defaultable with an optional-typed variable. Consider a situation like:
struct Foo {
    @Defaultable(to: 42) var count: Int?
}
This describes a model where count may take on a value of Int?.none, but if the user doesn't set count manually, it will default to 42 (i.e., Int?.some(42)). If @Defaultable were set up in such a way that it tried to look "through" optionals within its generic parameters, as I think you're suggesting, you'd end up with the following behavior:
var foo = Foo()
print(foo.count) // Optional(42)
foo.count = nil
print(foo.count) // Optional(42), huh??
Modeling the concept of "default to this value unless explicitly unset" is a perfectly reasonable model that would be impossible to represent under your proposed semantics (if I'm understanding you correctly).
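For contrast, this is the behavior the hypothetical @Defaultable is meant to have under the current semantics (same Foo as above):

var foo = Foo()
print(foo.count) // Optional(42): never set, so the default applies
foo.count = nil
print(foo.count) // nil: the explicit unset is preserved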
As mentioned before, if dealing with nested optionals feels too confusing (it often is for me), I highly recommend departing from Optional and modeling relationships with your own domain-specific enum, which will quash all of the implicit behavior of optionals and should make much clearer what's really going on.
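For instance, a minimal sketch of a domain-specific enum for the gift example above (the names are mine):

enum GiftSlot<Gift> {
    case missing       // nothing stored for this key path at all
    case empty         // a "joke gift": stored, but with nothing inside
    case present(Gift) // a real gift
}

let slot: GiftSlot<Float> = .empty
switch slot {
case .missing:           print("nothing was drawn")
case .empty:             print("an empty joke gift")
case .present(let gift): print("got \(gift)")
}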