I created this pattern by accident, and it exhibited some odd behavior that made me curious about it. It can probably be explained by some aspect of ARC that I don’t yet grasp:
class Parent {
    var child: Child!
    // Replaces the current child, dropping the only strong reference
    // to the previous one.
    func refreshChild() { child = Child(self) }
}

class Child {
    weak var parent: Parent?
    init(_ p: Parent?) { parent = p }
    func triggerRefreshChild() {
        parent?.refreshChild()           // self's retain count may now be zero
        assert(parent?.child !== self)   // the parent now owns a different Child
        print("point of interest", self)
    }
}
let parent = Parent()
parent.refreshChild()
parent.child.triggerRefreshChild()
The idea is that a Parent holds the only strong reference to its Child, and the Child holds a weak reference to its Parent. Additionally, a Child can cause its Parent to instantiate a new Child, thereby dropping the retain count of the existing Child to zero.
At the line marked point of interest, what is the status of the Child object then known as self?
Can it be immediately deallocated, because its retain count is zero?
Or does the triggerRefreshChild method create an implicit strong reference to self that lasts until the method exits? I’m supposing the answer is yes, because IIRC child.triggerRefreshChild() is shorthand for Child.triggerRefreshChild(child)()
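If that shorthand is right, it can be demonstrated directly (a sketch; whether the ordinary call form implies a strong reference for the duration of the call is exactly my question):

// The unbound method reference has type (Child) -> () -> Void.
let unbound = Child.triggerRefreshChild
// Partially applying it to an instance yields a () -> Void closure
// that captures the instance strongly.
let bound = unbound(parent.child)
// Calling it is equivalent to parent.child.triggerRefreshChild().
bound()

In this curried form, bound clearly keeps the child alive; what I’m unsure of is whether the ordinary call does the same.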
In the calling statement parent.child.triggerRefreshChild(), the compiler ensures that the child reference is an owning reference before calling into triggerRefreshChild, and doesn't release it until after the call returns.
In general, instance methods don't retain self on entry, for performance reasons. Therefore, it's the compiler's job at the call site to make sure that something owns child for the duration of the call.
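Conceptually, the call site behaves something like this (a sketch of ARC's bookkeeping, not the code the compiler actually emits):

let tmp: Child = parent.child   // strong load: the Child's retain count goes up by one
tmp.triggerRefreshChild()       // self is passed as a borrowed reference; tmp owns it for the call
// tmp is released here, which may drop the Child's retain count to zero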
Not to be pedantic, but ackchyually, Swift only guarantees that strong references last until their last use, unlike C/C++, which guarantee lifetimes until the end of the enclosing scope.
IIRC, until recently, most (all?) lifetimes did happen to last until the end of their containing scope, but recent compiler optimizations let objects be released more aggressively, as early as their last use. This optimization is gated by the "Optimize Object Lifetimes" setting in Xcode (which maps onto SWIFT_OPTIMIZE_OBJECT_LIFETIME).
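A minimal sketch of what that optimization permits (Tracker is a hypothetical class, and whether the early release actually happens depends on the optimizer):

class Tracker {
    deinit { print("Tracker deinit") }
}

func demo() {
    let t = Tracker()
    print("created", t)
    // With "Optimize Object Lifetimes" enabled, "Tracker deinit" may print
    // here, before the closing brace, because t is never used again.
    print("end of scope")
}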
Sure, but the call I'm talking about is triggerRefreshChild(), not the function that contains this line of code. As soon as triggerRefreshChild() returns, the compiler is OK to release the reference — assuming that no subsequent code forces the compiler to keep it around longer.
Here’s the follow-up question: at the line marked point of interest, what is the status of the weak reference to parent? (Assume for the sake of the question that the reference count of parent never drops to zero)
My understanding is that it should remain valid, because the Parent object’s retain count is nonzero.
But this understanding seems to be incomplete, because in practice (writing a macOS GUI app) I’ve found that once the retain count of the surrounding Child object drops to zero, its weak reference to Parent becomes nil. If I want to continue to use parent, I have to explicitly upgrade it to a strong reference, say with
guard let strongParent = self.parent else { fatalError() }
Is there some detail of weak references that explains this behavior?
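Concretely, the version that behaves as I’d expect looks something like this (a sketch of the workaround, not the full app code):

func triggerRefreshChild() {
    // Upgrade to a strong reference *before* refreshChild() drops the
    // last strong reference to self.
    guard let strongParent = self.parent else { fatalError() }
    strongParent.refreshChild()
    // strongParent remains usable here even though self's retain count
    // may already be zero.
    print("point of interest", strongParent)
}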
Strong references are free to be released after their last use, as @AlexanderM mentions. In this case, I believe that it's possible that the last "use" of parent in the outer scope occurs after the call to parent.child. Since parent isn't used in the call to triggerRefreshChild, I believe the compiler would be free to release parent before that call, which could cause the refcount to drop to zero if there are no other strong references.
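If that's what's happening, one way to test it would be to pin parent's lifetime explicitly with the standard library's withExtendedLifetime (a sketch):

withExtendedLifetime(parent) {
    // parent is guaranteed to stay alive until this closure returns,
    // so its retain count can't reach zero during the nested call.
    parent.child.triggerRefreshChild()
}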
An aside for the curious: auto-zeroing weak references are actually really tricky from an implementation perspective. Once the referenced object is deallocated, how do you know where the outstanding weak references to it are? You need to find them in order to nil them out.
Swift does this by actually implementing weak references as strong references... to a side table. This side table has a reference to the target object, and tracks whether it has been de-allocated.
Weak references themselves are nil-ed out lazily. I.e., they'll be non-nil until the first attempted access after the target object got deallocated. At that point, the weak ref is nil-ed out, and the side table has its reference count decremented. Once every weak ref has been touched like this, the side table's RC goes to 0, and it itself gets deallocated.
I don't think this is something you can observe from Swift code (outside of unsafe pointer/bitcast/unmanaged APIs), but you might see it in the debugger.
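The user-visible contract is easy to demonstrate, though (a minimal sketch, unrelated to the classes above):

class Target {}
var strong: Target? = Target()
weak var weakRef = strong
strong = nil            // the last strong reference is gone; Target is deallocated
assert(weakRef == nil)  // the first access after deallocation reads nil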
they'll be non-nil until the first attempted access after the target object got deallocated.
Yes thanks, that makes sense. But the behavior I’m describing — which I’ve observed in my app — contravenes this rule of thumb, which is why I’m confused.
In my code, the Parent object that’s the target of the weak reference remains alive and never gets deallocated. (I can see it persisting in the debugger, but it’s an NSView that is visible on screen, so I really do know it’s alive!) Nevertheless, when the child object holding the reference reaches a retain count of zero, its weak reference to parent becomes nil.
This is weird to me, specifically because it inverts the usual intuition of how weak references work. But it happens consistently. So I have to conclude there is some subtlety of weak references that I’m missing.
(It’s possible there’s a dumb bug in my code, but the fact that introducing strongParent makes everything work suggests that it is a peculiar manifestation of the weak/strong dance.)
When child's retain count reaches zero, it is by definition deallocated. It's no longer an object. As part of the deallocation, I would expect the weak reference to be discarded, which probably ends up with the reference's pointer being set to nil. You might be able to "see" this by examining the dangling object reference in the debugger, I suppose.
You can't validly interrogate the properties on a deallocated object.
The WWDC ARC video persuaded me that everything that happens after the retain count drops to zero is emergent behavior that arises from the quiddities of the compiler and can’t be relied upon. Therefore, the best policy is to redesign class relationships to avoid this possibility.
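For example, one redesign that avoids the hazard is to defer the swap so that self is never destroyed while one of its methods is still on the stack. A sketch, assuming a main-queue GUI context (the DispatchQueue call is my addition; it isn't in the original code):

import Dispatch

class Child {
    weak var parent: Parent?
    init(_ p: Parent?) { parent = p }
    func triggerRefreshChild() {
        // Schedule the refresh instead of performing it inline, so the
        // current Child is not deallocated while this method is executing.
        DispatchQueue.main.async { [weak self] in
            self?.parent?.refreshChild()
        }
    }
}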