I believe the issue is that to be fully automatic, the compiler also has to be more conservative. The various attributes and pseudo-functions being proposed are meant to direct the compiler toward a less conservative choice, while also guaranteeing consistency via explicit code.
Of course this particular example can be optimized. But the compiler either needs to know something about the Array type or about the side effects of Array.sort. Wrapping the array in an existential and wrapping the sort in an opaque function could defeat the optimization. The very small minority of programmers who need strict uniqueness guarantees should be able to express that intention independently of the capabilities of the optimizer.
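To make that concrete, here's a minimal sketch (SortableBox and WidgetList are made-up names, not anything from the example above): once the value is boxed as an existential and the sort happens behind a protocol requirement, the optimizer loses the local knowledge it would otherwise have.

```swift
protocol SortableBox {
    mutating func sortContents()
}

struct WidgetList: SortableBox {
    var items: [Int]
    // The concrete implementation just sorts the array in place.
    mutating func sortContents() { items.sort() }
}

// Stored as `any SortableBox`, the call below goes through the witness
// table; the compiler no longer sees "Array.sort on a uniquely referenced
// buffer", only an opaque mutating call on a boxed value.
var box: any SortableBox = WidgetList(items: [3, 1, 2])
box.sortContents()
```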
I used errno as an example because, although it is ancient, it is essential to working with C and POSIX APIs, and the problem of runtime calls clobbering it is pervasive. You could imagine similar problems arising anywhere there's shared state between deinits and other code, though, in a model that places absolutely no constraints on where value lifetimes can theoretically be ended.
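As a concrete sketch of the errno hazard (Logger and readByte are hypothetical; the only point is where a deinit is allowed to run):

```swift
#if canImport(Darwin)
import Darwin
#else
import Glibc
#endif

final class Logger {
    let path: String
    init(path: String) { self.path = path }
    deinit {
        // Cleanup that itself makes a syscall, and so can overwrite errno.
        unlink(path)
    }
}

func readByte(from fd: Int32) -> UInt8? {
    let logger = Logger(path: "/tmp/example.log")
    print("reading via \(logger.path)")   // last use of `logger`
    var byte: UInt8 = 0
    let result = read(fd, &byte, 1)       // sets errno on failure
    // In a model with no constraints on lifetime shortening, `logger` could
    // be released (and its deinit run) anywhere after its last use above,
    // including between `read` and the errno check below.
    guard result == 1 else {
        perror("read")
        return nil
    }
    return byte
}
```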
One thing to note is that we do still want full freedom to shorten lifetimes for "value types" and other well-behaved values: those whose deinitializers have no side effects beyond releasing memory, and which are never weak-referenced, never pointed into except through limited-access APIs like withUnsafeBufferPointer, and never involved with ObjC esoterica like associated objects. No matter what model we choose as a baseline for general value lifetimes, for values of types known to have this "default-deinit-behavior" property, we should be able to do better, since the shortening of such values' lifetimes should not be observable by other code in the program. Such conditional behavior has limitations, of course, since we have to take the conservative tack through abstractions that aren't limited to well-behaved types, but you should ultimately have the ability to say you're living within that constrained, well-behaved world.
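A minimal sketch of why those conditions matter, using a hypothetical Session class: the shortening only becomes observable because the deinit has a side effect and the instance is weak-referenced.

```swift
final class Session {
    deinit { print("session torn down") }   // observable side effect
}

func demo() {
    let session = Session()
    weak var observer = session             // last use of `session`
    // With the lifetime shortened to the last use, `session` may already
    // have been released here, so `observer` can be nil and the deinit's
    // print can run earlier than a reader of the source might expect.
    print(observer == nil)
}
```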
This seems pretty unrealistic to me. I don't see how the compiler can ever know—without whole-program analysis—that a closure, an Any, a [Widget], an Error, or my hand-written CoW collection (where I manipulate memory in a ManagedBuffer) isn't keeping some class instance with a nontrivial deinit alive. In fact, in the latter case, it is doing exactly that; it's just that its deinit ordering probably doesn't matter.
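For concreteness, here's a bare-bones sketch of that kind of hand-written CoW collection (MiniArray and Storage are illustrative names; capacity growth and the actual copy-on-write clone are omitted):

```swift
final class Storage<Element>: ManagedBuffer<Int, Element> {
    deinit {
        _ = withUnsafeMutablePointers { count, elements in
            elements.deinitialize(count: count.pointee)
        }
    }
}

struct MiniArray<Element> {
    private var storage: Storage<Element>

    init(capacity: Int) {
        // The header stores the element count.
        storage = Storage<Element>.create(minimumCapacity: capacity) { _ in 0 }
            as! Storage<Element>
    }

    mutating func append(_ element: Element) {
        if !isKnownUniquelyReferenced(&storage) {
            // Copy-on-write clone would go here.
        }
        storage.withUnsafeMutablePointers { count, elements in
            (elements + count.pointee).initialize(to: element)
            count.pointee += 1
        }
    }
}

// If Element is (or contains) a class with a nontrivial deinit, MiniArray
// keeps those instances alive through raw memory that no local analysis
// can see into.
```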
If it's modeled as a protocol, then Widget can be declared to conform to DefaultDeinit, Array can conditionally conform with Array: DefaultDeinit where Element: DefaultDeinit, and you can write any DefaultDeinit instead of Any, any DefaultDeinit & Error instead of Error, and so on.
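Roughly, assuming DefaultDeinit were an ordinary marker protocol (none of this exists today; the names just follow the discussion above):

```swift
// Hypothetical marker protocol: conforming types promise their deinit has
// no observable side effects beyond freeing memory.
protocol DefaultDeinit {}

final class Widget: DefaultDeinit {
    deinit { /* only releases memory */ }
}

// Conditional conformance: an array is well-behaved if its elements are.
extension Array: DefaultDeinit where Element: DefaultDeinit {}

// Constrained existentials replace the fully type-erased ones.
func store(_ value: any DefaultDeinit) { /* ... */ }
func report(_ error: any DefaultDeinit & Error) { /* ... */ }
```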
Sounds like exactly the kind of mandatory boilerplate and language complexity I've been worried about. 99% of all types—even classes!—represent something whose deinit ordering shouldn't matter (as long as it's after the last use). You'll need to explain this protocol to people, along with the language's different behaviors for different types, all because of obscure situations involving weak references.
Also, DefaultDeinit is clearly not the right name for it, because my ManagedBuffer doesn't have a default deinit. So what property is this protocol actually modeling? That needs to be clearly describable for it to be a foundational thing in the language.
I am so looking forward to inout and ref, especially for destructuring enums like this.
Thanks for pointing me to this! I would like to ask whether some of these are implemented. I've seen behavior changes in Swift 5.10 that closely match the "default to consuming" behavior (What's new about lifetime in Swift 5.10? - #3 by liuliu) and would like to get more guidance on the proper fix on my end.