There's a benefit of this alternative that I think is worth mentioning: it would be closer to the behavior of deinits. The body of a `deinit` has access to the value's stored properties, but suppresses the implicit call to the value's deinitializer. The behavior of `deinit` could then be explained as: a `deinit` behaves as if it had an implicit `discard self` at the beginning. And the behavior of `discard self` could be explained as: after a `discard self` statement is executed, the rest of the method behaves as if it were a `deinit` body.
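A sketch of what that symmetry would look like (illustrative only — this shows the alternative's semantics, not SE-0390 as pitched; `close` is the C `close(2)` function):

```swift
struct FileDescriptor: ~Copyable {
  let fd: CInt

  deinit {
    // Behaves as if it began with an implicit `discard self`:
    // stored properties remain readable, and no second deinit runs.
    close(fd)
  }

  consuming func take() -> CInt {
    discard self // from here on, the body behaves like a deinit body
    return fd    // under this alternative, `fd` would stay accessible
  }
}
```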
This alternative would have a wrinkle: under the Mutation and consumption in non-`Copyable` type `deinit`s pitch, we want to allow a `deinit` to transfer ownership of `self` in order to delegate its destruction. To do that, we need a way to "cancel" the implicit `discard self` in a `deinit`. But an analogous feature for cancelling an explicit `discard self` within a method, such that ownership of `self` could be transferred after an explicit `discard self`, would add extra complexity and be unintuitive.
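For concreteness, the kind of delegation that pitch wants to allow might look like this (a sketch; `ConnectionPool` and its consuming `recycle` method are hypothetical names, not from the pitch):

```swift
struct Connection: ~Copyable {
  let socket: CInt

  deinit {
    // Rather than destroying the connection here, hand `self` off to
    // a pool, delegating destruction. Forwarding ownership like this
    // is what would require "cancelling" the implicit `discard self`
    // that the alternative places at the top of every deinit.
    ConnectionPool.shared.recycle(self) // hypothetical consuming API
  }
}
```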
With this pitch as-is, the behavior of `deinit` can be explained another way: a `deinit` behaves as if it had an implicit `discard self` at the end[1] (unless ownership of `self` was transferred beforehand). That works, but it creates an inconsistency: while deinits implicitly call `discard self` as late as possible, we would likely encourage consuming methods to call `discard self` as early as possible in order to minimize control-flow issues (for example, SE-0390's `FileDescriptor.close()` example).
This inconsistency would likely manifest in consuming methods following a specific pattern that sets them apart from deinits: first, read or consume any stored properties that will be needed later (such as a file descriptor); second, call `discard self`; third, release the manually managed resources that `self` owned (such as closing the file descriptor).
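That three-step pattern might look like this (a sketch modeled on SE-0390's example; `close` with an argument resolves to the C `close(2)` function):

```swift
struct FileDescriptor: ~Copyable {
  private let fd: CInt

  init(fd: CInt) { self.fd = fd }
  deinit { close(fd) }

  consuming func close() {
    let fd = self.fd // 1. read stored properties needed later
    discard self     // 2. suppress the implicit deinit call
    close(fd)        // 3. release the manually managed resource
  }
}
```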
I suppose there's another way to solve this inconsistency while keeping the behavior of this pitch as-is: allowing people to write `defer { discard self }`. Then, a `deinit` would behave as if it had an implicit `defer { discard self }` at the beginning; and after a `defer { discard self }` is executed, a method behaves as if it were a `deinit`. This would let people write consuming methods that "act like" deinits without needing to follow the elaborate pattern described above.
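Under that extension, a consuming method could release its resources without the reordering (hypothetical — `defer { discard self }` is not legal under the pitch as written; `close` is the C `close(2)` function):

```swift
struct FileDescriptor: ~Copyable {
  let fd: CInt
  deinit { close(fd) }

  consuming func closeEarly() {
    // Hypothetical: `defer { discard self }` suppresses the implicit
    // deinit call at scope exit, so the rest of the body "acts like"
    // a deinit body and can keep using `fd` directly.
    defer { discard self }
    close(fd)
  }
}
```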
and before any early exits ↩︎