[Request for Feedback] Partial Consumption of Fields of NonCopyable Types

You are correct. To do so, you would need to define a consuming method that extracts the buffer and then uses discard. But you would still have the same sort of issue.
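To make that consuming-method-plus-discard pattern concrete, here is a minimal sketch. The names `Descriptor` and `take()` are my own, chosen for illustration; this assumes SE-0390's noncopyable types and the rule that discard is limited to types whose stored members are trivial:

```swift
// A noncopyable wrapper around a trivial resource handle.
// `Descriptor` and `take()` are hypothetical names for illustration.
struct Descriptor: ~Copyable {
    let fd: Int32

    init(fd: Int32) { self.fd = fd }

    deinit {
        // A real type would close(fd) here; we just log.
        print("closing \(fd)")
    }

    // Transfers ownership of the raw value to the caller. `discard self`
    // suppresses the deinit so the resource is not cleaned up twice.
    consuming func take() -> Int32 {
        let raw = fd
        discard self
        return raw
    }
}
```

With this shape, extracting fd requires consuming the whole value, which is the "same sort of issue": you cannot partially consume one stored property while keeping the rest of the value alive.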

Keeping in mind that I was typing quickly and that was the source of that error... I am going to edit the original text above to add a real, compilable, and testable implementation of a FixedSizeQueue that we can use to guide our discussion. I think it will help us avoid small mistakes like that, since any one of us can just compile the example while playing with it and discover such mistakes.

I actually had some text to add about discard, but I forgot to include it in the original document. I will add it above and then ping the thread; let's discuss this after that. Thank you for pointing this out.

Just to recap for anyone following this thread who may be unfamiliar with the situation Kavon is talking about: if one throws from a throwing init or returns nil from a failable init, we destroy the already-initialized stored properties as usual, but we do not run the deinit. For anyone who is interested in validating this, here is a small example:

final class Klass {}

class C {
    var k1: Klass
    var k2: Klass

    init?(_ bool: Bool) {
        k1 = Klass()

        if !bool {
            return nil
        }

        k2 = Klass()
    }
}

The -Onone SIL we emit along the failure path is:

bb1:                                              // Preds: bb0
  %15 = ref_element_addr %0 : $C, #C.k1           // user: %16
  %16 = begin_access [deinit] [static] %15 : $*Klass // users: %18, %17
  destroy_addr %16 : $*Klass                      // id: %17
  end_access %16 : $*Klass                        // id: %18
  %19 = metatype $@thick C.Type                   // user: %20
  dealloc_partial_ref %0 : $C, %19 : $@thick C.Type // id: %20
  %21 = enum $Optional<C>, #Optional.none!enumelt // user: %22
  br bb3(%21 : $Optional<C>)                      // id: %22

As one can see, we destroy just k1 and not k2, since k2 was not yet initialized. Then dealloc_partial_ref is invoked, which frees the object's memory without calling the deinit.
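This behavior is observable from plain Swift as well. Below is a self-contained variant of the example that records destruction in a global array (the Bool parameter and the recording are my additions so the example compiles and its behavior can be checked):

```swift
// Records the order in which values are destroyed.
var destroyed: [String] = []

final class Klass {
    let name: String
    init(_ name: String) { self.name = name }
    deinit { destroyed.append(name) }
}

final class C {
    var k1: Klass
    var k2: Klass

    // This deinit never runs along the failure path.
    deinit { destroyed.append("C") }

    init?(_ bool: Bool) {
        k1 = Klass("k1")

        if !bool {
            return nil
        }

        k2 = Klass("k2")
    }
}

// Failure path: only k1 was initialized, so only k1 is destroyed;
// "C" never appears because C's deinit does not run.
_ = C(false)
print(destroyed)
```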

With that in hand, if I am understanding you correctly, Kavon, I think what you are asking is: can we make it so that we destroy the remaining live parts of the type without invoking the deinit, just as we do in the class example above?

While those semantics are implementable in the compiler, I think they create the wrong default behavior. Specifically, off the top of my head:

  1. In the cases above, an init has not completely run, so it is reasonable to say that a user cannot expect any type-level contract/invariant, like running a deinit, to be enforced by the language. Once an object has been completely initialized, our programming model should err on the side of the deinit running if one is provided, due to that type-level contract. That is an important invariant that users would expect to hold by default unless there is something very explicit in the code, like discard, that signals the break from the default. In contrast, by making an assignment silently turn off the deinit, we would insert into our programming model a mistake that is very easy to make and very hard to diagnose. Consider how many assignment statements appear in a large app, and imagine a programmer having to track down which one of those specific assignments caused a deinit to be turned off. That is what is so great about discard! It makes turning off the deinit explicit in the code in an unmistakable way that is easy to audit and understand. In the situation I described above, the programmer would just need to search the program for all discard statements to find the points where the bad behavior could occur.

  2. By doing this, we are assuming that whether or not a deinit should run is /always/ tightly coupled to the validity of a stored property on the type. Consider a simple situation where one has a deinit that calls a logging API when a value is destroyed. The deinit itself does not use any of the stored properties; it just calls the logging API directly. I think it would be surprising to users that consuming a property of such a type would cause the deinit to not run.
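As a sketch of that second situation (the `Connection` name and counter are my own, assuming SE-0390 noncopyable types):

```swift
// Counts how many times the logging deinit ran.
var closedCount = 0

// A noncopyable type whose deinit exists purely for logging and
// never touches the stored property.
struct Connection: ~Copyable {
    var fd: Int32 = -1

    deinit {
        // Uses no stored state; it only records that the value died.
        closedCount += 1
        print("connection closed")
    }
}

func demo() {
    let c = Connection(fd: 3)
    _ = c  // when c goes out of scope, the deinit logs once
}
demo()
```

Under the proposed implicit semantics, moving fd out of a Connection would silently skip that log line, even though the deinit does not depend on fd at all.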