Why don't actor-isolated properties support 'await' setters?

my personal feeling is that there was a lot of churn in “best practices” due to the order in which Swift Concurrency features landed. this is probably a hyperbolic cartoon, but my experience over the past three years has often felt like driving a brand new Tesla that is missing its two rear wheels, leading us to spend a lot of time and effort researching how to do everything ‘the new way’ (new driving techniques? paving new roads for two-wheeled cars? attaching helicopter blades to the back?), and then adapting to the adaptations, ad infinitum.

perhaps the fault lies with us for being too enthusiastic about driving Teslas (because they are new and shiny and you really really want to believe in the dream), because if we had waited until the cars had all four wheels we might have had a dramatically different experience. but everything is obvious in hindsight.

i think every team has a different set of “rear wheels”, but for me at least, i can name a few concrete examples:

  • Sendable conformance for AsyncStream
  • region-based isolation
  • atomics in the standard library
  • locks in the standard library
  • first-class backpressure support
  • better ordering guarantees around “submitting” work on an actor

code bases that were written against Swift compilers lacking these features look radically different from how they would look if Swift Concurrency had supported these things from the start. for example, without RBI, everything needs to be Sendable, and that design constraint propagates through entire projects, forcing a lot of distant code to adapt to the different environment. i suspect that when the Synchronization module lands, a lot of types that evolved into actors during the 5.5–5.10 era will have to “climb down” from that tree as well.
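to make that “climb down” concrete, here is a minimal sketch (assuming the SE-0433 Mutex API and a platform that ships the Synchronization module; the type names are made up for illustration): a type that became an actor solely to protect one piece of mutable state can go back to being a synchronous, Sendable class once a standard-library lock is available.

```swift
import Synchronization

// 5.5-era shape: an actor used purely to serialize access to one piece of state,
// so every access from outside requires an await.
actor CounterActor {
    private var value = 0

    func increment() -> Int {
        value += 1
        return value
    }
}

// possible post-Synchronization shape (hypothetical rewrite): the same state
// guarded by a Mutex, so the type is synchronous, Sendable, and needs no actor hops.
final class Counter: Sendable {
    private let value = Mutex(0)

    func increment() -> Int {
        value.withLock { current -> Int in
            current += 1
            return current
        }
    }
}
```

the second shape drops the `await` at every call site, which is exactly the kind of ripple through distant code i mean.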

i doubt there was ever a realistic alternative to incrementalism; in a world where everything shipped at once, it might have been 2050 before Swift Concurrency arrived. however, in hindsight i think that leapfrogging “unstructured” technologies like locks and atomics, until it became too obvious they were still needed, was a mistake. Sendable and the compile-time checking around it should have waited until most of the other pieces had landed.

i understand that noncopyable types were a big part of why the features landed in the order that they did, but ultimately coupling one very complex set of features - concurrency - to another complex set of features still very much in flux - ownership - made Swift Concurrency harder to adopt.


I apologize for how far this strays from the original question of the thread, but I hope you'll indulge me.

With regard to concurrency, there are very few industry experts worth listening to in the first place, and their content is typically talks about languages other than Swift. Concurrency is a fairly "mathematical" subject in the sense of requiring almost thesis-like proof of the logic, and no Swift blogger I've seen so far has done that; they mostly recite and compress the recent proposals and WWDC videos, because that's how they get ad revenue, without providing any "mathematical" guidance on how to work with concurrency and how to implement the common patterns.

So while those write-ups exist and are surely read by many people, I think the first order of business is for the community to develop a habit of not consulting recitals and instead treating the proposal documents as the only source of truth, until better sources become available.
