`nonisolated lazy let` on an actor

I've been toying with an idea to reverse the polarity of nesting an actor inside a @MainActor-protected class, with the ability to hide the class object.

There are some additional goals here:

  • the actor could be initialized from anywhere, so its init is nonisolated by default
  • there should be no reference cycle between the parent and child objects
  • the init should not fall back to becoming async
  • we should be able to expose access to the nested @MainActor-protected object through non-async members on the parent actor, when those members are also marked @MainActor

Let's walk through the experiment together:


The first solution that I came up with involved using lazy.

actor A {
  actor _B {
    nonisolated unowned let a: A
    init(a: A) {
      self.a = a
    }
  }

  lazy var _b = _B(a: self)
}

If, on the other hand, we tried to construct the nested object through A's init, we would have to delay its initialization, at which point actor isolation would kick in. That won't work and would require an async init.
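For reference, here is a minimal sketch of the init-based variant that fails (the exact diagnostic varies by compiler version, but once `self` escapes into `_B(a: self)`, the subsequent stored-property assignment is no longer allowed from a synchronous init):

```swift
actor A {
  actor _B {
    nonisolated unowned let a: A
    init(a: A) { self.a = a }
  }

  var _b: _B!

  init() {
    // Passing `self` to _B escapes it; after that point, mutating
    // a stored property requires actor isolation, which a
    // synchronous init cannot provide.
    self._b = _B(a: self) // rejected unless the init becomes async
  }
}
```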

Let's convert B into a class and wrap it with @MainActor.

actor A {
  @MainActor
  class _B {
    nonisolated unowned let a: A
    nonisolated init(a: A) { 
      self.a = a
    }
  }

  lazy var _b = _B(a: self)
}

Additionally, _B's init had to become nonisolated in order to avoid the hop to the main thread during initialization, despite the class receiving MainActor protection.

So far so good. Now we want to extend _B with some properties that are also protected by the MainActor. After that is done, A should expose access to those properties via MainActor-protected computed properties.

actor A {
  @MainActor
  class _B {
    nonisolated unowned let a: A

    var string: String {
      "swift"
    }

    nonisolated init(a: A) {
      self.a = a
    }
  }

  lazy var _b = _B(a: self)

  @MainActor
  var string: String {
    _b.string // error: Actor-isolated property '_b' can not be referenced from the main actor
  }
}

This results in the first of our issues: _b cannot be accessed safely, as it's protected by A. The fix seems almost trivial: we could make it nonisolated. However, that is illegal for mutable stored properties on an actor, and we run into a new error:

actor X {
  actor Y {}
  nonisolated var y = Y() // error: 'nonisolated' can not be applied to stored properties
}

Interestingly enough, the compiler permits two other options and compiles the program without an error.

Option A)

actor A {
  ...

  @MainActor
  class _B {
    ...

    var string: String {
      "swift"
    }

    var number: Int = 42
  }

  // surprisingly no error! 👀
  nonisolated lazy var _b = _B(a: self)

  @MainActor
  var string: String {
    _b.string
  }

  @MainActor
  var number: Int {
    get {
      _b.number
    }
    set {
      _b.number = newValue
    }
  }
}

Option B)

actor A {
  ...

  // My assumption: this is theoretically illegal, or will become illegal in Swift 6 
  @MainActor
  lazy var _b = _B(a: self)
}

Option B is interesting, but as I already noted in the comment, it's likely to become illegal, unless I've misunderstood the direction we're heading in.

Option A smells like an actual bug. I could file a bug report if needed.

So, all in all, there are currently two possible solutions to the problem, both of which will likely become illegal in the future. However, I would like to keep the ability to construct a cross-referencing base actor with a hidden @MainActor-protected class object, without being forced into an async init.

The only solution that comes to my mind would be: nonisolated lazy let.

  1. It will make _b an immutable reference.
  2. _b is implicitly Sendable as it's protected by MainActor.
  3. _b can remain hidden from the outside world.
  4. _b can capture a reference to self during its delayed nonisolated initialization.
  5. Since _b would be nonisolated, we can expose MainActor-protected members on A without an issue.

By reversing the polarities, we can do interesting things with A. We could conform it to ObservableObject and pipe that through the hidden _B class. The type becomes much more flexible: it can be constructed from anywhere, it can be safely used in UI, and it isn't restricted to always being awaited unless we access members that it actually protects. That way we can easily feed the type from the background, which may not even need to update the UI object.

The "usual" approach, however, requires that we construct a UI object first, which would then contain our actor object, and we would need to expose it if we wanted to pipe something through it. That isn't always ideal. One could decouple those two objects, but that would force the delayed initialization of the UI object to be async, as it would need to read and sync state with the actor object.
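As a rough sketch of that decoupled variant (names are illustrative), the UI object ends up needing an async init just to sync its initial state from the actor:

```swift
actor Model {
  var number = 42
}

@MainActor
final class ViewModel {
  let model: Model
  var number: Int

  // Reading the actor's state forces this initializer to be async.
  init(model: Model) async {
    self.model = model
    self.number = await model.number
  }
}
```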

The other way around is a bit more straightforward:

@MainActor
class AA {
  actor BB {
    nonisolated unowned let aa: AA

    @MainActor
    var string: String {
      aa.string
    }

    @MainActor
    var number: Int {
      get {
        aa.number
      }
      set {
        aa.number = newValue
      }
    }

    init(aa: AA) {
      self.aa = aa
    }
  }

  // we cannot make this non-isolated
  // it's semi-okay this way around though
  var bb: BB! = nil

  var string: String {
    "swift"
  }

  var number: Int = 42 {
    didSet {
      print(number)
    }
  }

  init() {
    // delayed init
    self.bb = BB(aa: self)
  }
}

Okay, I need to provide a small update. I think I was confused about the current set of rules. Global actor protection on stored properties isn't going anywhere, it seems; it's only the default value on the RHS that will be treated as nonisolated. That's great for my case above, as the class's init is already nonisolated anyway.

That makes Option B from above actually the legal and correct solution here.

However, the new question is: when does lazy assign the newly initialized object to the actor?
The A actor cannot access _b without an await and a hop to the MainActor, nor can anyone else. _B is guaranteed to be initialized on the MainActor, and therefore its init isn't really required to be nonisolated.

At a later point the new object is returned, but it also must be assigned to the actor object.

  • What exactly happens here?
  • Is that even safe?

Having a lazy let would still be very beneficial in order to guarantee immutability of the reference post initialization.


cc @kavon or @Douglas_Gregor. Could any of you possibly provide feedback on this topic? If not on lazy let specifically, then at least on a lazy stored property that is protected by a global actor but hosted on an actor object.

It seems there's not a single unit test in the Swift repo that covers a similar case, so I'm a bit confused about what exactly is happening here. Could that be racy? Is there chained actor isolation at some point? If I access myActorObject.computedMainActorProperty from the main thread, where computedMainActorProperty would lazily kick off the initialization of the backing object, how is that object then assigned back to myActorObject?

I'm struggling to follow your exposition here.

There's no such thing as a lazy let because the storage has to be mutable. The first read of the value triggers a mutation to fill in the value. If an assignment to the lazy var happens before a read, then its default expression is never run.
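The two behaviors described above can be seen in a small, plain (non-actor) example:

```swift
class Box {
  lazy var value: Int = {
    print("default expression ran")
    return 1
  }()
}

let a = Box()
_ = a.value   // first read triggers the mutation that fills in the value

let b = Box()
b.value = 5   // assignment before any read...
_ = b.value   // ...so the default expression never runs; value is 5
```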

A lazy var shouldn't need to be affected by the changes proposed in SE-327, since a lazy var's default-value expression doesn't get implicitly run in the init. It can keep its isolation.


I understand that there must be some form of mutation. I personally view lazy var as a delayed mutable, for which there's no equivalent delayed immutable such as lazy let could be. Since the value would always have a known default value, all the syntactic sugar lazy let would need to do is lock the mutability after the first mutation and not permit any further mutation.
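One way to picture this hypothetical lazy let is as sugar over a write-once cell, roughly like the following illustrative helper (not real language behavior, and not thread-safe as written):

```swift
final class WriteOnce<Value> {
  private var storage: Value?
  private let makeDefault: () -> Value

  init(default makeDefault: @escaping () -> Value) {
    self.makeDefault = makeDefault
  }

  // The first read performs the one and only mutation;
  // afterwards the value is effectively immutable.
  var value: Value {
    if let existing = storage { return existing }
    let fresh = makeDefault()
    storage = fresh
    return fresh
  }
}
```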

Now that's something I didn't even know was possible. :exploding_head: However, this shouldn't affect a hypothetical lazy let, as it would not permit that option and would only delay the initial use of the default value for one single mutation.


Thank you for confirming this. I eventually came to the same conclusion after thinking about it for quite a while.


I'm still confused about what the effect of a @SomeGlobalActor lazy var as a stored property of a different actor is.

I can mentally follow that the initial access and initialization will happen on SomeGlobalActor.

  • However, how is it safely assigned back to the actor object it is supposed to live on, without entering that actor's isolation and requiring an await? It's technically mutating a stored property of the parent actor object.
  • Could the actor not deallocate under our nose before the lazy property gets assigned to it? Because we're possibly on two different contexts here.
  • Or does lazy have some magical strong reference to the object for the purpose of the value assignment?

This configuration bends my mind.

The stored property is isolated to the global actor, not the actor instance it’s stored on. All accesses to it (outside of deinit, possibly) will require that global actor isolation.

Isolation is not what stops actor objects from being deallocated; that’s just plain old reference-counting. Whoever is accessing the lazy property must have a strong reference to the actor.
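Both points can be sketched together (assuming, as discussed above, that a @MainActor lazy stored property on an actor is accepted by the compiler):

```swift
actor Host {
  // Isolated to the MainActor, not to Host,
  // even though it's stored inside Host's allocation.
  @MainActor lazy var label: String = "hello"
}

func demo(_ host: Host) async {
  // Accessing `label` hops to the MainActor; the strong `host`
  // parameter keeps the actor alive for the duration of the access.
  let text = await host.label
  print(text)
}
```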


Thank you, that patches this part of the puzzle for me.

Okay, I understand the isolation part, but the actor object also has to eventually get the reference to the class assigned to it. And since lazy var performs a mutation on the actor object, shouldn't that be done on behalf of the actor's isolation instead of the global actor's?

  1. The global actor is used to access the reference.
  2. Since it's lazy, it's also initialized on the global actor itself.
  3. Now, when the global actor has initialized the reference to the class object, how is that assigned back to the actor object, under which isolation, and why?

(3) is basically what I'm failing to understand here.

No. The basic rule — applicable to all kinds of actors and all kinds of storage — is that actor-isolated storage can only be accessed from that actor. Stored properties on an actor instance are isolated to the actor by default, but that's just a default and is not somehow required for overall correctness. If you make a stored property on an actor instance and isolate it to a global actor, you've effectively made a little outpost of the global actor within the allocation of the actor instance. That actor has no special privileges to that storage and cannot touch it, so there's no reason to synchronize any changes to it with the actor.
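To make the "outpost" point concrete, here is a sketch of what the compiler does and doesn't allow under this rule (again assuming @MainActor stored properties on an actor, as discussed above; the error comment paraphrases the diagnostic):

```swift
actor Host {
  @MainActor var note: String = "hello"

  func touch() {
    // note = "changed" // error: main-actor-isolated property
    //                  // cannot be mutated from actor 'Host'
  }

  @MainActor func update() {
    note = "changed" // fine: we're already on the MainActor,
                     // and Host has no special privileges here
  }
}
```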


Just to clarify that I understood you correctly. In other words, the class object lives on the global actor, but its lifetime is bound to the lifetime of the actor object (ignoring the fact that it could be captured elsewhere), or rather, its lifetime is extended by the lifetime of the actor object? Is that more or less correct, at least as a mental model?

Ah, probably the easiest way to think about it is that this is kind of like a stored property on the MainActor via an extension that only this particular actor object knows about, and the lifetime of that class object is extended by the lifetime of the actor object (unless captured elsewhere post-creation). :thinking: I know it's not exactly an extension, but it somewhat resembles a similar topic we had in the past that could allow something like this.

// non valid pseudo swift code
extension MainActor from /* our actor object */ {
  lazy var property = defaultValue 
}

The lifetime of any particular reference to a class object is determined by where the reference is stored, which in your example happens to be in a stored property of another object, which happens to be an actor object. This is all true independent of isolation and shouldn't be particularly mind-bending.

I personally do not think of an actor as a specific object, although I can certainly see how someone might reasonably develop that intuition. An actor is (1) some ability to run code in isolation coupled with (2) a collection of storage which is isolated to the actor. In theory, that storage can be stored anywhere, and that theory is reality for global actors. Actor instances currently can only directly isolate the properties of a single object instance, but they don't have to isolate all of the properties of that instance, and we may someday add the ability for different actor instances to share an isolation domain.


Thank you, that cleared up a lot of confusion for me. So basically, the storage that global actors protect can also live within another actor, hence anywhere. Or at least somewhere nearby that the other actor has a reference to.


Any possible feedback on lazy let as, effectively, a mutating-get-only stored property?

Custom actor executors will add an interesting twist to this. It's an immediate consequence of the feature that actors will be able to share a serial executor, which as far as the dynamic runtime is concerned means they share isolation. However, as long as the compiler doesn't understand that relationship statically, it will continue to enforce isolation between them, and you'll even have to await when calling between them (but it won't ever dynamically suspend). So whether or not two actors share a serial executor will not affect source and should just have performance implications. If we then add a way to make actor instances statically share isolation, that would allow awaits to be dropped, Sendable checks to be avoided, etc., basically making them the same isolation domain in all phases. But that's not an immediate consequence of adding custom actor executors.


I think there are a couple of restrictions we currently have on lets which cause unnecessary awkwardness with concurrency. The main one is weak let, which there's no reason not to support — I've said this elsewhere, but it's a misunderstanding to think of weak references as being mutable; rather, the reference itself needs to be continually tested. lazy let could also be allowed, but I'm not sure it changes anything for isolation unless we actually make it use a locking sequence like globals do, which seems a little subtle.
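For context, this is the weak let restriction being referred to, as of the Swift versions current at the time of this thread (the error comment paraphrases the diagnostic):

```swift
final class Delegate {}

final class Node {
  // weak let delegate: Delegate?
  // ^ rejected: 'weak' requires a mutable variable, since the
  //   reference can become nil at runtime
  weak var delegate: Delegate? // the required spelling today,
                               // even if it is never reassigned
}
```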


A lazy let that locks like a static let would be great support for the case I've described here: Why `nonisolated lazy var` is allowed on actor?