Concurrent data access - fast main thread reads (stale OK) & background writes

I'm working on a Swift application where I need to manage a piece of data (e.g., a CGRect representing a rectangle's position) that can be modified on a background thread/task, while concurrently being read on the main thread for UI updates or hit-testing.

My primary goals are:

  1. Safe concurrent access: avoid data races.

  2. Fast reads on the main thread: reads should be thread-safe but as close to non-blocking as possible, and I can accept slightly stale data. You may also assume that, in my case, all writes are performed by the same actor.

  3. Eventual consistency: main thread reads should eventually reflect the latest updates from the background.

Currently, I'm using a traditional NSLock approach within a regular class, which works but has some drawbacks:

class RectangleManager {
    private var myRectangle: CGRect = .zero
    private let lock = NSLock()

    var safeRectangle: CGRect {
        get {
            lock.lock()
            defer { lock.unlock() }
            return myRectangle
        }
        set {
            lock.lock()
            defer { lock.unlock() }
            myRectangle = newValue
        }
    }

    // Example usage:
    // Background task updates safeRectangle
    // Main thread reads safeRectangle for UI or touch events
}

// Example of how I might use it:
let manager = RectangleManager()

// Simulate background update
DispatchQueue.global().async {
    manager.safeRectangle = CGRect(x: 10, y: 10, width: 100, height: 100)
}

// Simulate main thread read
DispatchQueue.main.async {
    let rect = manager.safeRectangle // This locks, which I want to avoid if possible for reads
    print("Main thread read: \(rect)")
}

Concerns with the NSLock approach:

  • Verbosity: requires get/set boilerplate with locking, and I may have many similar properties.

  • Main thread locking: the get operation on the main thread acquires a lock. If the lock is contended by a write, the main thread can block, which is undesirable for UI responsiveness.

My Question / Desired Pattern:

I'm exploring modern Swift Concurrency approaches to replace the lock-based system. What is an idiomatic pattern, potentially using Actors, to achieve safe background writes while allowing the main thread to perform very fast, non-blocking reads of this data, even if the data read is slightly stale?

Any examples or pointers to best practices would be greatly appreciated!

Thanks!

Have you considered using an AsyncStream that connects a producer to a consumer (the main thread)?

Example
func runTest() async throws {
    print()
    let sp = Test.Stream()

    Task { @TestActor1 in
        try await Test.produce(sp: sp)
    }

    Task { @TestActor2 in
        try await Test.consume(sp: sp)
    }
    hibernate(seconds: 5)
}

import Dispatch

func hibernate(seconds s: Double) {
    let sem = DispatchSemaphore(value: 0)
    _ = sem.wait(timeout: .now() + .milliseconds(Int(1000 * s)))
}

@globalActor
actor TestActor1 {
    static let shared = TestActor1()
}

@globalActor
actor TestActor2 {
    static let shared = TestActor2()
}

struct Test {
    struct Size {
        let width: Int
        let height: Int

        static var zero: Size {
            Self(width: 0, height: 0)
        }
    }

    struct Rectangle {
        let x: Int
        let y: Int

        let size: Size
    }

    final class Stream: Sendable {
        let (stream, continuation) = AsyncStream<Rectangle>.makeStream()
    }
}

extension Test {
    static func produce(sp: Stream) async throws {
        let continuation = sp.continuation

        continuation.onTermination = { termination in
            print(#function, "onTermination", termination)
        }

        for _ in 0..<16 {
            // produce slowly
            continuation.yield(rectangle())
            hibernate(seconds: 0.001)
        }
        continuation.finish()
    }

    static func rectangle() -> Rectangle {
        Rectangle(x: Int.random(in: 0..<16), y: Int.random(in: 0..<16), size: .zero)
    }
}

extension Test {
    static func consume(sp: Stream) async throws {
        let stream = sp.stream

        for await rect in stream {
            print(#function, rect)
        }
    }
}

You may also find this useful: Proposal SE-0433 - Synchronous Mutual Exclusion Lock, which adds a Mutex type to the standard library's Synchronization module.
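For completeness, here is a sketch of what the original lock-based property could look like with SE-0433's Mutex (Synchronization module, Swift 6 toolchains). It trims the lock/defer boilerplate, though note that reads still briefly take the lock, so it only addresses the verbosity concern, not the main-thread-blocking one:

```swift
import CoreGraphics
import Synchronization

// Sketch using SE-0433's Mutex. Semantically equivalent to the
// NSLock version, with less boilerplate per property.
final class RectangleManager: Sendable {
    private let rectangle = Mutex(CGRect.zero)

    var safeRectangle: CGRect {
        get { rectangle.withLock { $0 } }
        set { rectangle.withLock { $0 = newValue } }
    }
}
```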


I like @ibex10's approach, as it also allows you to specify a buffering policy in case your background operation might overwhelm the consumer, but as I understand your post that doesn't seem to be an issue, right?
If so, why not simply isolate your myRectangle property to the main actor? I'd probably do this:

class RectangleManager {

    @MainActor
    private(set) var myRectangle = CGRect.zero

    func updateFromAnyContext(_ rectangle: CGRect) async {
        @MainActor func updateHelper(_ rect: CGRect) {
            myRectangle = rect
        }
        await updateHelper(rectangle)
    }
}

If you want your updating method to be synchronous you can simply start a Task { @MainActor in ... } in it to set myRectangle. That then doesn't need the helper function to make the hop.
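That synchronous variant could look roughly like this (a sketch; the method name update(_:) is mine). The setter returns immediately, and the write lands on the main actor slightly later, which matches the "eventually consistent" requirement:

```swift
import CoreGraphics

// Sketch of the synchronous-setter variant: the caller never awaits,
// the actual write hops to the main actor via an unstructured Task.
final class RectangleManager {
    @MainActor
    private(set) var myRectangle = CGRect.zero

    func update(_ rectangle: CGRect) {
        Task { @MainActor in
            self.myRectangle = rectangle
        }
    }
}
```

The trade-off is that unstructured Tasks give no ordering or completion guarantees to the caller, which is usually fine for stale-tolerant UI state like this.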

Note there is no way to make the setter of a computed property asynchronous, so you need a function that does the job instead, but I think that's basically by design.
At least it's the current Swift idiom, I'd say: async functions always make suspension points explicit, and in this context they are what allows the runtime to smoothly and cleanly hop from one isolation to another, including passing along data such as the rectangle.


Thank you, @gero!

I am still learning this stuff; your example is enormously useful.


Beware the pitfall here:

v.safeRectangle.size.width += 2

is not atomic, even though it looks like a single operation: it expands into a locked read, a mutation of the copy, and a separate locked write, and a background write can land in between.

You'd need something like:

v.lock.withLock {
    v.myRectangle.size.width += 2
}

Ditto when accessing several properties like these:

let a = v.safeRectangle1
let b = v.safeRectangle2

If atomicity/consistency of the two variables is important, you'd need to hold the lock around both reads:

let (a, b) = v.lock.withLock {
    (v.myRectangle1, v.myRectangle2)
}

But then BG tasks will have to wait for the main task?


Well, don't they technically do exactly that with the lock, too? If you set the rectangle from the BG task while the main actor/thread is reading it, the lock is already lock()ed, so the underlying thread blocks until the main thread calls unlock() again.
I realize Swift concurrency probably introduces a little overhead, but your example points to exactly the problems you run into when rolling your own synchronization. Also, the async setter method does not necessarily block the underlying thread of the BG task; it just suspends, leaving the thread free to perform other work if needed.

I think with regular isolated asynchronous methods you can favor one context (in this case the main actor) over BG tasks. In many cases that's what you want.

If your BG writes outpace the read operations on the main actor you run into other problems, but as said, I think @ibex10's AsyncStream solution can nicely resolve that by using a fitting buffering strategy (most likely bufferingNewest; it's unavoidable to throw some results away then anyway). However, @SWNS wrote that there are more reads than writes, so I'd go the "classic" way here. :smiley:
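The buffering strategy is set when the stream is created; a minimal sketch of the bufferingNewest case, where the producer can outrun the consumer and older unread values are simply dropped:

```swift
import CoreGraphics

// An AsyncStream that keeps only the newest value when the producer
// outruns the consumer (older unread values are discarded).
let (stream, continuation) = AsyncStream.makeStream(
    of: CGRect.self,
    bufferingPolicy: .bufferingNewest(1)
)

// Producer side: rapid yields overwrite the single buffered value.
continuation.yield(CGRect(x: 0, y: 0, width: 50, height: 50))
continuation.yield(CGRect(x: 10, y: 10, width: 100, height: 100))
continuation.finish()
```

A consumer iterating with for await rect in stream would then only ever see the most recent rectangle, never a backlog.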
