The problem is that incoming requests need to be rate-limited. I'd like to delay execution of the body of this method when there are too many outstanding requests, and return a future of some kind to the caller, so it can maintain its async/await flow.
Is there a way to do this using structured concurrency?
There is a processing loop involved that can dequeue the next waiting request when a download completes. I tried a continuation, but the method body needs to be async, and it seems you can't use continuations with async code.
The body that you want to pass to withCheckedContinuation is an async function? I wonder if you can break it down further to get around that. I think we're at the point where seeing some of your actual code would be helpful.
You can certainly just have an async function that returns a future. But what do you expect callers to do with this future? Would it be better to offer the caller an opportunity to provide a callback to handle exceptional but non-fatal conditions, and then the API itself only returns/throws when the whole operation is complete?
I ended up going a different route for my requirement, but I am interested in learning more about Swift Concurrency. I think I understand how to return a future: that's just returning a Task, which can be awaited at the caller's convenience, correct?
What I am missing is a way to create a Task, or task-like thing, that doesn't execute right away. That would have solved my problem as I originally conceived it. Does such a mechanism exist?
You could create an async closure and pass that around, then later run it inside a Task. I suppose you can extend Combine’s Future type to accept such a closure that runs inside a Task. For example:
import Combine

extension Future where Failure == Error {
    convenience init(_ closure: @escaping () async throws -> Output) {
        self.init { promise in
            Task {
                do {
                    let value = try await closure()
                    promise(.success(value))
                } catch {
                    promise(.failure(error))
                }
            }
        }
    }
}
Now you can wrap your work in a closure, create a Future using it and pass that. Then later its value can be awaited.
let future = Future(asyncClosure)
// Pass the future to another function or whatever
let value = try await future.value
I’m not sure I understand your goal very well, but if you just want to process requests at some specific rate, then you could create a downloader async sequence and zip it with a timer (both zip and a timer sequence are available in the swift-async-algorithms package). I haven’t really thought this through; I’m just throwing it out there as a possibility. From my understanding, if no download requests are available, the downloader would suspend while producing its next element and nothing would happen. The timer, meanwhile, would produce values at a fixed rate, capping how many requests can be zipped, and thus processed, in a given window. The underlying concurrency mechanism is Task.select (essentially a task group that returns its first result to complete).
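Without pulling in the package, the same gating idea can be hand-rolled with Task.sleep. This is only a sketch of the concept, and processAtFixedRate and intervalNanos are names I'm making up here:

```swift
// Consume requests at a fixed maximum rate, mimicking zip(requests, timer):
// handle one element, then wait one timer tick before taking the next.
func processAtFixedRate<S: AsyncSequence>(
    _ requests: S,
    intervalNanos: UInt64,
    handle: (S.Element) async -> Void
) async throws {
    for try await request in requests {
        await handle(request)
        try await Task.sleep(nanoseconds: intervalNanos)
    }
}
```

If the sequence is empty or slow, the loop simply suspends awaiting the next element, so nothing runs in the meantime.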
The requirement is to not process the incoming request at all while X (currently defined as 5 in the app) are currently being processed. Without concurrency, I'd use an OperationQueue here.
You can accomplish this with a TaskGroup by manually enqueuing child tasks. I have an example here that talks about working around a performance issue, but it would also do what you want.
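A sketch of that pattern (the names processAll, maxConcurrent, and process are placeholders, not from the linked example): seed the group with the first few tasks, then add one more each time a child finishes, so at most maxConcurrent are ever in flight.

```swift
// Process `items` with at most `maxConcurrent` child tasks running at once.
func processAll<T: Sendable, R: Sendable>(
    _ items: [T],
    maxConcurrent: Int,
    process: @escaping @Sendable (T) async -> R
) async -> [R] {
    await withTaskGroup(of: R.self) { group in
        var results: [R] = []
        var iterator = items.makeIterator()

        // Seed the group with the first `maxConcurrent` tasks.
        for _ in 0..<maxConcurrent {
            guard let item = iterator.next() else { break }
            group.addTask { await process(item) }
        }

        // Each time a child finishes, enqueue the next waiting item.
        while let result = await group.next() {
            results.append(result)
            if let item = iterator.next() {
                group.addTask { await process(item) }
            }
        }
        return results
    }
}
```

Note that results arrive in completion order, not submission order; collect (index, result) pairs instead if you need to restore the original ordering.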
I think a simple AsyncQueue actor that manages an array of Operations (@Sendable () async -> Void) could be written that accomplishes what you need. I think this is similar to what OperationQueue offers, but utilizes Swift concurrency:
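A minimal sketch of what such an actor could look like (all names are mine, and it lacks OperationQueue features like priorities and cancellation): it keeps a pending array and starts an operation only while fewer than maxConcurrent are running.

```swift
actor AsyncQueue {
    typealias Operation = @Sendable () async -> Void

    private let maxConcurrent: Int
    private var running = 0
    private var pending: [Operation] = []

    init(maxConcurrent: Int) {
        self.maxConcurrent = maxConcurrent
    }

    /// Adds an operation. It starts immediately if a slot is free;
    /// otherwise it waits in `pending` until one opens up.
    func enqueue(_ operation: @escaping Operation) {
        pending.append(operation)
        startNextIfPossible()
    }

    private func startNextIfPossible() {
        guard running < maxConcurrent, !pending.isEmpty else { return }
        let operation = pending.removeFirst()
        running += 1
        // The Task inherits the actor's isolation, so finishOne()
        // can be called directly once the operation completes.
        Task {
            await operation()
            self.finishOne()
        }
    }

    private func finishOne() {
        running -= 1
        startNextIfPossible()
    }
}
```

Because the actor serializes access to running and pending, enqueue and completion callbacks never race, which is the main thing OperationQueue's maxConcurrentOperationCount gives you.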