Hey Folks,
I’m in the middle of modernizing an older AVFoundation-based QR/Barcode scanner and pulling it into SwiftUI using Swift 6.2 concurrency, and I’ve hit a wall that I’m hoping someone can help me understand.
While researching how to move my AVFoundation pipeline into SwiftUI, I came across this great example that uses UIViewControllerRepresentable, Vision, and AVCaptureVideoDataOutput for real-time scanning:
(Reference: https://www.createwithswift.com/reading-qr-codes-and-barcodes-with-the-vision-framework/)
The example is extremely close to what I need, but it no longer compiles cleanly or runs without warnings under Swift 6.2, whose stricter concurrency checking (plus the runtime's Thread Performance Checker) flags AVFoundation's threading requirements. Specifically, I get this runtime warning:
Thread Performance Checker: -[AVCaptureSession startRunning] should be called from background thread.
Calling it on the main thread can lead to UI unresponsiveness.
This is the same issue I ran into while trying to update my company’s actual QR code scanner. The core problem seems to be:
How do you correctly integrate an AVCaptureSession into SwiftUI using Swift 6.2 concurrency without hitting data-race warnings or the startRunning() threading violation?
What I think I understand
After digging through related discussions on the Swift forums, like this one:
https://forums.swift.org/t/avcapturesession-and-concurrency/72681
…it seems clear that the “modern Swift” solution is to:
- Move all AVFoundation session setup (inputs/outputs/session configuration)
- And all start/stop operations
…into an actor, ideally with a custom executor, so that AVFoundation work runs on a dedicated serial queue but is still Swift-concurrency-safe.
The problem is:
I don’t know how to correctly implement the actor + custom executor pattern in a way that plays nicely with SwiftUI and UIViewControllerRepresentable.
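For reference, here's my rough sketch of what I *think* the actor side might look like, pieced together from that thread. The `CaptureService` name and method shapes are mine, and it assumes iOS 17+ / Swift 5.9+, where `DispatchSerialQueue` conforms to `SerialExecutor`. I'm not at all sure this is correct:

```swift
import AVFoundation

// Rough sketch, not production code. Assumes iOS 17+, where
// DispatchSerialQueue conforms to SerialExecutor (SE-0392).
actor CaptureService {
    // A dedicated serial queue that also serves as this actor's
    // executor, so every actor-isolated method runs on it.
    private let sessionQueue = DispatchSerialQueue(label: "capture.session.queue")

    nonisolated var unownedExecutor: UnownedSerialExecutor {
        sessionQueue.asUnownedSerialExecutor()
    }

    let session = AVCaptureSession()

    func configure() {
        session.beginConfiguration()
        defer { session.commitConfiguration() }

        guard let device = AVCaptureDevice.default(for: .video),
              let input = try? AVCaptureDeviceInput(device: device),
              session.canAddInput(input) else { return }
        session.addInput(input)
    }

    func start() {
        // Runs on sessionQueue, never the main thread, which should
        // satisfy the Thread Performance Checker.
        if !session.isRunning { session.startRunning() }
    }

    func stop() {
        if session.isRunning { session.stopRunning() }
    }
}
```

My understanding of the idea: because the actor's executor *is* the serial queue, every `await` into the actor hops onto that queue, so `startRunning()` can never execute on the main thread. But I don't know how to wire this into `UIViewControllerRepresentable` (especially the preview layer, which presumably still needs the main thread).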
What I’m looking for
If anyone can provide:
- A minimal example actor wrapping AVCaptureSession (with a custom executor)
- Guidance on how to call it from SwiftUI without tripping thread-checker warnings
- Or a pattern you use in production for AVFoundation + SwiftUI + Swift 6.2 concurrency (especially camera capture + Vision processing)
…I’d really appreciate it.
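To make the second point concrete, this is roughly the call site I'm *imagining* on the SwiftUI side. Everything here is hypothetical: `CaptureService` would be some actor owning the `AVCaptureSession`, and `ScannerView` is the representable from the linked example. I don't know whether this is the right shape:

```swift
import SwiftUI

// Hypothetical call site. CaptureService is a (yet-to-be-written)
// actor owning the AVCaptureSession; ScannerView is the
// UIViewControllerRepresentable from the linked article.
struct ScannerContainer: View {
    private let capture = CaptureService()

    var body: some View {
        ScannerView()
            .task {
                // The awaits hop onto the actor's executor, so
                // configuration and startRunning() happen off the
                // main thread.
                await capture.configure()
                await capture.start()
            }
            .onDisappear {
                Task { await capture.stop() }
            }
    }
}
```

In particular I'm unsure whether `.task` is the right lifecycle hook here, or whether start/stop belongs inside the representable itself.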
Sample Code
Here is the makeUIViewController I'm trying to modernize (shortened for readability; see the URL above for the full code):
func makeUIViewController(context: Context) -> UIViewController {
    let viewController = UIViewController()

    guard let videoCaptureDevice = AVCaptureDevice.default(for: .video),
          let videoInput = try? AVCaptureDeviceInput(device: videoCaptureDevice),
          captureSession.canAddInput(videoInput) else { return viewController }
    captureSession.addInput(videoInput)

    let videoOutput = AVCaptureVideoDataOutput()
    if captureSession.canAddOutput(videoOutput) {
        videoOutput.setSampleBufferDelegate(context.coordinator, queue: DispatchQueue(label: "videoQueue"))
        captureSession.addOutput(videoOutput)
    }

    let previewLayer = AVCaptureVideoPreviewLayer(session: captureSession)
    previewLayer.frame = viewController.view.bounds
    previewLayer.videoGravity = .resizeAspectFill
    viewController.view.layer.addSublayer(previewLayer)

    captureSession.startRunning() // <- triggers the thread checker warning in Swift 6.2
    return viewController
}
There's also a Coordinator that sends Vision results back to SwiftUI, with the AVCaptureVideoDataOutput delegate callbacks running on a background queue.
Everything works in older Swift versions, but Swift 6.2's stricter concurrency model makes this approach no longer viable.
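For what it's worth, the only stopgap I've found is to push startRunning() onto a background queue inside makeUIViewController. This silences the Thread Performance Checker, but as far as I can tell it does nothing for Swift 6's data-race checking, since AVCaptureSession isn't Sendable:

```swift
// Stopgap only: moves startRunning() off the main thread, which
// silences the Thread Performance Checker. But capturing the
// non-Sendable AVCaptureSession in the closure still draws
// diagnostics under strict concurrency checking.
let startQueue = DispatchQueue(label: "session.start.queue")
startQueue.async { [captureSession] in
    captureSession.startRunning()
}
```

So it papers over the runtime warning without giving the compiler any real isolation guarantees, which is why I'm after the actor-based pattern instead.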
Any guidance, examples, or explanations of how to properly architect this under Swift 6.2 would be hugely helpful. Thanks in advance!