Main actor-isolated property 'handPoseRequest' can not be referenced from a nonisolated context

I'm new to Swift, and I'm building a live camera feed with hand tracking using Vision.

This is my code:

import SwiftUI
import AVFoundation
import Vision

@MainActor
class Camera: NSObject, ObservableObject {
    // MARK: - Properties
    private var captureSession: AVCaptureSession?
    private let videoOutput = AVCaptureVideoDataOutput()
    
    @Published var previewLayer: AVCaptureVideoPreviewLayer?
    private let sessionQueue = DispatchQueue(label: "CameraSessionQueue")

    private let handPoseRequest: VNDetectHumanHandPoseRequest = {
        let request = VNDetectHumanHandPoseRequest()
        request.maximumHandCount = 1
        return request
    }()
    
    override init() {
        super.init()
        Task {
            await configureCaptureSession()
        }
    }
    
    // MARK: - Capture Session Configuration
    private func configureCaptureSession() async {
        // Local capture session to avoid cross-thread isolation issues
        let localCaptureSession = AVCaptureSession()
        localCaptureSession.sessionPreset = .high
        
        // Configure video input
        guard let videoDevice = AVCaptureDevice.default(.builtInWideAngleCamera, for: .video, position: .back) else {
            print("Error: Unable to access the camera")
            return
        }
        
        do {
            let videoInput = try AVCaptureDeviceInput(device: videoDevice)
            guard localCaptureSession.canAddInput(videoInput) else {
                print("Error: Unable to add video input")
                return
            }
            localCaptureSession.addInput(videoInput)
        } catch {
            print("Error: \(error.localizedDescription)")
            return
        }
        
        // Configure video output
        videoOutput.setSampleBufferDelegate(self, queue: DispatchQueue(label: "VideoOutputQueue"))
        guard localCaptureSession.canAddOutput(videoOutput) else {
            print("Error: Unable to add video output")
            return
        }
        localCaptureSession.addOutput(videoOutput)
        
        // Assign the configured session to the main actor property
        await MainActor.run {
            self.captureSession = localCaptureSession
            self.previewLayer = AVCaptureVideoPreviewLayer(session: localCaptureSession)
            self.previewLayer?.videoGravity = .resizeAspectFill
        }
    }
    
    // MARK: - Public Methods
    func startSession() {
        sessionQueue.async { [weak self] in
            Task {
                await self?.startCaptureSession()
            }
        }
    }
    
    func stopSession() {
        sessionQueue.async { [weak self] in
            Task {
                await self?.stopCaptureSession()
            }
        }
    }
    
    private func startCaptureSession() async {
        await MainActor.run {
            captureSession?.startRunning()
        }
    }
    
    private func stopCaptureSession() async {
        await MainActor.run {
            captureSession?.stopRunning()
        }
    }
    
    var pointsProcessorHandler: (([CGPoint]) -> Void)?
    
    func processPoints(_ fingerTips: [CGPoint]) {
        guard let previewLayer = previewLayer else { return }
        
        let convertedPoints = fingerTips.map {
            previewLayer.layerPointConverted(fromCaptureDevicePoint: $0)
        }
        pointsProcessorHandler?(convertedPoints)
    }
}

// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate
extension Camera: AVCaptureVideoDataOutputSampleBufferDelegate {
    nonisolated func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }
        
        let requestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer, options: [:])
        
        do {
            try requestHandler.perform([handPoseRequest])
            
            if let results = handPoseRequest.results?.first {
                let handLandmarks = try results.recognizedPoints(.all)
                
                let fingerTipKeys: [VNHumanHandPoseObservation.JointName] = [
                    .thumbTip, .indexTip, .middleTip, .ringTip, .littleTip
                ]
                
                // Map the detected points to an array of CGPoints
                let fingerTips = fingerTipKeys.compactMap { handLandmarks[$0]?.location }
                
                Task { @MainActor in
                    self.processPoints(fingerTips)
                }
            }
        } catch {
            print("Error performing hand pose request: \(error.localizedDescription)")
        }
    }
}

and the error is:

Main actor-isolated property 'handPoseRequest' can not be referenced from a nonisolated context

Is there a way to fix this?

Thank you.

The only actor-isolated property that you are accessing in this nonisolated function is handPoseRequest, the VNDetectHumanHandPoseRequest. You can retire that property, move its creation into your nonisolated function, and you're done:

// MARK: - AVCaptureVideoDataOutputSampleBufferDelegate

extension Camera: AVCaptureVideoDataOutputSampleBufferDelegate {
    nonisolated func captureOutput(_ output: AVCaptureOutput, didOutput sampleBuffer: CMSampleBuffer, from connection: AVCaptureConnection) {
        guard let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        let requestHandler = VNImageRequestHandler(cvPixelBuffer: pixelBuffer)
        let handPoseRequest = VNDetectHumanHandPoseRequest()
        handPoseRequest.maximumHandCount = 1

        do {
            try requestHandler.perform([handPoseRequest])

            if let results = handPoseRequest.results?.first {
                let handLandmarks = try results.recognizedPoints(.all)

                let fingerTipKeys: [VNHumanHandPoseObservation.JointName] = [
                    .thumbTip, .indexTip, .middleTip, .ringTip, .littleTip
                ]

                // Map the detected points to an array of CGPoints
                let fingerTips = fingerTipKeys.compactMap { handLandmarks[$0]?.location }

                Task { @MainActor in
                    self.processPoints(fingerTips)
                }
            }
        } catch {
            print("Error performing hand pose request: \(error.localizedDescription)")
        }
    }
}

The creation of this request object is exceptionally quick, so using a property for it is of little utility.

I'm not too familiar with this API, but I'd have a closer look at your use of sessionQueue here. Maybe some code is missing, but it looks like it is not offering any additional functionality.

But, about your original question, it may be worth looking into a preconcurrency conformance. That can often (but not always!) be a better option than a non-isolated function.

https://www.swift.org/migration/documentation/swift-6-concurrency-migration-guide/commonproblems#Preconcurrency-Conformance
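For reference, a minimal sketch of what that could look like here with a recent toolchain (note that a @preconcurrency conformance trades the compile-time error for a runtime isolation check, so the delegate callbacks would actually have to arrive on the main actor, e.g. by passing the main queue to setSampleBufferDelegate):

extension Camera: @preconcurrency AVCaptureVideoDataOutputSampleBufferDelegate {
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        // With the preconcurrency conformance this requirement stays
        // main-actor-isolated, so handPoseRequest can be used directly.
        // The runtime verifies the call really is on the main actor and
        // traps if the delegate queue is some other queue.
    }
}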

The sessionQueue is very important; it keeps capture-buffer-related work off the main thread.

It’s possible to contort ourselves with a custom global actor whose custom “unowned executor” uses the same queue as sessionQueue, and then write a rendition of assumeIsolated akin to MainActor’s. I can share an example, but this is all a lot of work to access a single property that really doesn’t need actor isolation at all.
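For the curious, the global-actor half of that idea could look roughly like the sketch below (names are illustrative; it assumes a deployment target where DispatchSerialQueue conforms to SerialExecutor, i.e. iOS 17 / macOS 14). The hand-rolled assumeIsolated is the part that takes real work and is omitted:

@globalActor
actor SessionActor {
    static let shared = SessionActor()

    // A serial queue shared with AVCapture configuration and delegate callbacks.
    static let queue = DispatchSerialQueue(label: "CameraSessionQueue")

    // SE-0392: run this actor's jobs on that queue instead of the default executor.
    nonisolated var unownedExecutor: UnownedSerialExecutor {
        Self.queue.asUnownedSerialExecutor()
    }
}

// State annotated @SessionActor is then isolated to that queue, and a custom
// SessionActor.assumeIsolated (mirroring MainActor.assumeIsolated) would let
// delegate callbacks already running on the queue touch it synchronously.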


I have no doubt this is true! But I was talking only about the code that was posted. Specifically this:

func startSession() {
    sessionQueue.async { [weak self] in
        Task {
            await self?.startCaptureSession()
        }
    }
}

...

private func startCaptureSession() async {
    await MainActor.run {
        captureSession?.startRunning()
    }
}

These two constructs could very well be semantically important, but the MainActor.run looks unnecessary and the queue + Task usage here is possibly not doing what is expected.
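To make that concrete, a sketch of what the second method reduces to once the redundant hop is dropped:

// Equivalent, because the whole class (and therefore this method) is @MainActor,
// so the MainActor.run wrapper adds nothing. Whether startRunning() should be
// called on the main thread at all is a separate question.
private func startCaptureSession() {
    captureSession?.startRunning()
}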


@mattie – Ah, sorry for misunderstanding your observation. I actually shared your reaction when I was first playing around with Apple’s sample.

I think the original author’s use of this convoluted pattern was a conscious decision. The app from which this was taken suspends and resumes the sessionQueue, so this isn’t quite the Rube Goldberg implementation it appears to be at first glance. The idea is “defer this in case the queue is suspended.” That said, the snippet in the question excluded the suspend/resume of sessionQueue, so out of context it looks exceptionally weird. But I think the original author’s choice was a deliberate one, not as obtuse as it first seems.
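For context, Apple’s sample does roughly the following inside the camera class (a sketch of the idea, not code from the question):

// While waiting on camera authorization, the queue is suspended…
sessionQueue.suspend()
AVCaptureDevice.requestAccess(for: .video) { granted in
    // …and resumed once the user answers. Any block that startSession()
    // enqueued in the meantime only begins executing now.
    self.sessionQueue.resume()
}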

You continue:

Again, like you, when first looking at this, my reaction was “why in the world would a method already isolated to the main actor do this?!”

This is supposition, but I wonder if the original author was attempting a variation of the DispatchQueue.main.async {…} pattern from the main queue, a kludgy way of saying “here is some work I do not want to start immediately, but rather to add to the end of the queue.” We see this pattern in legacy GCD code bases (but, IMHO, it is frequently also a code smell).
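That is, the old:

// Already on the main queue, but re-dispatching anyway so the work is
// appended to the end of the queue rather than run inline.
DispatchQueue.main.async {
    // deferred work
}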

But if that was the original author’s intent, I think the situation is even worse. In Swift concurrency, (a) actors are not queues and do not always do things in a FIFO manner; and (b) actors are reentrant, so if there are any suspension points, this code might run sooner than the author intended. The already-brittle GCD pattern becomes even more fragile under Swift concurrency. (Personally, when I need this sort of “honor the progression of state” management, like they’re trying to do here, I reach for async channels or the like.)
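A tiny illustration of the reentrancy point (a sketch; Worker is a made-up actor, not from the code above):

actor Worker {
    var log: [String] = []

    func work(_ name: String) async {
        log.append("\(name) start")
        try? await Task.sleep(for: .milliseconds(10))   // suspension point
        log.append("\(name) end")
    }
}

// Two overlapping calls can interleave as "A start", "B start", "A end", "B end":
// the second call does not wait for the first to run to completion.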

In short, I share your observations/reservations on the above, but suspect that the original author’s seemingly convoluted choices may have been deliberate.
