Coding 3D Gestures

I'm a total newbie to Swift and was wondering if it's possible to develop code to recognise 3D gestures (such as a head nod) via the onboard camera, or would that function have to be part of the operating system? You can use 3D gestures in iOS 13 (and earlier) if you enable Switch Control in Accessibility, but this also starts sequentially highlighting items on the screen. Is there any way of tapping into this functionality and coding it directly in Swift, or am I overestimating what Swift can do?
Thanks.


The short answer is yes, but this question is not really about Swift so much as the Apple frameworks. The frameworks handle gesture recognition, and they can be accessed from Swift. See the Apple Developer documentation, and ask on forums such as the Apple Developer Forums or Stack Overflow.
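To give a rough sense of what that could look like, here is a minimal sketch (an assumption about one possible approach, not a tested or recommended implementation) that uses ARKit face tracking to watch for a nod-like head motion. It assumes a device with a TrueDepth camera, an `NSCameraUsageDescription` entry in Info.plist, and the threshold and timing values are purely illustrative:

```swift
import Foundation
import ARKit

// Sketch only: watches the head's pitch from ARFaceAnchor updates and calls `onNod`
// when the pitch swings by more than a rough threshold within a short time window.
// The pitch extraction and the heuristic are illustrative, not production-ready.
final class NodDetector: NSObject, ARSessionDelegate {
    private let session = ARSession()
    private var recentPitches: [(time: TimeInterval, pitch: Float)] = []

    /// Called when a nod-like motion is detected.
    var onNod: (() -> Void)?

    func start() {
        // Face tracking requires a device with a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else {
            print("Face tracking is not supported on this device.")
            return
        }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        guard let face = anchors.compactMap({ $0 as? ARFaceAnchor }).first else { return }

        // Approximate pitch (rotation about the x-axis) pulled from the face anchor's
        // transform; the exact Euler decomposition depends on the convention you choose.
        let t = face.transform
        let pitch = atan2(-t.columns.2.y, t.columns.2.z)

        let now = Date().timeIntervalSinceReferenceDate
        recentPitches.append((time: now, pitch: pitch))
        recentPitches.removeAll { now - $0.time > 1.0 }   // keep ~1 second of history

        // Crude nod heuristic: pitch range exceeds roughly 15 degrees within the window.
        if let minP = recentPitches.map({ $0.pitch }).min(),
           let maxP = recentPitches.map({ $0.pitch }).max(),
           maxP - minP > .pi / 12 {
            recentPitches.removeAll()
            onNod?()
        }
    }
}
```

You would keep a `NodDetector` alive somewhere (for example in a view controller), set `onNod`, and call `start()`. ARKit is one option; the Vision framework's face observations are another route, depending on the devices you need to support.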

I should not have said "yes". You really need to look at the Apple frameworks documentation, or ask on another forum. This forum is about Swift the language, not the Apple infrastructure. Swift is just a language, like C, Objective-C, etc.

Many thanks Jonprescott. The important first thing is knowing where that functionality lies so that I can look in the right places, so thanks for the pointer.