Hi everyone!
I'm currently working on a proposal for Google Summer of Code 2025 under the Swift organization, and I’d love to get feedback from the community — or hear if anyone would like to collaborate or mentor!
A bit about me: my name is Jan Steinhauer, and I'm currently pursuing my master's in Human-Computer Interaction at the University of Würzburg. I've built visionOS and iOS apps in Swift, and I'm currently a working student at SAP as a visionOS/iOS developer.
Project size: 175 hours
Estimated difficulty: Intermediate
Recommended skills:
- Proficiency in Swift
- Familiarity with visionOS gesture and hand-tracking APIs
- Some experience with SwiftUI or visionOS UI concepts
- Interest in building reusable and well-structured Swift packages
Description:
While visionOS supports low-level gesture inputs like pinch, gaze, and tap, developers often need to manually compose these into more advanced gestures. This leads to repetitive logic and inconsistent behavior across apps.
SwiftGestureKit aims to solve this by providing a modular Swift package of reusable, high-level gesture recognizers tailored for visionOS. These gestures will be configurable with parameters such as minDuration and threshold, and easily attachable to views using an intuitive, event-based API.
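To make the event-based API concrete, here is a rough usage sketch. PinchHoldGesture and the way it attaches are proposed SwiftGestureKit shapes, not existing APIs; the intent is simply that it reads like a built-in SwiftUI gesture:

```swift
import SwiftUI

// Usage sketch only: PinchHoldGesture is a proposed SwiftGestureKit type,
// not an existing API. Attaching it should feel like attaching any
// built-in SwiftUI gesture.
struct LongPinchMenuExample: View {
    @State private var isMenuVisible = false

    var body: some View {
        Circle()
            .frame(width: 200, height: 200)
            .gesture(
                PinchHoldGesture(minDuration: 0.5, threshold: 0.2)
                    .onEnded { _ in
                        isMenuVisible = true   // sustained pinch opens the menu
                    }
            )
    }
}
```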
The library will include at least six custom gesture types, such as:
- DragToScaleGesture() – drag to resize spatial elements
- PinchHoldGesture(minDuration: 0.5, threshold: 0.2) – triggers after a sustained pinch
- GazeSwipeGesture(direction: .left) – combines gaze detection with a directional swipe
- PinchAndTwistGesture() – a two-handed gesture for rotation and scaling
- RotateToRevealGesture(minAngle: .pi / 2) – activates when an object is rotated past a given angle, useful for revealing hidden content or flipping elements
- RotateAndPinchGesture(minRotation: .pi / 4, minScale: 1.2) – detects a rotation and pinch happening together, useful for object manipulation or zoom-and-rotate interactions (a composition sketch follows this list)
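One way these composite recognizers could be implemented internally (this is an assumed strategy on my part, not settled design) is by composing SwiftUI's built-in gestures and only emitting an event once every configured threshold is crossed. For RotateAndPinchGesture, that might look roughly like:

```swift
import SwiftUI

// Implementation sketch (assumed strategy, not final design): build the
// high-level recognizer on top of SwiftUI's RotateGesture and MagnifyGesture,
// firing only when both thresholds are crossed together.
struct RotateAndPinchSketch: View {
    @State private var didTrigger = false
    private let minRotation: Angle = .radians(.pi / 4)
    private let minScale: CGFloat = 1.2

    var body: some View {
        RoundedRectangle(cornerRadius: 24)
            .frame(width: 300, height: 300)
            .opacity(didTrigger ? 0.5 : 1.0)
            .gesture(
                RotateGesture()
                    .simultaneously(with: MagnifyGesture())
                    .onChanged { value in
                        let rotation = value.first?.rotation ?? .zero
                        let scale = value.second?.magnification ?? 1.0
                        // Fire only when rotation and pinch pass their thresholds together.
                        if rotation.radians >= minRotation.radians, scale >= minScale {
                            didTrigger = true
                        }
                    }
            )
    }
}
```

The package would hide this value plumbing behind the configurable gesture types, so apps only deal with the high-level event.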
A sample visionOS app will demonstrate real-world interactions such as dragging to resize objects, activating menus with long pinches, and using gesture combos to trigger effects.
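As a rough idea of what the drag-to-resize demo could look like, here is a sketch written against today's RealityKit gesture APIs; the proposed DragToScaleGesture would wrap this kind of logic so the sample app stays short:

```swift
import SwiftUI
import RealityKit

// Sketch of the kind of interaction the sample app would demonstrate,
// using existing RealityKit/SwiftUI APIs. DragToScaleGesture (proposed)
// would encapsulate the translation-to-scale mapping below.
struct DragToScaleDemo: View {
    var body: some View {
        RealityView { content in
            let box = ModelEntity(mesh: .generateBox(size: 0.2))
            box.components.set(InputTargetComponent())
            box.generateCollisionShapes(recursive: false)
            content.add(box)
        }
        .gesture(
            DragGesture()
                .targetedToAnyEntity()
                .onChanged { value in
                    // Map vertical drag distance to a uniform scale factor.
                    let factor = 1 + Float(value.translation.height / 500)
                    value.entity.scale = .init(repeating: max(0.1, factor))
                }
        )
    }
}
```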
Expected Outcomes / Deliverables:
- A Swift package named SwiftGestureKit
- At least six custom, configurable gesture recognizers
- A clean and consistent event-based API for SwiftUI and RealityView integration
- Sample visionOS app demonstrating gesture use cases
- Inline documentation, usage guides, and a test suite
What I’m Looking For:
- Feedback on the idea and its scope
- Suggestions for additional gesture types or use cases
- Anyone interested in co-developing or mentoring
Thanks so much — looking forward to hearing your thoughts!