---
uid: xrhands-gesture-design
---
# Gesture design

**What is a gesture?**

In everyday life, a gesture is a movement, position, or shape of your hands that conveys a meaning, emotion, or intention. In mixed reality, headset sensors can detect the motion, position, and rotation of your hands. The system can then interpret this tracked data as an intent. For example, a hand making a "grabbing" shape can be used to grab virtual objects. Unity currently offers recognition of simple, frequently used gestures such as pinch, poke, grab/grip, and point through the Meta Aim feature, and you can now create your own custom gestures with our authoring tool.

## Benefits of creating custom gestures

**Authenticity and immersion**

Custom gestures help you create input methods that feel true to your storyline and context, adding to the immersion. For example, if you're creating an experience with a Hunger Games theme, you can make the District 12 salute your own way of greeting fellow players.

**Controller-less interactions**

Custom gestures can give users an additional way to perform tasks without requiring a controller or voice commands, such as:

* Capturing a photo
* Moving or teleporting in VR
* Triggering social emoticons in multi-user experiences

**Accessibility**

Depending on your audience, you can create custom gestures that make interactions feel more comfortable. For example, an experience targeting a new-to-XR demographic can incorporate familiar but non-standard XR gestures, like a swipe.

## Challenges of using custom gestures

**Learnability of complex gestures**

Some new hand gestures can be hard to teach without adequate guidance, such as visual cues and tooltips. Refer to the design considerations for tips on solving such challenges.
**Trackability of gestures**

With current technology, how well a gesture is recognized depends on whether the tracking sensors can see the relevant joints and fingers: if the system can't see a finger, it can't reliably track it. For example, if you make the peace sign facing toward your face, tracking is very good. If the peace sign faces away from your face, the curled fingers are hidden and tracking is compromised. Use the gesture debugging tool to see how your authored gesture is performing.

**Difficult-to-perform gestures**

Certain gestures, like the Spock sign, are nearly impossible for some people to perform due to natural variations in the hand skeleton or to injuries and impairments. Be sure to test your custom gesture so that it accommodates most people in your target audience.

**Hard-to-remember gestures**

Certain gestures can be hard to recall, especially when used sparingly. What you can do: create a gesture that already has an association with the intended action, for example a two-handed frame for capturing a photo, or add easy-to-access tooltips to remind users mid-session.

**Conflicting gestures**

The most commonly used hand gestures are poke, pinch, and grab. If your custom gesture has a hand shape that tracks very similarly to any of these, you will get false positives and accidental triggers. What you can do: use the gesture debugging tool to compare the tracked values with your configured target and tolerance, and test the same gesture with diverse users from your target audience.

**Hand fatigue**

Some gestures require more effort than others. If your experience relies solely on hand tracking, use the more strenuous gestures only sparingly.

**Heuristic-based gestures**

Our custom gesture model differs from commonly provided out-of-the-box gestures like poke and pinch, which are usually trained on a wide variety of hand skeletons.
Instead, our model relies on the target and tolerance values that you configure for recognition. This means that some gestures may not perform well if you don't account for different hand skeleton constraints while authoring them.

## Considerations for designing custom gestures

**Learnability of complex gestures**

Some new hand gestures can be hard to teach without adequate guidance. You can improve the user's ability to recall a gesture with the following:

* **Visual cues**: Use a visual cue, such as an illustrated diagram or an animated 3D hand, to clearly explain how to perform the gesture. Be sure to show the hand at an angle that clearly reveals the important finger states, such as curl and spread.
* **Tooltips**: Add a way for users to quickly replay the visual cues to refresh their memory.
* **Use recognizable gestures**: When possible, use gestures from everyday life for similar interactions, such as thumbs up and thumbs down for yes and no actions, or a wave for opening a chat window.
* **Design easy-to-recall rules**: Design a set of rules with simple, easy-to-remember logic to categorize different types of gestures. For example, one such rule might be that any gesture that involves the wrist pointing up is a menu gesture:
| Wrist facing up and pinch for menu gestures | |
| --- | --- |
| ![]() | ![]() |
| Thumb and Middle finger pinch spawns the app eye calibration menu | Thumb and Index finger pinch spawns the app main menu |
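The heuristic, target-and-tolerance recognition model described above can be sketched in a few lines. This is an illustrative sketch only: the function names, condition names, and numeric values below are hypothetical and are not the XR Hands authoring API. The idea it demonstrates is from the text: a tracked finger-state value matches when it falls within the configured tolerance of its target, and a hand shape is recognized only when every configured condition matches.

```python
def finger_state_matches(tracked: float, target: float, tolerance: float) -> bool:
    """A tracked finger-state value (for example, a 0..1 curl amount)
    matches when it is within the configured tolerance of the target."""
    return abs(tracked - target) <= tolerance

def shape_matches(tracked_states: dict, conditions: dict) -> bool:
    """A hand shape is recognized only when all of its conditions match."""
    return all(
        finger_state_matches(tracked_states[name], target, tolerance)
        for name, (target, tolerance) in conditions.items()
    )

# Hypothetical "pinch" shape: two made-up conditions with made-up values.
pinch_conditions = {
    "index_full_curl": (0.6, 0.25),     # (target, tolerance)
    "thumb_index_pinch": (1.0, 0.15),
}

tracked = {"index_full_curl": 0.7, "thumb_index_pinch": 0.9}
print(shape_matches(tracked, pinch_conditions))  # both values are within tolerance
```

This framing also explains the conflicting-gestures challenge above: widening a tolerance makes a gesture easier to perform across diverse hand skeletons, but it enlarges the region of hand shapes that match, increasing the risk of false positives against similar gestures like pinch.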