Gesture-Based Interaction: Visual Gesture Mapping
Chapter
Accepted version
Date
2020
Collections
- Institutt for design [1201]
- Publikasjoner fra CRIStin - NTNU [39152]
Original version
10.1007/978-3-030-49062-1_7
Abstract
Gesture-based interaction allows for interacting with computers, machines and robots in an intuitive way without direct physical contact. The challenge is that there are no agreed-upon interaction patterns for gesture-based interaction in VR and AR environments. In this paper we have developed a set of 10 gestures and corresponding visualizations in the following categories: (1) directional movement, (2) flow control, (3) spatial orientation, (4) multifunctional gestures, and (5) tactile gestures. One of the multifunctional gestures and its visualization were selected for usability testing (N = 18) in a 3D car track simulator. We found that the visualization made the interaction faster and easier to understand, and made the interaction more precise. Further, we learned that the visualization worked well as guidance for learning to control the car, but could be removed after a while, once the user had learned the interaction. By combining gestures from the library, gesture-based interaction can be used to control advanced machines, robots and drones in an intuitive and non-strenuous way.
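To make the five-category taxonomy concrete, here is a minimal Python sketch of how a gesture library organized along these lines might be structured. The specific gesture names, visualization cues and lookup helper are illustrative assumptions, not the actual gestures or visualizations developed in the paper.

```python
# Hypothetical sketch of a gesture library built around the five
# categories named in the abstract. Gesture names and visual cues
# below are invented for illustration only.
from dataclasses import dataclass
from enum import Enum, auto


class GestureCategory(Enum):
    DIRECTIONAL_MOVEMENT = auto()
    FLOW_CONTROL = auto()
    SPATIAL_ORIENTATION = auto()
    MULTIFUNCTIONAL = auto()
    TACTILE = auto()


@dataclass(frozen=True)
class Gesture:
    name: str                   # assumed identifier, e.g. "point-to-steer"
    category: GestureCategory
    visualization: str          # on-screen cue rendered while gesturing


# A small excerpt of what a 10-gesture library keyed by name could look like.
GESTURE_LIBRARY: dict[str, Gesture] = {
    "steer": Gesture("steer", GestureCategory.DIRECTIONAL_MOVEMENT, "direction arrow"),
    "stop": Gesture("stop", GestureCategory.FLOW_CONTROL, "halt icon"),
}


def visualization_for(gesture_name: str) -> str:
    """Look up the visual cue to render for a recognized gesture."""
    return GESTURE_LIBRARY[gesture_name].visualization
```

Keeping the visualization as data attached to each gesture, rather than hard-coded in the renderer, would also make it straightforward to fade out or remove the cue once a user has learned the interaction, as the usability test suggests.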